I gave this talk at Cardozo Law School’s conference on the Internet and Openness, held earlier this year. It was lots of fun and I learned a great deal from the other speakers.
Thank you for this opportunity to speak at this event. I should start out by saying that I do not speak for arstechnica.com here, or anywhere else for that matter. I’m just one voice there, a contributor to the site.
I’m also not going to stand here in the company of these very informed speakers and represent myself as an expert on the Internet. I’m not. What am I then? Well, occasionally I write something on Ars that somebody finds so unacceptable that they devote an entire blog entry to my inadequacies. Last year one of them angrily denounced me as a “self-appointed FCC watcher,” among other allegedly bad things.
In fairness to this detractor, I have to admit it’s true. That’s what I am: a self-appointed Federal Communications Commission watcher. In my defense, I tried to find an appointment for quite some time, but I’m certainly not going to decline to watch what I’m interested in watching in the absence of one.
Today I want to talk not about the Internet, Openness, and net neutrality per se, but about the challenge of regulating these things at the FCC, indeed, the challenge of regulating any aspect of broadcasting and telecommunications in this time. In fact, I want to talk about time, time itself, because these days it is often something that regulators do not have on their side.
At best they bequeath to the next generation regulations intended for their own time. That is, if they bequeath anything at all.
I started paying attention to the FCC ten years ago, when Temple University Press had just published my first book on the Pacifica radio network, and the big deal of the week in my tiny neck of the woods was Low Power FM.
Many of you know this sad story, I’m sure. After a decade of persecuting pirate radio stations like Free Radio Berkeley and turning them into saintly martyrs, the agency got wise and got around to authorizing a legal service that would have rolled out over 3,000 ten- to 100-watt FM licenses around the country.
The National Association of Broadcasters and National Public Radio ran to Congress and cried signal interference, the last refuge of an incumbent. In an amazing demonstration of broadcaster power, they got Capitol Hill to add a third-adjacent-channel rule to the FCC’s Order that effectively banned LPFM everywhere except rural America.
But the bill also authorized a study of that alleged interference problem, and the FCC contracted out that study to the MITRE Corporation. Three years later, MITRE called the NAB’s claims all but bunk. In 2007 the FCC asked Congress to rewrite its LPFM language sans the third-adjacent-channel restriction. And Representatives Mike Doyle of Pennsylvania and Lee Terry of Nebraska have a bill to do just that in the House, with the stated support of Ron Paul of Texas and even John McCain in the Senate.
But it’s ten years later. And the Internet has kicked in with its wonders: Pandora, podcasting, Radio365, Slacker G2, Stitcher, who knows what else? Will any of these innovations provide the local coverage that a community-based LPFM offers? I don’t think so. Have they dialed down the urgency of the LPFM question? Yes. I think they have.
We are moving through a time of furiously rapid technological change, accompanied by social movements that rush at whatever media/telecom innovation presents itself, leapfrogging from one thing to the next. Writing this talk, I started counting the leapfrogs I’ve seen over the last three decades: newsgroups to BBSes to Gopher to Web 1.0 to blogs to MySpace to Facebook to Twitter to whatever.
Within this context titanic regulatory battles take place, and when you wake up the next decadal morning you wonder what the heck they were all about. Remember the great media ownership war? A Technicolor extravaganza featuring FCC Chairs Michael Powell and then Kevin Martin as Bela Lugosi, and Michael Copps and Jonathan Adelstein as John Wayne. Tens of thousands of people packing those hearings across the country. A sweeping deregulatory order in 2003. Senate statements of disapproval. A dramatic repudiation from the Third Circuit Court of Appeals. Once more into the breach in 2006 . . .
And finally a single deregulatory order in 2007 allowing cross-ownership of newspapers and TV stations in the 20 biggest Nielsen markets, denounced in the Senate once more.
And now these newspapers are dying and everybody’s trying to figure out how to save them. Some of the blog sites that were created in response to media consolidation are almost as influential as the outlets they criticized, and may in fact become them. And what was it that the consolidators thought they were trying to accomplish, as Clear Channel struggles to unload stations and Tribune files for bankruptcy?
Remember the great cable 70/70 feud of 2005 through 2007? Congress says if the FCC in its annual survey of video competition identifies 70 percent of American households as able to access cable service, and 70 percent buying it, the agency may “promulgate any additional rules necessary to provide diversity of information sources.”
Suddenly, with former Kenneth Starr assistant Kevin Martin in the cockpit, the cable industry saw a la carte cable on the regulatory horizon and went into no pasarán mode. The second tier of 70/70 has not been reached, cable insisted, it never will be reached, and even if it is reached, it won’t matter because the FCC doesn’t really have any statutory authority to do anything about its having been reached.
Martin, frustrated by all this, tried to get control of the process with a ham-handed reversal of the agency’s own research conclusions about a la carte, at least that’s what Congress determined in its audit of his administration. The rest of the Commission balked at his conclusions, even the Commissioners who thought that the second half of 70/70 had been passed. Finally, the FCC released the 2006 report in January, and announced a catch-up proceeding to get up to speed on the state of video in 2007, 2008, and through June of 2009.
This was announced at this month’s Open Commission meeting, which I attended. But it was launched under the shadow of another regulatory event: the FCC’s Notice of Inquiry about that National Broadband Plan it’s going to serve up to Congress by next February, a “transformative” proceeding, interim Chair Michael Copps has declared.
Which will Copps and likely FCC Chair Julius Genachowski want to focus on, I wonder, a statistical Verdun with dubious regulatory payback, or a “transformative” manifesto on high speed Internet?
As I follow these food fights, I often feel in the end like somebody sitting in a huge empty stadium in the aftermath of some great spectator event. What was that all about? I ask myself, as the technology, its prophets and its adopters move on, reconstructing social stage sets in a single season.
But sometimes regulators do win. The Carterfone decision was an example of such a victory: the FCC’s 1968 decision that consumers may attach any legal device to AT&T’s network that does not harm the network.
But by the time that victory truly manifested itself, the big winner was not Thomas Carter and his little radio attachment for a telephone receiver. It took the FCC years to come to Carterfone, more years for Carter to settle his lawsuit with AT&T, more years for the FCC to defend various aspects of Carterfone in court, more years to set up standards that ended AT&T’s practice of requiring independent device users to buy a special AT&T attachment for their devices, and more years until AT&T gave up requiring consumers to report independent device purchases to the telco and pay a fee (Kevin Werbach narrated this much better than I ever could at a Santa Clara Law conference held last year).
By then the beneficiaries looked a lot less like Thomas Carter, and a lot more like those gopher and BBS users I remember from way back when. And now, of course, they look like us.
Telecommunications regulations, then, are often time capsules launched into a very uncertain future. They are received by people with very different perspectives using very different machines for very different purposes than the legacy regulators envisioned. And much of the world that they inherit is a world contoured by the failure to reach regulatory consensus in the past, or in time to make a difference. Sirius XM radio’s year-and-a-half merger approval process might be an example of this, to some minds.
We have an order approving unlicensed broadband, or “white space” devices, but the National Association of Broadcasters is suing over it, and the Office of Management and Budget has yet to approve certain key aspects of the process, most notably the databases required by the service. We have new metrics for classifying broadband, but last I checked they’re sitting at the OMB as well, waiting for approval based on whether they pass muster with the Paperwork Reduction Act.
So if I’m conveying the idiotically obvious message that regulating this stuff is very hard, does anything make a difference? One lesson that I glean from all this comes from Carterfone and from the present era. Contrary to Oliver Wendell Holmes’ famous rule that first we solve the problem, then we make the principle, outlining principles first seems to clear the way toward a somewhat smoother regulatory path.
That’s why Carterfone is so important. And it’s also why the FCC’s Internet Policy Statement of 2005 has been important. These declarations seem to cut through the foggy future and shine a light of intention on the chaos.
It is unclear to me what the fate of the Internet Policy Statement is. The declaration will surely come up in Comcast’s suit against the FCC’s Order sanctioning the company for P2P throttling. Is it a rule, given former FCC Chair Kevin Martin’s ambiguous statements about it, and the fact that it was issued without a proceeding? I don’t know the answers to these questions.
But Congress has hardwired the statement into its Recovery Act, and most of the telecommunications industry has gone on record as recognizing its legitimacy in that context. Strong expressions of principle, sparingly issued but at the right time and in the right place, seem to have a power that less guided regulations do not.
Speaking personally, the openness of the Internet is more than a philosophical question for me. Following the publication of my first book on Pacifica radio, I wrote a second following the organization up to the present. I taught many courses and dutifully wrote academic journal articles and put myself on the job market for many years, while settling into the status of adjunct instructor.
Finally, in 2005, I saw that my fate was sealed, stopped writing for the academy, and joined the blogosphere. In late 2007 Ars Technica took me on as a contributor, and I have written over 270 articles for them since.
I doubt that I would be here talking to you in this context were it not for that opportunity. When Matthew Weldon contacted me, I don’t think he had any idea that I have a Ph.D. and teach in the UC system.
There’s a funny New Yorker cartoon from the 1990s showing a cocker spaniel sitting in front of a computer, obviously involved in some sort of chat board thing. “On the Internet,” the animal says, “no one knows you’re a dog.” And, I would add, on the Internet, no one knows that you don’t have tenure.
The openness of the Internet has given many people second chances. It must continue to be allowed to do that. The complexities and challenges of that task should not dissuade us from struggling for consensus in pursuit of that larger goal.