Earlier this week, The Daily Show’s Jon Stewart summed up the debate over net neutrality by stating, “On one side [are] those who want the marketplace to remain a wide open market of ideas, and on the other side [is] a larger group who have no idea what net neutrality means.”
Stewart may have been joking, but he was right about one thing – many folks are confused about what net neutrality actually is and what it would mean for Internet users.
That’s why I decided to enter the America’s Got Net video contest, sponsored by the Open Internet Coalition, a pro-net neutrality trade association. In a short video entitled, “The Open Internet and Lessons from the Ma Bell Era,” I explain how mandating net neutrality would endanger the networks of tomorrow and insulate entrenched firms from competition. Enjoy!
The release of a joint policy framework from Google and Verizon this week touched off even more activity in the never-ending saga of Net Neutrality than did the previous week’s rumors that such an agreement was in the works.
Op-ed pages, business and technology news programs, and public radio’s precious moments were overrun with anxious talking heads denouncing or praising the latest developments, or, in the case of a few of us, just trying to explain what was and was not actually being said and done.
That’s not how August is supposed to be in policyland, when Washington reverts to the swamp from which it came. (John Adams left early one summer during his presidency and refused to return until long after the heat had broken.) I had hoped at long last to get around to finalizing last year’s tax return or maybe fixing my perennially broken irrigation system, but oh well. Continue reading →
But after going through the framework and comparing it more-or-less line for line with what the FCC proposed back in October, I found there were very few significant differences. Surprisingly, much of the outrage being unleashed against the framework relates to provisions and features that are identical to the FCC’s Notice of Proposed Rulemaking (NPRM), which of course many of those yelling the loudest ardently support.
The buzz in telecom policy circles this morning is the word that Verizon and Google are close to an agreement that will allow the search giant to purchase from Verizon a faster tier for delivery of its bandwidth-heavy services, notably YouTube, its video-sharing site.
If the two companies reach an agreement, it could be a death blow to the entire “non-discriminatory” idea behind network neutrality: that no service provider should give favored treatment to any service or application. FCC Chairman Julius Genachowski has made it a mission to get the “non-discrimination” principle encoded into law, to the point of calling for reclassification of broadband ISPs as regulated telecommunications carriers.
If Verizon sets up tiered pricing for Google applications, the non-discrimination genie is out of the bottle for good. It would be a direct “I dare you” challenge to the FCC to block it. Armageddon indeed. Adding to the significance is that Google itself is party to the deal. Until today at least, Google has been the loudest company behind the call for a non-discrimination rule, even as one-time allies have fallen away (the latest being Amazon.com).
The White House and the Federal Communications Commission have painted themselves into a very tight and very dangerous corner on Net Neutrality. To date, a bipartisan majority of Congress, labor leaders, consumer groups and, increasingly, some of the initial advocates of open Internet rules are all shouting that the agency has gone off the rails in its increasingly Ahab-like pursuit of an obscure and academic policy objective.
Now comes further evidence, none of it surprising, that all this effort has been a fool’s errand from the start. Jacqui Cheng of Ars Technica is reporting today on a new study from Australia’s University of Ballarat that suggests only 0.3% of file sharing using the BitTorrent protocol is something other than the unauthorized distribution of copyrighted works. In other words, 99.7% of the traffic they sampled is illegal. The Australian study, as Cheng notes, supports the similar conclusions of a Princeton University study published earlier this year.
In a startling guest column on CNET yesterday, Paul Misener, vice president for global public policy at Amazon.com, for all practical purposes reversed his company’s stand on network neutrality, particularly the controversial non-discrimination rule, which would prohibit ISPs from creating and charging providers of large-scale content, applications and commerce for faster broadband connections and tiered quality of service.
In his column, Misener concedes what many TLF bloggers and friends have argued for years: that the net neutrality rules are a solution in search of a problem, and that large providers like Amazon already invest in techniques that ensure quality delivery of content and apps, albeit at the edge, not within the network cloud. Misener writes:
First, there have been almost no Net neutrality violations. Opponents of Net neutrality rules say this record demonstrates that regulation is unnecessary–that Net neutrality is “a solution in search of a problem.” But actually, the threats of legislation (since 2007) and FCC regulation (since 2009) have kept the network operators on their best behavior.
Moreover, Net neutrality has become a populist consumer issue in a way that few FCC issues ever have (try Web-searching the terms “Net neutrality” or, more humorously, “series of tubes”). So, it’s hard to imagine policymakers adopting laws or rules that would condone popular notions of Net neutrality violations.
Second, the legal/regulatory uncertainties have, understandably, dissuaded network operators from making investments in new technologies and services that might subsequently be found to violate Net neutrality. Unfortunately, some observers seem to think that this uncertainty hurts only the network operators and their suppliers, but consumers and content providers also are suffering, albeit unwittingly, from the lack of new services that might otherwise be available.
Better late than never, I’ve finally given a close read to the Notice of Inquiry issued by the FCC on June 17th. (See my earlier comments, “FCC Votes for Reclassification, Dog Bites Man”.) In some sense the contents held no surprises: the Commission’s legal counsel and Chairman Julius Genachowski had both published comments more than a month before the NOI that laid out the regulatory scheme the Commission now has in mind for broadband Internet access.
Chairman Genachowski’s “Third Way” comments proposed an option that he hoped would satisfy both extremes. The FCC would abandon efforts to find new ways to meet its regulatory goals using “ancillary jurisdiction” under Title I (an avenue the D.C. Circuit had wounded, but hadn’t actually exterminated, in the Comcast decision), but at the same time would not go as far as some advocates urged and put broadband Internet completely under the telephone rules of Title II.
Today, the Federal Communications Commission (FCC) voted along party lines to adopt a Notice of Inquiry opening a new proceeding to regulate the Internet by reclassifying it under Title II of the Communications Act. FCC Chairman Julius Genachowski calls this his “Third Way” plan. In a PFF press release, I issued the following response:
In its ongoing ‘by-any-means-necessary’ quest to regulate the Internet via Net Neutrality mandates, Chairman Genachowski’s FCC continues to flout the rule of law and magically invent its own authority as it goes along. If this Chairman wants to bring the Net under his thumb and regulate broadband networks like plain-vanilla public utilities, he should ask Congress for the authority to pursue such imperial ambitions. As the law stands today, the FCC has no such authority. Indeed, the unambiguously deregulatory thrust of the Telecom Act of 1996 stands in stark contrast to Chairman Genachowski’s outdated vision for Big Government Broadband.
The FCC stands on the cusp of killing one of the great deregulatory success stories of modern economic history by reviving the discredited regulatory industrial policies of the 19th Century. The revisionism about that epoch is dead wrong: Price controls and protected markets limited choice and stifled innovation. With the agency rolling back the regulatory clock in this fashion, today marks the beginning of the Internet’s “Lost Decade” of stymied investment, innovation, and job creation as all sides wage battle over the legality of reclassification and its implementation.
PFF has just published the transcript for an event we hosted last month asking “What Should the Next Communications Act Look Like?” The event featured (in order of appearance) Link Hoewing of Verizon, Walter McCormick of US Telecom, Peter Pitsch of Intel, Barbara Esbin, Ray Gifford of Wilkinson, Barker, Knauer, and Michael Calabrese of the New America Foundation. It was a terrific discussion and it couldn’t have been more timely in light of recent regulatory developments at the FCC. The folks at NextGenWeb were kind enough to make a video of the event and post it online along with a writeup, so I’ve included that video along with the event transcript down below the fold. Continue reading →
State governments are getting bolder about diverting funds intended to maintain and modernize 911 emergency calling systems for other uses.
As states face growing budget gaps spurred by reckless spending and unsustainable obligations to public-sector employees, legislatures have been turning everywhere for extra cash. The 911 surcharge that appears on most consumer phone bills is no exception.
Originally, 911 fees were supposed to be used exclusively to fund 911 calling centers and the training of operators, the primary rationale behind the decision to assess the fees on phone bills. Instead, 911 money is being funneled elsewhere, sometimes for other law enforcement needs like weapons, vehicles and uniforms; sometimes for costs and services that arguably should be funded from general revenues. In New York State, for instance, of the $600 million collected from 911 fees in the past 15 years, just $84 million—14 percent—was used for municipal 911 center operation, according to a Buffalo News report cited by Emergency Management magazine.
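For readers who want to check those New York numbers themselves, here is a minimal back-of-the-envelope sketch in Python; the $600 million collected and $84 million spent figures come from the Buffalo News report cited above, and everything else is simple arithmetic:

collected = 600_000_000      # 911 surcharges New York collected over 15 years (Buffalo News figure)
spent_on_911 = 84_000_000    # portion actually used for municipal 911 center operation
share = spent_on_911 / collected
diverted = collected - spent_on_911
print(f"Share used for 911 centers: {share:.0%}")    # prints 14%
print(f"Amount diverted elsewhere: ${diverted:,}")   # prints $516,000,000

In other words, by the report’s own numbers, more than half a billion dollars in 911 surcharges went to something other than running 911 centers.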