“Preserving the Internet,” But Which One?: Reading the FCC’s Net Neutrality Order (Part IV)

January 12, 2011

This is Part IV of a five-part commentary on the FCC’s Dec. 23, 2010 “Open Internet” Report and Order.

Part I looked at the remarkably weak justification the majority gave for issuing the new rules.

Part II explored the likely costs of the rules, particularly the undiscussed costs of enforcement that will be borne by the agency and accused broadband access providers, regardless of the merits.  (See Adam Thierer’s post on the first attenuated claim of violation, raised before the rules even take effect.)

Part III compared the final text of the rules to earlier drafts and alternative proposals, tracing the Commission’s changing and sometimes contradictory reasoning over the last year.

Part IV (this part) looks at the many exceptions and carve-outs from the rules, and what, taken together, they say about the majority’s dogged determination to see the Internet as it was and not as it is or will become.

Part V will review the legal basis on which the majority rests its authority for the rules, a basis likely to be challenged in court.

What Does an Open Internet Mean?

The idea of the “open Internet” is relatively simple:  consumers of broadband Internet access should have the ability to surf the web as they please and enjoy the content of their choice, without interference by access providers who may have financial or other anti-competitive reasons to shape or limit that access.

In the act of trying to translate that idea into enforceable rules—enforceable, inexplicably, by a federal regulatory agency with no legislative authority over any substantial feature of the Internet economy and no real justification for creating rules of any kind for a system that is working nearly flawlessly so far—the FCC has found itself tied in unholy knots.

The rules as enacted carved out exceptions and caveats that, taken together, render the final regulations not meaningless but certainly incoherent.

In exempting from the rules a host of important innovations in network management and infrastructure optimization developed over the last decade, the FCC has stepped back from the brink of its original plan, which would have returned the Internet to the days of unreliable dial-up access and static websites.

But it has also revealed the danger of trying to regulate a rapidly-evolving life form, and risked the unintended consequence of denying it future forms of nutrition and good health.  If these rules stand and are vigorously enforced, the Internet’s further growth and development may be stunted.

The Mythical Neutrality Principle

Back in the stone age of 1998, I wrote in “Unleashing the Killer App” that one of the fundamental bases on which the Internet became an engine of innovation and even social change was that its basic protocols are non-proprietary.  Anyone can make use of them, any device can support them, and every node is a peer—without paying royalties or other tribute to anyone.  As the “lowest common denominator” standard, TCP/IP benefited from network effects to overtake several popular proprietary standards, including IBM’s SNA.
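That openness is easy to demonstrate in practice.  Below is a minimal sketch, in Python, of what “anyone can make use of them” means: any machine can join the conversation using nothing more than the standard socket API, with no license, registration, or royalty required.  (The host example.com is just a placeholder.)

```python
# A minimal sketch of TCP/IP's openness: any device can participate
# using only the standard, non-proprietary socket API.
import socket

def fetch(host: str, port: int = 80) -> bytes:
    """Open a TCP connection and issue a bare HTTP/1.0 request."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(f"GET / HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)

if __name__ == "__main__":
    # example.com is a stand-in for any publicly reachable host.
    print(fetch("example.com")[:200])
```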

The technical and legal openness of TCP/IP has been romanticized over the years, particularly by legal scholars and journalists who know less about technology than they think they do, into a view of the Internet as a Platonic ideal: a vehicle for true collaboration and consciousness-raising.  The web was nothing less than the fruition, as Tim O’Reilly put it, of “what we were talking about at Esalen in the ’70s—except we didn’t know it would be technology-mediated.”

The ideal of neutrality—of a level playing field in which every website, application, and device is no more prominent than any other—is a persistent and compelling myth.  It evokes the heroism of the entrepreneur in the garage, developing the next Yahoo or Google or YouTube or Facebook or Twitter or Groupon, with little more than a great idea, technical skills, and the willingness to sacrifice sleep and social life for the promise of a future liquidity event (optimally, the great IPO), or the chance to change the world and make it a better place by connecting people and information in new and unexpected ways.  Wikipedia, for example.

Whatever the motivation, after a grueling race against the clock, the app is released.  If all goes well, it reaps the benefit of Metcalfe’s Law, goes viral, and becomes the next Big Thing, all in the span of time between one SXSW conference and the next Web 2.0 Summit.

No large corporation can stop the plucky inventor, or ransom a part of her invention.  No access provider can hold its invaluable user base hostage.  No competing content provider, no matter how giant, can buy up all the available market channels and freeze out the upstart start-up.  No government regulator need approve or license the invention before human testing and general use can begin.

When Worlds Collide

A considerably more mundane version of that ideal world did exist in the last half of the 1990s.  It still exists today.  But it has become much more complex and nuanced in the last decade.

The Internet, the Web, the Cloud, and the app-based economy of wireless computing devices, TVs, and, increasingly, other things (including cars and other non-traditional computing platforms such as consumer electronics and home appliances) have evolved in interesting and productive ways, often “under the covers” of the network infrastructure.

Few consumers know or would care to know about the existence, let alone the details, of network optimization algorithms, content delivery networks, complex peering arrangements, caching and edge servers, file torrenting, mirror sites, specialized services, virtual private networks, packet prioritization based on media type, spam and other malware filters, dynamic IP addresses or domain name redirection.

All of these (and more) are mechanisms for speeding up the delivery of the most popular or the most bandwidth-intensive content.  Many have been developed by entrepreneurs or by the large access and hosting services, often working in concert with the voluntary protocol and technical committees of the Internet Society.

ISOC keeps the standards alive, flexible, and responsive to new opportunities for expansion and reinvention made possible through the agency of Moore’s Law, which continues to drive the basic technological components of digital life into the uncharted realm of the faster, cheaper, and smaller.
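To make one of those mechanisms concrete, here is a toy sketch, in Python, of latency-based mirror selection: steer each request to whichever replica answers a connection probe fastest.  It is an illustration only, not any provider’s actual implementation, and the mirror hostnames are invented placeholders.

```python
# Toy mirror selection: probe each replica with a TCP handshake and
# route the request to the quickest responder. Hostnames are invented.
import socket
import time

MIRRORS = ["us-east.cdn.example", "us-west.cdn.example", "eu.cdn.example"]

def fastest_mirror(mirrors, port=80):
    """Return the mirror with the lowest handshake time, or None."""
    timings = {}
    for host in mirrors:
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=2):
                timings[host] = time.monotonic() - start
        except OSError:
            continue  # unreachable or unresolvable mirrors are skipped
    return min(timings, key=timings.get) if timings else None

print(fastest_mirror(MIRRORS))  # with real hostnames: the closest replica
```

Whichever replica wins, the effect is the same: the user’s request never travels farther than it has to.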

Strictly speaking, of course, all of these innovations violate the neutrality principle.  They recognize that some packets, whether because of file size or popularity or media characteristics or importance to the recipient, require special treatment in the transport from host to client.

Video (YouTube, Hulu, Netflix), for example, can consist of very large files, and the  component packets must arrive at their destination with relatively short delays in order to maintain the integrity of streaming display.
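Here is a schematic sketch, in Python, of what that kind of media-aware handling amounts to: classify packets by traffic type and dequeue the latency-sensitive classes first.  The class names and priority values are invented for illustration; they are not any standard’s or provider’s actual scheme.

```python
# Schematic priority queue: lower-numbered (more latency-sensitive)
# classes are dequeued first; ties preserve arrival order.
import heapq

PRIORITY = {"voip": 0, "video": 1, "gaming": 1, "web": 2, "bulk": 3}

class PacketQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # sequence number keeps FIFO order within a class

    def push(self, traffic_class: str, payload: bytes):
        rank = PRIORITY.get(traffic_class, max(PRIORITY.values()))
        heapq.heappush(self._heap, (rank, self._seq, payload))
        self._seq += 1

    def pop(self) -> bytes:
        return heapq.heappop(self._heap)[2]

q = PacketQueue()
q.push("bulk", b"backup-chunk")
q.push("voip", b"call-frame")
assert q.pop() == b"call-frame"  # the VoIP frame jumps the line
```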

Hosted services, such as medical monitoring, use parts of the same infrastructure as the public Internet, but cannot safely be left to the normal ebb and flow of Internet traffic patterns.  Limitations of the 3G wireless infrastructure—in large part a result of regulatory restrictions on cell siting and spectrum mismanagement—make it difficult to satisfy exploding customer demand for ever-more of the most bandwidth-intensive apps.

When all is said and done, the core problem with the FCC’s Open Internet Report and Order comes down to a clash of the idealized view of the neutral Internet with the reality of an always-evolving, always-improving technology infrastructure.

Chairman Genachowski, himself a former venture capitalist, is clinging to the myth of the Internet as virtual frontier, an understandable but highly dangerous indulgence in nostalgia, a remembrance of Internets past.  He’s not alone.  The romance of the American west has persisted more than a hundred years since historian Frederick Jackson Turner famously declared the frontier closed.

As he said in introducing the Open Internet proceeding in September 2009, shortly after taking office:

“The Internet’s creators didn’t want the network architecture — or any single entity — to pick winners and losers. Because it might pick the wrong ones. Instead, the Internet’s open architecture pushes decision-making and intelligence to the edge of the network — to end users, to the cloud, to businesses of every size and in every sector of the economy, to creators and speakers across the country and around the globe. In the words of Tim Berners-Lee, the Internet is a ‘blank canvas’ — allowing anyone to contribute and to innovate without permission.”

Many of us fortunate enough to have been there at the moment the Internet reached its tipping point and became an unstoppable force, a kind of network gravity, share this nostalgia.  It was a moment that changed the trajectory of computing, upended giants, and unleashed tremendous creativity.  For me, it utterly transformed my career, much as my first FORTRAN course as an undergraduate had unintentionally started it.

But the effort to translate nostalgia into federal law—assuming, but only for the moment, that the FCC is the appropriate agency to preserve an Internet that has long since passed, if indeed it was ever the way we old-timers remember it—has already fallen down more than its fair share of abandoned mine shafts.


The Exceptions that Expose the Rule

Even the original Notice of Proposed Rulemaking and draft order, released for comment in October 2009, included many (necessary) exceptions from strict adherence to the neutrality principle.

Most importantly, the proposed rules subjected all six neutrality rules (§§ 8.5-8.15) to an exception for “reasonable network management.”  Reasonable network management was defined as all “reasonable practices” broadband Internet access providers undertook to, among other things, “reduce or mitigate the effects of congestion on the network or to address quality-of-service concerns.”  (§ 8.3)  And, bowing to legal limits on neutrality, nothing in the rules restricted broadband access providers’ efforts to “address unlawful conduct on the Internet,” including unlicensed sharing of copyrighted content. (¶ 139)

In explaining “reasonable network management” (¶¶ 135-141), the FCC acknowledged that the technology by which a user accessed the Internet could play a significant role in determining when a provider could act “inconsistently” with the neutrality principle but still not violate the rules.  Access over coaxial cable follows a different architecture—with different constraints—than fiber, copper, satellite, or cellular access.  For purposes of “quality of service,” the agency acknowledged that it might be appropriate for an access provider to implement a “network management practice of prioritizing classes of latency-sensitive traffic,” such as VoIP, gaming, and streaming media traffic.  (¶ 137)
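In the abstract, such a quality-of-service practice might look like the sketch below: a weighted round-robin scheduler that gives latency-sensitive classes more frequent turns without starving everything else.  The weights are invented for illustration; neither the NPRM nor the final Order prescribes any particular algorithm.

```python
# Weighted round-robin: each traffic class is served in proportion to
# its (illustrative) weight, so bulk traffic still makes progress.
from collections import deque

def weighted_round_robin(queues: dict, weights: dict):
    """Yield (class, item) pairs in proportion to each class's weight."""
    while any(queues.values()):
        for cls, q in queues.items():
            for _ in range(weights.get(cls, 1)):
                if q:
                    yield cls, q.popleft()

queues = {"voip": deque(["v1", "v2", "v3"]), "web": deque(["w1", "w2"])}
print(list(weighted_round_robin(queues, {"voip": 3, "web": 1})))
# [('voip', 'v1'), ('voip', 'v2'), ('voip', 'v3'), ('web', 'w1'), ('web', 'w2')]
```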

Since the FCC has until now had little role to play in the regulation of the Internet, it’s not surprising that the agency began this process with a highly outdated view of how the Internet “worked.”  So the NPRM, here and in eighty other sections, sought comment on the current state of the Internet ecosystem, the technologies of broadband access, the network management principles in place, and the nature of the broadband access market throughout the U.S.—the latter a subject the agency took up again in the National Broadband Plan.

Not surprisingly, the FCC heard plenty.  The final report lists over 450 sources of comments and replies to the NPRM, many of which addressed themselves to educating the FCC on the technologies it had undertaken to regulate.

As a result of this formal (and no doubt a great deal of informal) feedback, the final rules added numerous additional exceptions, authorizing a wide range of ways a provider of broadband Internet access could act “inconsistently” with the neutrality principle but still not be found to have violated the rules.

The new exceptions include:

  • Exemption from many of the rules for all providers of mobile broadband Internet access, including the “no unreasonable discrimination” rule and some of the “no blocking” rule.  (§§ 8.5, 8.7)
  • Explicit exemption from the “no blocking” rule for app stores and other control mechanisms used by mobile broadband providers.  (¶ 102)
  • A change from a strict “nondiscrimination” rule for wireline providers to a rule prohibiting only “unreasonable discrimination.” (§ 8.7)  (See Part III for a discussion of the difference between those two formulations.)
  • A limited definition of “broadband Internet access service” that applies the rules only to providers of a “mass market retail service” providing “the capability to transmit data to and receive data from all or substantially all Internet endpoints.”  (§ 8.11(a))  That change leaves out a range of relatively new Internet devices and services—including the Amazon Kindle, game consoles, cars, TVs and refrigerators—that offer some form of web access incidental to their main purpose in connecting to the network.  (See ¶ 47)
  • A broader definition of “reasonable network management” that includes any practice that is “appropriate and tailored to achieving a legitimate network management purpose.”  (§ 8.11(d); see also ¶ 82)
  • Exemption for virtual private networks, which use much of the same infrastructure as the public Internet. (¶ 47)
  • Exemption for Content Delivery Networks and co-located servers that put particular content in closer proximity to important network nodes and therefore speed its transmission to requesting users. (see ¶ 47 and ¶ 76 note 235)
  • Exemption for multichannel video programming services (e.g., U-verse) that use TCP/IP protocols and existing Internet infrastructure.  (¶ 47)
  • Exemption for Internet backbone services.  (¶ 47)
  • Exemption for hosting or data storage services. (¶ 47)
  • Exemptions for “coffee shops, bookstores, airlines and other entities when they acquire Internet service from a broadband provider to enable their patrons to access the Internet from their establishments.” (¶ 52)
  • Exemption from the discrimination rule for “existing arrangements for network interconnection, including existing peering arrangements.”  (¶ 67 n. 209)
  • Exemption (for now) for “specialized services,” including multichannel video programming (see above) or facilities-based VoIP, that “share capacity with broadband Internet access services over providers’ last-mile facilities.”  (¶¶ 112-114)
  • A hedge on whether “paid priority” of some content, either of the access provider or a third party, would necessarily violate the “unreasonable discrimination” rule (¶ 76), and an explicit rejection of the argument that CDNs constitute illegal “pay for priority” though they have the same effect on consumer experience as prohibited prioritization schemes.  (¶ 77)
  • Recognition that end-users may elect to acquire Internet access that limits their choice of content, including services that support parental controls or which “allow end users to choose a service that provides access to the Internet but not to pornographic websites.”  (¶ 89).  Further, “[b]roadband providers are also free under this Order to offer a wide range of ‘edited’ services,” including a “service limited to ‘family friendly’ materials.”  (¶ 143, cf. ¶ 141)
  • Recognition that existing federal law allows all Internet Service Providers to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”   (¶ 89 n. 279)

Finding the Forest Amid the Exemptions

Of course these exceptions, particularly the measured approach to mobile broadband access and the provisional reprieve for specialized services, generated howls of indignation from advocacy groups hoping for pure neutrality, and led many of the Chairman’s initial supporters to abandon him over the course of the year the NPRM was publicly and privately debated.

My concern is quite different.  I think each of these exceptions makes good sense, and will keep the new rules, at least in the short-term, from causing life-threatening damage to the Internet ecosystem.

Rather, what the laundry list of exceptions demonstrates is that the majority just isn’t seeing the forest for the trees.  What the exceptions have in common is that each represents a change to the Internet’s architecture and service models that has emerged in the last decade and a half.  They are all new services, technologies, or service providers that, in these and other ways, violate the neutrality principle.

But these innovations have been developed for beneficial, not evil purposes.  The network is better in every sense imaginable, and will continue to improve in speed, efficiency, and usability so long as future innovations don’t run afoul of the rules and their enforcement.  The Internet is not “open” in the way it may have been in 1995 (it was never as open as the idealists imagine).  But in order for the Internet we have today—faster, cheaper, better—to exist, each of these changes had to be made.

The genius of a virtual infrastructure is that it can absorb redesign without any interruption in service.  One unfortunate side-effect of that ease of transformation is that users don’t see the construction cones and highway workers.  Consumers—and the FCC—don’t realize that we’re now traveling on a multi-lane highway rather than the old dirt road.  The technology is utterly changed, and the rules of the road have changed with it.  For better or worse, but largely for the better.

The final rules, with all their exceptions, suggest a majority clinging to the idealized past, and a stubborn refusal in the end to admit that the Internet has changed and continues to change—that it needs to change.

The exceptions for the “inconsistent” behavior of CDNs, specialized services, peering arrangements, e-readers and game consoles, and app stores have no logical rationale, other than that the FCC has now learned they are part of the current status quo.  They are being exempted because they are in place, and they work.

For example, paying a CDN to replicate your content and co-locate servers at key network access points is surely “paying for priority.”  It puts a start-up offering similar content but without the funds for similar services at a competitive disadvantage.  The cached content will arrive faster when requested by a consumer.  But for consumers, that feature is a good thing—an improvement—even though it is not “neutral.”
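A back-of-the-envelope sketch, in Python, shows why: a cache hit at a nearby edge server skips the long round trip to the origin, so the paying provider’s content simply arrives sooner.  The latency figures are invented for illustration.

```python
# Toy edge cache: the first request pays the origin round trip; every
# request after that is served from the nearby edge. Times are invented.
EDGE_RTT_MS = 10     # assumed user-to-edge round trip
ORIGIN_RTT_MS = 120  # assumed edge-to-origin round trip

def fetch_time(obj: str, edge_cache: set) -> int:
    """Milliseconds to deliver obj to the requesting user."""
    if obj in edge_cache:
        return EDGE_RTT_MS               # cache hit: served from the edge
    edge_cache.add(obj)                  # cache miss: store for next time
    return EDGE_RTT_MS + ORIGIN_RTT_MS   # plus the trip to the origin

cache = set()
print(fetch_time("video.mp4", cache))  # 130 ms on the first request
print(fetch_time("video.mp4", cache))  # 10 ms for everyone after
```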

Likewise, the mobile Internet is given special treatment because it is “evolving rapidly.” (¶ 8)  But the fixed Internet is evolving rapidly as well, as many of these exemptions implicitly recognize.

The majority is fixated on maintaining a neutral Internet even though it now understands that neutrality is a virtue more honored in the breach.  The final report uses the word “traditionally” 25 times, the word “historically” 9 times, and the word “typically” 21 times.  These are the only justifications for the exceptions, and they undermine the purpose of the rules that remain.  There is no neutral Internet to preserve.  There’s only one that works.

The reality is that we’re moving away from websites and toward a mobile, app-based economy, specialized services, and high-bandwidth applications such as video, which shouldn’t all be treated the same.  A “level playing field” doesn’t mean everyone gets a trophy.

The good news is that the final rules grandfather in many existing technologies that violate the neutrality principle.  That’s essential, even if each of the exceptions is granted in isolation and begrudgingly at that.

But the bad news is that the open Internet regulations as approved allow little flexibility for future innovations in network optimization.  The FCC sees ominous clouds of non-neutral and therefore prohibited behavior on the network horizon, even though tomorrow’s violations are only as dangerous as the “traditions” that have been established up until this random moment in Internet time.  The vote comes at a politically significant moment, but not a time that has any particular meaning for the network’s engineering.  The new rules, in the worst case, may arbitrarily freeze today’s particular status quo, for no good (and lots of bad) reasons.

Nostalgia can be fun.  I enjoy sitting around with my fellow veterans of the pre-bubble dot-com boom, talking about the good old days and toasting our irrational exuberance.  But translating that wistfulness into federal law—even, as here, with rules pockmarked by the blemishes of a reality that looks far different from our idealized view of the past—is a dangerous way to celebrate it.

Next:  Not to worry.  The FCC has no authority, either.
