On Friday evening, I posted on CNET a detailed analysis of the most recent proposal to surface from the secretive upcoming World Conference on International Telecommunications (WCIT-12). The conference will discuss updates to a 1988 UN treaty administered by the International Telecommunication Union (ITU), and throughout the year there have been reports that both governmental and non-governmental members of the ITU have been trying to use the rewrite to put the ITU squarely in the Internet business.
The Russian Federation’s proposal, which was submitted to the ITU on Nov. 13, would explicitly bring “IP-based Networks” under the auspices of the ITU and, in particular, would substantially if not completely change the role of ICANN in overseeing domain names and IP addresses.
According to the proposal, “Member States shall have the sovereign right to manage the Internet within their national territory, as well as to manage national Internet domain names.” And a second revision, also aimed straight at the heart of today’s multi-stakeholder process, reads: “Member States shall have equal rights in the international allocation of Internet addressing and identification resources.” Continue reading →
Yesterday AT&T announced that it would invest an additional $14 billion in the next three years to expand its 4G LTE network to cover 300 million people and expand its wired all-IP broadband infrastructure to 75 percent of its customer locations throughout its 22-state wired service area. For many consumers, this investment will provide their first opportunity for access to high-speed broadband at home. For many others, it will provide their first opportunity to make a choice among competing providers of high-speed broadband services. This impressive commitment to transition outdated communications infrastructure to an all-IP future will benefit millions of consumers and accelerate our Internet transformation nationwide. Continue reading →
Yesterday it was my privilege to speak at a Free State Foundation (FSF) event on “Ideas for Communications Law and Policy Reform in 2013.” It was moderated by my friend and former colleague Randy May, who is president of FSF, and the event featured opening remarks from the always-excellent FCC Commissioner Robert McDowell.
During the panel discussion that followed, I offered my thoughts about the problem America continues to face in cleaning up communications and media law and proposed a few ideas to get reform done right once and for all. I don’t have time to formally write up my remarks, but I thought I would just post the speech notes that I used yesterday and include links to the relevant supporting materials. (I’ve been using a canned version of this same speech at countless events over the past 15 years. Hopefully lawmakers will take up some of these reforms some time soon so I’m not using this same set of remarks in 2027!)
Continue reading →
We spend a lot of time here defending the simple proposition that flexible free-market pricing is a good thing. You would think that in 2012 we wouldn’t need to do so, but there’s a growing movement afoot among some academics, regulatory activists, and public policymakers to have government start asserting more authority over broadband pricing. In particular, they want Congress, the FCC, or state officials to investigate and possibly even regulate efforts by wireline and wireless broadband carriers to use usage-based pricing and data caps as a method of calibrating supply and demand. This was the focus of my latest weekly Forbes column, “The Specter Of Broadband Price Controls.” In the piece I note that:
Data caps and usage-based pricing are forms of what economists refer to as price discrimination. Although viewed with suspicion by some policymakers and regulatory-minded academics and activists, price discrimination is widely recognized to improve consumer welfare. Price-differentiated and prioritized services are part of almost every industrial sector in our capitalist economy. Notable examples include airline and hotel reservations, prioritized shipping services, amusement park passes, and fuel and energy pricing. Economists agree that price discrimination represents a sensible way to calibrate supply and demand while ensuring the fixed costs of doing business get covered. Consumers benefit from such pricing experimentation by gaining more options while firms gain more certainty about investment and service decisions.
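To make the cost-recovery logic concrete, here is a toy sketch in Python. Every number in it is hypothetical (mine, not the column’s); the point is simply that a metered plan can recover the same total costs as a flat-rate plan while light users stop subsidizing heavy ones.

```python
# Toy comparison of flat-rate vs. usage-based broadband pricing.
# All figures are hypothetical, chosen only to illustrate the cross-subsidy logic.

users = {"light": {"count": 80, "gb_per_month": 20},
         "heavy": {"count": 20, "gb_per_month": 300}}

FIXED_COST = 3000.0          # network fixed costs to recover ($/month)
MARGINAL_COST_PER_GB = 0.05  # incremental cost of carrying traffic ($/GB)

total_gb = sum(u["count"] * u["gb_per_month"] for u in users.values())
total_cost = FIXED_COST + MARGINAL_COST_PER_GB * total_gb
n_users = sum(u["count"] for u in users.values())

flat_price = total_cost / n_users   # flat rate: everyone pays an equal share
base_fee = FIXED_COST / n_users     # metered plan: base fee covers fixed costs

for name, u in users.items():
    metered_bill = base_fee + MARGINAL_COST_PER_GB * u["gb_per_month"]
    print(f"{name}: flat ${flat_price:.2f}/mo vs. metered ${metered_bill:.2f}/mo")
```

Under these made-up figures every subscriber pays $33.80 on the flat-rate plan, while the metered plan bills light users $31.00 and heavy users $45.00; the carrier recovers exactly the same $3,380 either way.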
This is confirmed by an excellent new Mercatus Center working paper on “The Impact of Data Caps and Other Forms of Usage-Based Pricing for Broadband Access,” by Daniel A. Lyons, an assistant professor of law at Boston College Law School. Lyons explains why a return to price controls for communications would be monumentally misguided. Continue reading →
Today the Mercatus Center at George Mason University has released a new working paper by Boston College Law School Professor Daniel Lyons entitled, “The Impact of Data Caps and Other Forms of Usage-Based Pricing for Broadband Access.”
There’s been much hand-wringing about fixed and mobile broadband services increasingly looking to move to usage-based pricing or to impose data caps. Some have even suggested an outright ban on the practice. As Adam Thierer has catalogued in these pages, the ‘net neutrality’ debate has in many ways been leading to this point: pricing flexibility vs. price controls.
In his new paper, Lyons explores the implications of this trend toward usage-based pricing. He finds that data caps and other forms of metered consumption are not inherently anti-consumer or anticompetitive.
Rather, they reflect different pricing strategies through which a broadband company may recover its costs from its customer base and fund future infrastructure investment. By aligning costs more closely with use, usage-based pricing may effectively shift more network costs onto those consumers who use the network the most. Companies can thus avoid forcing light Internet users to subsidize the data-heavy habits of online gamers and movie torrenters. Usage-based pricing may also help alleviate network congestion by encouraging customers, content providers, and network operators to use broadband more efficiently.
Opponents of usage-based pricing have noted that data caps may be deployed for anticompetitive purposes. But data caps can be a problem only when a firm with market power exploits that power in a way that harms consumers. Absent a specific market failure, which critics have not yet shown, broadband providers should be free to experiment with usage-based pricing and other pricing strategies as tools in their arsenal to meet rising broadband demand. Public policies allowing providers the freedom to experiment best preserve the spirit of innovation that has characterized the Internet since its inception.
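The abstract’s congestion point can be sketched the same way. In the toy model below (again, all parameters are hypothetical and mine, not Lyons’s), metered customers shift some discretionary traffic off-peak, which lowers peak-hour load on a capacity-constrained link even if total monthly consumption stays the same.

```python
# Toy model of peak-hour load under flat-rate vs. metered pricing.
# All parameters are hypothetical, chosen only to illustrate the mechanism.

SUBSCRIBERS = 100
PEAK_GB_EACH = 2.0   # discretionary peak-hour traffic per subscriber (GB)
SHIFT_SHARE = 0.3    # share of that traffic a metered customer moves off-peak

def peak_load(metered: bool) -> float:
    """Aggregate peak-hour demand in GB across all subscribers."""
    shifted = SHIFT_SHARE if metered else 0.0
    return SUBSCRIBERS * PEAK_GB_EACH * (1 - shifted)

print(f"flat-rate peak load: {peak_load(metered=False):.0f} GB")  # 200 GB
print(f"metered peak load:   {peak_load(metered=True):.0f} GB")   # 140 GB
# The shifted traffic is still carried, just off-peak; relieving the peak
# is what matters on a capacity-constrained network.
```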
Lyons does a magnificent job of walking the reader through every aspect of the usage-based pricing issue, its benefits as a cost-recovery and congestion management tool, and its potential anticompetitive effects. “Ultimately, data caps and other pricing strategies are ways that broadband companies can distinguish themselves from one another to achieve a competitive advantage in the marketplace,” he concludes. “When firms experiment with different business models, they can tailor services to niche audiences whose interests are inadequately satisfied by a one-size-fits-all flat-rate plan. Absent anticompetitive concerns, public policy should encourage companies to experiment with different pricing models as a way to compete against one another.”
Looking for a concise overview of how Internet architecture has evolved and a principled discussion of the public policies that should govern the Net going forward? Then look no further than Christopher Yoo’s new book, The Dynamic Internet: How Technology, Users, and Businesses are Transforming the Network. It’s a quick read (just 140 pages) and is worth picking up. Yoo is a Professor of Law, Communication, and Computer & Information Science at the University of Pennsylvania and also serves as the Director of the Center for Technology, Innovation & Competition there. For those who monitor ongoing developments in cyberlaw and digital economics, Yoo is a well-known and prolific intellectual who has established himself as one of the giants of this rapidly growing policy arena.
Yoo makes two straightforward arguments in his new book. First, the Internet is changing. In Part 1 of the book, Yoo offers a layman-friendly overview of the changing dynamics of Internet architecture and engineering. He documents the evolving nature of Internet standards, traffic management and congestion policies, spam and security control efforts, and peering and pricing policies. He also discusses the rise of peer-to-peer applications, the growth of mobile broadband, the emergence of the app store economy, and what the explosion of online video consumption means for ongoing bandwidth management efforts. Those are the supply-side issues. Yoo also outlines the implications of changes on the demand side of the equation, such as changing user demographics and rapidly evolving demands from consumers. He notes that these new demand-side realities of Internet usage are resulting in changes to network management and engineering, further reinforcing changes already underway on the supply side.
Yoo’s second point in the book flows logically from the first: as the Internet continues to evolve in such a highly dynamic fashion, public policy must evolve as well. Yoo is particularly worried about calls to lock in standards, protocols, and policies from what he regards as a bygone era of Internet engineering, architecture, and policy. “The dramatic shift in Internet usage suggests that its founding architectural principles from the mid-1990s may no longer be appropriate today,” he argues. (p. 4) “[T]he optimal network architecture is unlikely to be static. Instead, it is likely to be dynamic over time, changing with the shifts in end-user demands,” he says. (p. 7) Thus, “the static, one-size-fits-all approach that dominates the current debate misses the mark.” (p. 7) Continue reading →
Vinton Cerf, one of the “fathers of the internet,” discusses what he sees as one of the greatest threats to the internet: the encroachment of the United Nations’ International Telecommunication Union (ITU) into the internet realm. ITU member states will meet this December in Dubai to update international telecommunications regulations and consider proposals to regulate the net. Cerf argues that, as the face of telecommunications is changing, the ITU is attempting to justify its continued existence by expanding its mandate to include the internet. Cerf says that the business model of the internet is fundamentally different from that of traditional telecommunications, and as a result, the ITU’s regulatory model will not work. In place of top-down ITU regulation, Cerf suggests that open multi-stakeholder processes and bilateral agreements may be better solutions to the challenges of governance on the internet.
Tomorrow the Information Economy Project at George Mason University will present the latest installment of its Tullock Lecture series, featuring Dr. Bronwyn Howell of the New Zealand Institute for the Study of Competition and Regulation. Here is the notice:
Dr. Bronwyn Howell – Tuesday, Sept. 25, 2012
New Zealand Institute for the Study of Competition and Regulation
4:00 to 5:30 pm @ Founder’s Hall Room 111, GMU School of Law, 3301 Fairfax Drive, Arlington, Va. Reception to follow in the Levy Atrium, 5:30-6:30 pm. Admission is free but seating is limited.
“Regulating Broadband Networks: The Global Data for Evidence-Based Public Policy”: Policy makers in the U.S. and around the world are wrestling with “the broadband problem” – how to get advanced forms of Internet access to businesses and consumers. A variety of regulatory approaches have been used, some focusing on incentives to drive deployment of rival networks, others on network sharing mandates or government subsidies. Despite a wealth of diverse experience, there seems to be a great deal of confusion about what the data actually suggest. Few people, however, have studied these data more carefully than New Zealand economist Bronwyn Howell, who will frame the lessons of the global broadband marketplace. Prof. Howell will be introduced by Dr. Scott Wallsten, Senior Fellow at the Technology Policy Institute, who served as Economics Director for the FCC’s National Broadband Plan. RSVP online here or by email to iep.gmu@gmail.com.
I’ve been hearing more rumblings about “API neutrality” lately. This idea, which originated with Jonathan Zittrain’s book, The Future of the Internet–And How to Stop It, proposes to apply Net neutrality to the code/application layer of the Internet. A blog called “The API Rating Agency,” which appears to be written by Mehdi Medjaoui, posted an essay last week endorsing Zittrain’s proposal and adding some meat to the bones of it. (My thanks to CNET’s Declan McCullagh for bringing it to my attention.)
Medjaoui is particularly worried about some of Twitter’s recent moves to crack down on third-party API uses. Twitter is trying to figure out how to monetize its platform and, in a digital environment where advertising seems to be the only business model that works, the company has decided to establish more restrictive guidelines for API use. In essence, Twitter believes it can no longer be a perfectly open platform if it hopes to find a way to make money. The company apparently believes that some restrictions will need to be placed on third-party uses of its API if it hopes to attract and monetize enough eyeballs.
While no one is sure whether that strategy will work, Medjaoui doesn’t even want the experiment to go forward. Building on Zittrain, he proposes the following approach to API neutrality (a rough code sketch of what these rules might look like follows the list):
- Absolute non-discrimination in third-party data access: all content, data, and views are distributed equally across the third-party ecosystem. Even a competitor could use an API on the same terms as everyone else, with unrestricted re-use of the data.
- Limited discrimination without tiering: if you don’t pay specific fees for quality of service, you cannot receive better quality of service (rate limits, quotas, SLAs) than anyone else in the API ecosystem. If you pay for a higher quality of service, you receive that higher quality of service, but on the same terms as any other customer paying the same fee.
- First come, first served: API calls from paying third-party applications may not be queued ahead of calls from free third-party applications, so long as the free applications stay within their rate limits.
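To see what these rules would mean in practice, here is a minimal sketch of an “API-neutral” gateway. This is my own illustration, not code from Zittrain or Medjaoui, and the class, tier names, and limits are all hypothetical: it gives every caller in a tier the same quota and dispatches requests strictly first come, first served, so a paying app cannot jump ahead of a free app that is within its limits.

```python
import time
from collections import deque

# Hypothetical sketch of an "API-neutral" gateway: identical quotas for every
# caller within a tier, and strict first-come-first-served dispatch, so paying
# apps cannot jump ahead of free apps that are within their rate limits.

TIER_LIMITS = {"free": 60, "paid": 600}  # requests per minute, per caller

class NeutralGateway:
    def __init__(self):
        self.queue = deque()  # a single FCFS queue shared by all tiers
        self.windows = {}     # caller -> timestamps of recent calls

    def _within_limit(self, caller: str, tier: str) -> bool:
        now = time.time()
        recent = [t for t in self.windows.get(caller, []) if now - t < 60]
        self.windows[caller] = recent
        return len(recent) < TIER_LIMITS[tier]

    def submit(self, caller: str, tier: str, request: str) -> None:
        self.queue.append((caller, tier, request))  # arrival order only

    def dispatch(self) -> None:
        """Serve queued requests in arrival order; no tier-based reordering."""
        while self.queue:
            caller, tier, request = self.queue.popleft()
            if self._within_limit(caller, tier):
                self.windows.setdefault(caller, []).append(time.time())
                print(f"served {request} for {caller} ({tier})")
            else:
                print(f"rate-limited {caller} ({tier})")

gw = NeutralGateway()
gw.submit("free_app", "free", "GET /timeline")
gw.submit("paid_app", "paid", "GET /timeline")
gw.dispatch()  # free_app is served first because it arrived first
```

The single shared queue is the neutrality-enforcing piece; a conventional gateway would be free to give paying callers a priority lane.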
Before I critique this, let’s go back and recall why Zittrain suggested we might need API neutrality for certain online services or digital platforms. Continue reading →
Ryan Radia, associate director of technology studies at the Competitive Enterprise Institute, discusses the amicus brief he helped author in the case of Verizon v. Federal Communications Commission, now before the D.C. Circuit Court of Appeals. Radia analyzes the case, which will determine the fate of the FCC’s net neutrality rule. While Verizon is arguing that the FCC does not have the authority to issue such rules, Radia says that the constitutional implications of the net neutrality rule are more important. He explains that the amicus brief outlines both First and Fifth Amendment arguments against the rule, stating that net neutrality impinges on the speech of Internet service providers and constitutes an illegal taking of their private property.