For some time now I’ve been trying to teach myself how to program. I’m proficient in HTML and CSS, and I could always tinker around the edges of PHP, but I really couldn’t code something from scratch to save my life. Well, there’s no better way to learn than by doing, at least when it comes to programming, so I gave myself a project to complete and, by golly, I did it. It’s called TechWire. It’s a tech policy news aggregator, and I’m making it available on the web because I think it might be useful to other tech policy nerds.

Quite simply, it’s a semi-curated, reverse-chronological list of the most recent tech policy news, with links to the original sources. And it’s not just news stories. You’ll also see opinion columns, posts from the various policy shops around town, new scholarly papers, and new books on tech policy. You can also drill down into a single category to see just the latest news, just the latest opinions, or just the latest papers. Leave the page open in a tab and the site auto-refreshes as new items come in. Alternatively, you can follow the Twitter feed at @TechWireFTW to get the latest.
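For the code-curious: the heart of a site like this is really just a timestamped list of links served newest-first. Here’s a rough sketch, in the spirit of the PHP I’ve been tinkering with, of the kind of query that could drive such a feed. The database, table, and column names are hypothetical; this is not TechWire’s actual code.

```php
<?php
// Illustrative sketch only, not TechWire's actual code. Assumes a
// hypothetical "items" table with title, url, source, category, and
// published_at columns.
$pdo = new PDO('sqlite:aggregator.db');

// Optional category filter, e.g. ?category=opinion
$category = isset($_GET['category']) ? $_GET['category'] : null;

$sql    = 'SELECT title, url, source, published_at FROM items';
$params = [];
if ($category !== null) {
    $sql .= ' WHERE category = :category';
    $params[':category'] = $category;
}
$sql .= ' ORDER BY published_at DESC LIMIT 50'; // newest first

$stmt = $pdo->prepare($sql);
$stmt->execute($params);

echo "<ul>\n";
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $item) {
    printf("<li><a href=\"%s\">%s</a> (%s)</li>\n",
        htmlspecialchars($item['url']),
        htmlspecialchars($item['title']),
        htmlspecialchars($item['source']));
}
echo "</ul>\n";
```

The auto-refresh can then be as simple as a bit of JavaScript polling an endpoint like this every few minutes and swapping in the new list.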

Other features include the ability to look up the news for a particular day in the past and to click on a story to see what other stories are related. That’s especially useful if you want to see how different outlets are covering the same issue, or how an issue has developed over time. Just click the linked timestamp at the end of a story to see related posts.
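Again, just to illustrate (a hypothetical sketch, not necessarily how TechWire does it), a “related posts” lookup could be as simple as finding other items that share a tag with the story you clicked:

```php
<?php
// Hypothetical sketch of a "related posts" query: other items sharing a
// tag with the clicked story, newest first. Assumes hypothetical "items"
// and "item_tags" tables; not TechWire's actual code.
$pdo    = new PDO('sqlite:aggregator.db');
$itemId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

$sql = 'SELECT DISTINCT i.title, i.url, i.published_at
          FROM items i
          JOIN item_tags t  ON t.item_id = i.id
          JOIN item_tags me ON me.tag = t.tag
         WHERE me.item_id = ? AND i.id <> ?
         ORDER BY i.published_at DESC
         LIMIT 25';

$stmt = $pdo->prepare($sql);
$stmt->execute([$itemId, $itemId]);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $related) {
    printf("<li><a href=\"%s\">%s</a></li>\n",
        htmlspecialchars($related['url']),
        htmlspecialchars($related['title']));
}
```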

I hope TechWire (at http://techwireftw.com/) is as useful to you as it was fun for me to code. I’d like to thank some folks who really helped me along the way: Pete Snyder and Eli Dourado for putting up with my dumb questions, Cord Blomquist for great hosting and serene patience, Adam Thierer for being the best QA department I could wish for, and my wife Kathleen for putting up with me staring at the computer for hours.

Today the Mercatus Center at George Mason University has released a new working paper by Boston College Law School Professor Daniel Lyons entitled, “The Impact of Data Caps and Other Forms of Usage-Based Pricing for Broadband Access.”

There’s been much hand-wringing as fixed and mobile broadband providers increasingly look to move to usage-based pricing or to impose data caps. Some have even suggested an outright ban on the practice. As Adam Thierer has catalogued in these pages, the ‘net neutrality’ debate has in many ways been leading to this point: pricing flexibility vs. price controls.

In his new paper, Lyons explores the implications of this trend toward usage-based pricing. He finds that data caps and other forms of metered consumption are not inherently anti-consumer or anticompetitive.

Rather, they reflect different pricing strategies through which a broadband company may recover its costs from its customer base and fund future infrastructure investment. By aligning costs more closely with use, usage-based pricing may effectively shift more network costs onto those consumers who use the network the most. Companies can thus avoid forcing light Internet users to subsidize the data-heavy habits of online gamers and movie torrenters. Usage-based pricing may also help alleviate network congestion by encouraging customers, content providers, and network operators to use broadband more efficiently.

Opponents of usage-based pricing have noted that data caps may be deployed for anticompetitive purposes. But data caps can be a problem only when a firm with market power exploits that power in a way that harms consumers. Absent a specific market failure, which critics have not yet shown, broadband providers should be free to experiment with usage-based pricing and other pricing strategies as tools in their arsenal to meet rising broadband demand. Public policies allowing providers the freedom to experiment best preserve the spirit of innovation that has characterized the Internet since its inception.

Lyons does a magnificent job of walking the reader through every aspect of the usage-based pricing issue, its benefits as a cost-recovery and congestion management tool, and its potential anticompetitive effects. “Ultimately, data caps and other pricing strategies are ways that broadband companies can distinguish themselves from one another to achieve a competitive advantage in the marketplace,” he concludes. “When firms experiment with different business models, they can tailor services to niche audiences whose interests are inadequately satisfied by a one-size-fits-all flat-rate plan. Absent anticompetitive concerns, public policy should encourage companies to experiment with different pricing models as a way to compete against one another.”

Scott Shackelford, assistant professor of business law and ethics at Indiana University, and author of the soon-to-be-published book Managing Cyber Attacks in International Law, Business, and Relations: In Search of Cyber Peace, explains how polycentric governance could be the answer to modern cybersecurity concerns.

Shackelford originally began researching collective action problems in physical commons, including Antarctica, the deep sea bed, and outer space, where he discovered the efficacy of polycentric governance in addressing these issues. Noting the similarities between these communally owned resources and the Internet, Shackelford was drawn to the idea of polycentric governance as a solution to the collective action problems he identified in the online realm, particularly when it came to cybersecurity.

Shackelford contrasts the bottom-up form of governance characterized by self-organization and networking regulations at multiple levels with the increasingly state-centric approach prevailing in forums like the International Telecommunication Union (ITU). Analyzing the debate between Internet sovereignty and Internet freedom through the lens of polycentric regulation, Shackelford reconceptualizes both cybersecurity and the future of Internet governance.

Looking for a concise overview of how Internet architecture has evolved and a principled discussion of the public policies that should govern the Net going forward? Then look no further than Christopher Yoo’s new book, The Dynamic Internet: How Technology, Users, and Businesses Are Transforming the Network. It’s a quick read (just 140 pages) and is worth picking up. Yoo is a Professor of Law, Communication, and Computer & Information Science at the University of Pennsylvania and also serves as the Director of the Center for Technology, Innovation & Competition there. For those who monitor ongoing developments in cyberlaw and digital economics, Yoo is a well-known and prolific intellectual who has established himself as one of the giants of this rapidly growing policy arena.

Yoo makes two straightforward arguments in his new book. First, the Internet is changing. In Part 1 of the book, Yoo offers a layman-friendly overview of the changing dynamics of Internet architecture and engineering. He documents the evolving nature of Internet standards, traffic management and congestion policies, spam and security control efforts, and peering and pricing policies. He also discusses the rise of peer-to-peer applications, the growth of mobile broadband, the emergence of the app store economy, and what the explosion of online video consumption means for ongoing bandwidth management efforts. Those are the supply-side issues. Yoo also outlines the implications of changes on the demand side of the equation, such as changing user demographics and rapidly evolving demands from consumers. He notes that these new demand-side realities of Internet usage are resulting in changes to network management and engineering, further reinforcing changes already underway on the supply side.

Yoo’s second point in the book flows logically from the first: as the Internet continues to evolve in such a highly dynamic fashion, public policy must as well. Yoo is particularly worried about calls to lock in standards, protocols, and policies from what he regards as a bygone era of Internet engineering, architecture, and policy. “The dramatic shift in Internet usage suggests that its founding architectural principles from the mid-1990s may no longer be appropriate today,” he argues. (p. 4) “[T]he optimal network architecture is unlikely to be static. Instead, it is likely to be dynamic over time, changing with the shifts in end-user demands,” he says. (p. 7) Thus, “the static, one-size-fits-all approach that dominates the current debate misses the mark.” (p. 7) Continue reading →

Designer Dan Provost, co-founder of the indie hardware and software company Studio Neat, and co-author of It Will Be Exhilarating: Indie Capitalism and Design Entrepreneurship in the 21st Century, discusses how technological innovation helped him build his business. Provost explains how he and his co-founder Tom Gerhardt were able to rely on crowdfunding to finance their business. Avoiding loans or investors, he says, has allowed them to more freely experiment and innovate. Provost also credits 3D printing for his company’s success, saying their hardware designs (very popular tripod mounts for the iPhone and a stylus for the iPad) would not have been possible without the quick-prototyping technology.

If the FCC had adopted the eligibility restrictions proposed by PISC in 2007, the United States would not have achieved the LTE leadership touted by current FCC Chairman Genachowski.

I was pleased to see FCC Chairman Genachowski praise the market-based policies of his predecessors in his remarks at Vox last week. He noted that the United States is currently leading the world in next generation mobile wireless services with 69 percent of the world’s LTE subscribers, which he attributes to “smart government policies.” He didn’t mention, however, that the “smart government policies” that led to America’s renewed mobile leadership were based on market principles adopted by the previous FCC. Continue reading →

On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from imposing any new regulation on Voice over Internet Protocol (VoIP) and other IP-based services without the legislature’s authorization.

California now joins over twenty states that have enacted similar legislation.

The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February. It passed both houses of the California legislature with wide bipartisan majorities.

California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.

Whatever the cost-benefit calculus of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice, or Apple’s FaceTime. Continue reading →

Vinton Cerf, one of the “fathers of the internet,” discusses what he sees as one of the greatest threats to the internet: the encroachment of the United Nations’ International Telecommunication Union (ITU) into the internet realm. ITU member states will meet this December in Dubai to update international telecommunications regulations and consider proposals to regulate the net. Cerf argues that, as the face of telecommunications is changing, the ITU is attempting to justify its continued existence by expanding its mandate to include the internet. Cerf says that the business model of the internet is fundamentally different from that of traditional telecommunications, and as a result, the ITU’s regulatory model will not work. In place of top-down ITU regulation, Cerf suggests that open multi-stakeholder processes and bilateral agreements may be better solutions to the challenges of governance on the internet.

Tomorrow the Information Economy Project at George Mason University will present the latest installment of its Tullock Lecture series, featuring Dr. Bronwyn Howell of the New Zealand Institute for the Study of Competition and Regulation. Here is the notice:

Dr. Bronwyn Howell – Tuesday, Sept. 25, 2012
New Zealand Institute for the Study of Competition and Regulation
4:00 to 5:30 pm @ Founder’s Hall Room 111, GMU School of Law, 3301 Fairfax Drive, Arlington, Va.
Reception to follow in the Levy Atrium, 5:30 to 6:30 pm.
Admission is free but seating is limited.

“Regulating Broadband Networks: The Global Data for Evidence-Based Public Policy”: Policy makers in the U.S. and around the world are wrestling with “the broadband problem” – how to get advanced forms of Internet access to businesses and consumers. A variety of regulatory approaches have been used, some focusing on incentives to drive deployment of rival networks, others on network sharing mandates or government subsidies. Despite a wealth of diverse experience, there seems to be a great deal of confusion about what the data actually suggest. Few people have studied these data more carefully, however, than New Zealand economist Bronwyn Howell, who will frame the lessons of the global broadband marketplace. Prof. Howell will be introduced by Dr. Scott Wallsten, Senior Fellow at the Technology Policy Institute, who served as Economics Director for the FCC’s National Broadband Plan. RSVP online here or by email to iep.gmu@gmail.com.

Ryan Radia recently posted an impassioned and eminently reasonable defense of copyright with which I generally agree, especially since he acknowledges that “our Copyright Act abounds with excesses and deficiencies[.]” However, Ryan does this in the context of defending broadcaster rights against internet retransmitters, such as ivi and Aereo, and I have a bone to pick with that. He writes,

[Copyright] is why broadcasters may give their content away for free to anybody near a metropolitan area who has an antenna and converter box, while simultaneously preventing third parties like ivi from distributing the same exact content (whether free of charge or for a fee). At first, this may seem absurd, but consider how many websites freely distribute their content on the terms they see fit. That’s why I can read all the Techdirt articles I desire, but only on Techdirt’s website. If copyright protection excluded content distributed freely to the general public, creators of popular ad-supported content would soon find others reproducing their content with fewer ads.

I think what Ryan is missing is that copyright is not why broadcasters give away their content for free over the air. The real reason is that they are required to do so as a condition of their broadcast license. In exchange for free access to one of the main inputs of their business, spectrum, broadcasters agree to make their signal freely available to the public. Also, the fact that TV stations broadcast to metro areas (and not regionally or nationally) is not the product of technical limitations or business calculus; it’s the result of the FCC’s decision to offer only metro-sized licenses in the name of “localism.” That’s not a system I like, but it’s the system we have.

So, if what the public gets for giving broadcasters free spectrum is the right to put up an antenna and grab the signals without charge, why does it matter how they do it? To me, a service like Aereo is just an antenna with a very long cable to one’s home, just as the Supreme Court found with respect to CATV systems in Fortnightly. What broadcasters are looking to do is double-dip. They want free spectrum, but then they also want to use copyright to limit how the public can access their over-the-air signals. To address Ryan’s analogy above, Techdirt is not like a broadcaster because it isn’t getting anything from the government in exchange for a “public interest” obligation.

Ideally, of course, spectrum would be privatized. In that world I think we’d see little if any ad-supported broadcast TV because there are much better uses for the spectrum. If there were any broadcast TV, it would be national or regional, as there is hardly any market for local content. And the signal would likely be encrypted and pay-per-view, not free over-the-air. In such a world, the copyright system Ryan favors makes sense, but that’s not the world we live in. As long as the broadcasters are getting free goodies like spectrum and must-carry, their copyright claims ring hollow.