On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from imposing any new regulation on Voice over Internet Protocol (VoIP) or other IP-based services without the legislature’s authorization.
California now joins over twenty states that have enacted similar legislation.
The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February. It passed both houses of the California legislature with wide bipartisan majorities.
California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.
Whatever the cost-benefit calculus of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice, or Apple’s FaceTime.
Vinton Cerf, one of the “fathers of the internet,” discusses what he sees as one of the greatest threats to the internet—the encroachment of the United Nations’ International Telecommunication Union (ITU) into the internet realm. ITU member states will meet this December in Dubai to update international telecommunications regulations and consider proposals to regulate the net. Cerf argues that, as the face of telecommunications changes, the ITU is attempting to justify its continued existence by expanding its mandate to include the internet. Cerf says that the business model of the internet is fundamentally different from that of traditional telecommunications, and as a result, the ITU’s regulatory model will not work. In place of top-down ITU regulation, Cerf suggests that open multi-stakeholder processes and bilateral agreements may be better solutions to the challenges of internet governance.
Tomorrow the Information Economy Project at George Mason University will present the latest installment of its Tullock Lecture series, featuring Dr. Bronwyn Howell of the New Zealand Institute for the Study of Competition and Regulation. Here is the notice:
Dr. Bronwyn Howell – Tuesday, Sept. 25, 2012
New Zealand Institute for the Study of Competition and Regulation
4:00 to 5:30 pm @ Founder’s Hall Room 111, GMU School of Law, 3301 Fairfax Drive, Arlington, Va. Reception to follow in the Levy Atrium, 5:30-6:30 pm. Admission is free, but seating is limited.
“Regulating Broadband Networks: The Global Data for Evidence-Based Public Policy”: Policy makers in the U.S. and around the world are wrestling with “the broadband problem” – how to get advanced forms of Internet access to businesses and consumers. A variety of regulatory approaches have been used, some focusing on incentives to drive deployment of rival networks, others on network sharing mandates or government subsidies. Despite a wealth of diverse experience, there seems to be a great deal of confusion about what the data actually suggest. Few people have studied these data more carefully, however, than New Zealand economist Bronwyn Howell, who will frame the lessons of the global broadband marketplace. Prof. Howell will be introduced by Dr. Scott Wallsten, Senior Fellow at the Technology Policy Institute, who served as Economics Director for the FCC’s National Broadband Plan. RSVP online here or by email to iep.gmu@gmail.com.
Ryan Radia recently posted an impassioned and eminently reasonable defense of copyright with which I generally agree, especially since he acknowledges that “our Copyright Act abounds with excesses and deficiencies[.]” However, Ryan does this in the context of defending broadcaster rights against internet retransmitters, such as ivi and Aereo, and I have a bone to pick with that. He writes,
[Copyright] is why broadcasters may give their content away for free to anybody near a metropolitan area who has an antenna and converter box, while simultaneously preventing third parties like ivi from distributing the same exact content (whether free of charge or for a fee). At first, this may seem absurd, but consider how many websites freely distribute their content on the terms they see fit. That’s why I can read all the Techdirt articles I desire, but only on Techdirt’s website. If copyright protection excluded content distributed freely to the general public, creators of popular ad-supported content would soon find others reproducing their content with fewer ads.
I think what Ryan is missing is that copyright is not why broadcasters give away their content for free over the air. The real reason is that they are required to do so as a condition of their broadcast license. In exchange for free access to one of the main inputs of their business (spectrum), broadcasters agree to make their signal freely available to the public. Also, the fact that TV stations broadcast to metro areas (and not regionally or nationally) is not the product of technical limitations or business calculus, but of the FCC’s decision to offer only metro-sized licenses in the name of “localism.” That’s not a system I like, but it’s the system we have.
So, if what the public gets for giving broadcasters free spectrum is the right to put up an antenna and grab the signals without charge, why does it matter how they do it? To me, a service like Aereo is just an antenna with a very long cable to one’s home, much as the Supreme Court concluded about CATV systems in Fortnightly. What broadcasters are looking to do is double-dip. They want free spectrum, but they also want to use copyright to limit how the public can access their over-the-air signals. To address Ryan’s analogy from above, Techdirt is not like a broadcaster because it isn’t getting anything from the government in exchange for a “public interest” obligation.
Ideally, of course, spectrum would be privatized. In that world I think we’d see little if any ad-supported broadcast TV because there are much better uses for the spectrum. If there were any broadcast TV, it would be national or regional, as there is hardly any market for local content. And the signal would likely be encrypted and pay-per-view, not free over-the-air. In such a world the copyright system Ryan favors makes sense, but that’s not the world we live in. As long as broadcasters are getting free goodies like spectrum and must-carry, their copyright claims ring hollow.
I’ve been hearing more rumblings about “API neutrality” lately. This idea, which originated with Jonathan Zittrain’s book, The Future of the Internet–And How to Stop It, proposes to apply Net neutrality to the code/application layer of the Internet. A blog called “The API Rating Agency,” which appears to be written by Mehdi Medjaoui, posted an essay last week endorsing Zittrain’s proposal and adding some meat to the bones of it. (My thanks to CNet’s Declan McCullagh for bringing it to my attention).
Medjaoui is particularly worried about some of Twitter’s recent moves to crack down on third-party API uses. Twitter is trying to figure out how to monetize its platform and, in a digital environment where advertising seems to be the only business model that works, the company has decided to establish more restrictive guidelines for API use. In essence, Twitter believes it can no longer be a perfectly open platform if it hopes to find a way to make money. The company apparently believes that some restrictions will need to be placed on third-party uses of its API if it is to attract and monetize enough eyeballs.
While no one is sure whether that strategy will work, Medjaoui doesn’t even want the experiment to go forward. Building on Zittrain, he proposes the following approach to API neutrality:
- Absolute non-discrimination in third-party data access: all content, data, and views are distributed equally across the third-party ecosystem. Even a competitor could use an API under the same conditions as everyone else, with unrestricted re-use of the data.
- Limited discrimination without tiering: if you don’t pay specific fees for quality of service, you cannot get a better quality of service (rate limits, quotas, SLAs) than anyone else in the API ecosystem. If you pay for a high level of quality of service, you benefit from it, but on the same terms as any other customer paying the same fee.
- First come, first served: API calls from paying third-party applications are not queued ahead of calls from free third parties that are within their rate limits.
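To make these rules concrete, here is a minimal sketch, in Python, of what an API gateway enforcing them might look like. The names and numbers are my own illustration, not Twitter’s API or Medjaoui’s code: every client faces the identical rate limit, and requests are dispatched strictly in arrival order.

```python
import time
from collections import deque

RATE_LIMIT = 100       # identical quota for every client (illustrative number)
WINDOW_SECONDS = 3600  # rolling one-hour window

calls = {}       # client_id -> deque of timestamps of recent calls
queue = deque()  # one FIFO queue: no priority lane for paying clients

def allow(client_id, now):
    """Apply the same rate limit to everyone, competitor or not."""
    window = calls.setdefault(client_id, deque())
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()          # forget calls older than the window
    if len(window) >= RATE_LIMIT:
        return False              # over quota, same threshold for all
    window.append(now)
    return True

def serve_next():
    """Handle requests strictly first come, first served."""
    client_id, request = queue.popleft()
    if allow(client_id, time.time()):
        return "200 OK: served %s for %s" % (request, client_id)
    return "429 Too Many Requests: %s" % client_id

# A free app and a paying app are treated identically; arrival order rules.
queue.append(("free_app", "GET /timeline"))
queue.append(("paying_app", "GET /timeline"))
print(serve_next())  # free_app is served first simply because it arrived first
print(serve_next())
```

Under the second rule a paid quality-of-service tier could exist, so long as every customer paying the same fee saw the same limits; this sketch simply collapses everything into a single tier.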
Before I critique this, let’s go back and recall why Zittrain suggested we might need API neutrality for certain online services or digital platforms.
The Federal Communications Commission (FCC) recently announced a new program to measure mobile broadband performance in the United States. The FCC believes it is “difficult” for consumers to get detailed information about their mobile broadband performance, and that “transparency on broadband speeds drives improvement in broadband speeds.” The FCC does not, however, limit transparency to broadband speeds. Consumers should be aware that “government transparency” also applies to the data consumers voluntarily provide to the FCC when they participate in a government-run broadband measurement program. Information collected by the FCC about individual consumers may be “routinely disclosed” to other federal, state, or local agencies that are investigating or prosecuting a civil or criminal violation. Some personal information, including individual IP address, mobile handset location data, and unique handset identification numbers, may be released to the public.

The most egregious aspect of these broadband measurement programs, however, is that the FCC kept the public in the dark for more than a year by failing to disclose that its mobile testing apps were collecting user locations (by latitude and longitude) and unique handset identification numbers that the FCC’s contractors can make available to the public.
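Purely as an illustration of what’s at stake, and not the FCC’s actual schema, here is a sketch of the kind of record such a measurement app might produce, limited to the fields named above:

```python
from dataclasses import dataclass

@dataclass
class MeasurementRecord:
    # Hypothetical field names; the FCC's real data format is not public here.
    ip_address: str       # individual IP address of the tested connection
    latitude: float       # handset location when the test ran
    longitude: float
    handset_id: str       # unique handset identification number
    download_mbps: float  # the performance data the program is meant to gather
    upload_mbps: float

# The privacy concern: a single record ties measured speeds to a specific
# person's address, location, and device.
record = MeasurementRecord("198.51.100.7", 38.8977, -77.0365,
                           "990000862471854", 12.4, 3.1)
print(record)
```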
This blog post describes the FCC’s broadband measurement programs and highlights the personal data that may be disclosed about those who participate in them.
There are a lot of inaccurate claims – and bad economics – swirling around the Universal Music Group (UMG)/EMI merger, currently under review by the US Federal Trade Commission and the European Commission (and approved by regulators in several other jurisdictions including, most recently, Australia). Regulators and industry watchers should be skeptical of analyses that rely on outmoded antitrust thinking and are out of touch with the real dynamics of the music industry.
The primary claim of critics such as the American Antitrust Institute and Public Knowledge is that this merger would result in an over-concentrated music market and create a “super-major” that could constrain output, raise prices and thwart online distribution channels, thus harming consumers. But this claim, based on a stylized, theoretical economic model, is far too simplistic and ignores the market’s commercial realities, the labels’ self-interest and the merger’s manifest benefits to artists and consumers.
For market concentration to raise serious antitrust issues, products have to be substitutes. This is in fact what critics argue: that if UMG raised prices now it would be undercut by EMI and lose sales, but that if the merger goes through, EMI will no longer constrain UMG’s pricing power. However, the vast majority of EMI’s music is not a substitute for UMG’s. In the real world, there simply isn’t much price competition across music labels or among the artists and songs they distribute. Their catalogs are not interchangeable, and there is so much heterogeneity among consumers and artists (“product differentiation,” in antitrust lingo) that relative prices are a trivial factor in consumption decisions: No one decides to buy more Lady Gaga albums because the Grateful Dead’s are too expensive. The two are not substitutes, and assessing competitive effects as if they were, simply because they are both “popular music,” is not instructive.
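In antitrust terms, the substitutes question turns on cross-price elasticity: how much demand for one label’s catalog rises when another label raises prices. A toy calculation, with numbers invented purely for illustration, shows the point:

```python
def cross_price_elasticity(pct_change_in_qty_b, pct_change_in_price_a):
    """Percent change in sales of product B per percent change in price of A."""
    return pct_change_in_qty_b / pct_change_in_price_a

# Illustrative, made-up figures: if UMG raised prices 10% and EMI's unit
# sales barely moved (+0.2%), the elasticity is ~0.02, i.e., the catalogs
# are not meaningful substitutes.
print(cross_price_elasticity(0.2, 10.0))   # 0.02: weak substitutes
# Strong substitution, the critics' implicit assumption, would look like:
print(cross_price_elasticity(8.0, 10.0))   # 0.8
```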
Imagine a service that livestreams major broadcast television channels over the Internet for $4.99 a month — no cable or satellite subscription required. For an extra 99 cents a month, the service offers DVR functionality, making it possible to record, rewind, and pause live broadcast television on any broadband-equipped PC.
If this service sounds too good to be true, that’s because it is. But for a time, it was the business model of ivi. Cheaper than a cable/satellite/fiber subscription and more reliable than an over-the-air antenna, ivi earned positive reviews when it launched in September 2010.
Soon thereafter, however, a group of broadcast networks, affiliates, and content owners sued ivi in federal court for copyright infringement. The court agreed with the broadcasters and ordered ivi to cease operations pending the resolution of the lawsuit.
ivi appealed this ruling to the 2nd Circuit, which affirmed the trial court’s preliminary injunction earlier this month in an opinion (PDF) by Judge Denny Chin. The appeals court held as follows:
- The rights holders would likely prevail on their claim that ivi infringed on their performance rights, as ivi publicly performed their copyrighted programs without permission;
- ivi is not a “cable system” eligible for the Copyright Act’s compulsory license for broadcast retransmissions, as ivi distributes video over the Internet rather than over its own facilities;
- Allowing ivi to continue operating would likely cause irreparable harm to the rights holders, as ivi’s unauthorized distribution of copyrighted programs diminishes the works’ market value, and ivi would likely be unable to pay damages if it loses the lawsuit;
- ivi cannot be “legally harmed by the fact that it cannot continue streaming plaintiffs’ programming,” thus tipping the balance of hardships in plaintiffs’ favor;
- While the broad distribution of creative works advances the public interest, the works streamed by ivi are already widely accessible to the public.
As much as I enjoy a good statutory construction dispute, to me, the most interesting question here is whether ivi caused “irreparable harm” to rights holders.
Writing on Techdirt, Mike Masnick is skeptical of the 2nd Circuit’s holding, criticizing its “purely faith-based claims … that a service like ivi creates irreparable harm to the TV networks.” He argues that even though ivi “disrupt[s] the ‘traditional’ way that [the broadcast television] industry’s business model works … that doesn’t necessarily mean that it’s automatically diminishing the value of the original.” Citing the VCR and DVR, two technologies that disrupted traditional methods of monetizing content, Mike concludes that “[t]here’s no reason to think” ivi wouldn’t “help [content owners’] business by increasing the value of shows by making them more easily watchable by people.”
“Oh, where is the stupid internet in Rwanda?????”

That was the response of a friend currently in Rwanda who had issued a Facebook plea for someone to upload the weird “Innocence of Muslims” video to Dropbox. In typical snark, I had asked, “What do you connect to Dropbox with? Tin-can on string?”
She actually has Internet access, but she finds YouTube so much less reliable than other platforms that she asks friends to upload YouTube videos elsewhere.
I anecdotally find YouTube videos to be clunky downloads compared to others. Quite naturally, I watch fewer videos on YouTube and more on other platforms. I don’t know, but I’d guess, that Google has made some decision to economize on video downloads—a high percentage of people probably watch only the first third of any video, so why send them the whole thing right away?—and that its imperfect implementation has me watching the spinning “pause” wheel (or playing “snake”) routinely when I think a YouTube offering would be interesting.
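For what it’s worth, HTTP already supports exactly this kind of economizing via range requests. Here is a minimal sketch, assuming a hypothetical video URL; it’s my own guess at the general mechanism, not a description of YouTube’s actual delivery pipeline:

```python
import urllib.request

VIDEO_URL = "https://example.com/video.mp4"  # hypothetical URL for illustration

# Ask the server for only the first ~1 MB; fetch more only if playback continues.
req = urllib.request.Request(VIDEO_URL, headers={"Range": "bytes=0-1048575"})
with urllib.request.urlopen(req) as resp:
    first_chunk = resp.read()
    # Status 206 Partial Content means the server honored the range request.
    print(resp.status, len(first_chunk))
```

If most viewers abandon a video early, the server never sends the remaining bytes; the trade-off is rebuffering (the spinning wheel) when playback catches up to what was fetched.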
Would the Google of five years ago have allowed that? It’s well known that Google recognizes speed as an important element of quality service on the Internet.
And this is why antitrust action against Google is unwarranted. When companies get big, they lose their edge, as I’m guessing Google is losing its edge in video service. This opens the door to competitors as part of natural economic processes.
Just the other week, I signed up with Media.net and I’ll soon be running tests on whether it gets better results for me on WashingtonWatch.com than Google AdSense. So far so good. A human customer service representative navigated me through the (simple) process of opening an account and getting their ad code.
These are anecdotes suggesting Google’s competitive vulnerability. But you can get a more systematic airing of views at TechFreedom’s event on September 28th: “Should the FTC Sue Google Over Search?”
Ryan Radia, associate director of technology studies at the Competitive Enterprise Institute, discusses the amicus brief he helped author in the case of Verizon v. Federal Communications Commission, now before the D.C. Circuit Court of Appeals. Radia analyzes the case, which will determine the fate of the FCC’s net neutrality rule. While Verizon is arguing that the FCC does not have the authority to issue such rules, Radia says that the constitutional implications of the net neutrality rule are more important. He explains that the amicus brief outlines both First and Fifth Amendment arguments against the rule, stating that net neutrality impinges on the speech of Internet service providers and constitutes an illegal taking of their private property.