After working my way through the Executive Summary of the Federal Communications Commission’s (FCC) National Broadband Plan, there are a number of things I find troubling that I will get to in a subsequent post. But here’s the thing about “The Plan” that I found most surprising — even audacious — in its arrogance: The FCC wants us to believe the whole scheme is costless. The agency bases this astonishing claim on the following assumptions:
Given the plan’s goal of freeing 500 megahertz of spectrum, future wireless auctions mean the overall plan will be revenue neutral, if not revenue positive. The vast majority of recommendations do not require new government funding; rather, they seek to drive improvements in government efficiency, streamline processes and encourage private activity to promote consumer welfare and national priorities. The funding requests relate to public safety, deployment to unserved areas and adoption efforts. If the spectrum auction recommendations are implemented, the plan is likely to offset the potential costs.
Let me translate: “Pay no attention to all the bills we are racking up, because spectrum revenues shall set us free!”
Perhaps that logic works in the reality-free zone we call the Beltway, but back in the real world this simply doesn’t add up. Regardless of how well-intentioned any of these goals and proposals may be, it should be equally clear that there is no free lunch, even with spectrum auction proceeds fueling the high-tech gravy train. The proposals and programs the FCC sets forth will impose serious economic costs that shouldn’t be so casually dismissed, especially on the weak reasoning that “improvements in government efficiency” will magically manifest themselves thanks to massive new government intervention in the field. (If you think you’ve heard this one before, you have. See: the current health care debate.)
Moreover, if everything really does hang on the promise of spectrum auction revenues covering the broadband spending binge, well, bad news: The agency is never going to bring in enough to cover what it has proposed here. The reason is simple: Most of the spectrum it wants to grab is currently occupied by someone else!
If you haven’t been paying attention to the Comcast-NBC Universal merger, here’s a reason to: A good fight has broken out!
It starts with Mark Cooper, Director of Research at the Consumer Federation of America, who testified against the merger to the House Commerce Committee’s Subcommittee on Communications, Technology, and the Internet on behalf of CFA, Free Press, and Consumers Union.
“The merger has so many anti-competitive, anti-consumer, and anti-social effects that it cannot be fixed,” says Cooper.
Cato Adjunct Scholar Richard Epstein lays into Cooper’s testimony with aplomb: “Dr. Cooper has achieved a rare feat. The evidence that he presents against this proposed merger suffices to explain emphatically why it ought to be approved.”
And in a second commentary, Epstein ladles out another helping of humble pie to Cooper, concluding:
The cumbersome Soviet-style review process that Mr. Cooper advocates does no good for the consumers who he purports to represent. It only shows how far out of touch he is with the basics of antitrust theory as they relate to the particulars of the telecommunication market.
Maybe Cooper will have a rejoinder. But until then, I’ll just note that the best fights are the ones that your guy wins.
Details are starting to trickle out about the Federal Communications Commission’s (FCC) National Broadband Plan, which is due out tomorrow. Someone just posted the Executive Summary here. I haven’t had a chance to go through it all yet, but I’m looking forward to learning more about what the agency’s plans are on this front.
On Friday (again, before seeing any details), I offered some fairly mushy comments about the idea of a national “plan” to the gang over at the excellent new site, FiveQsOnTech.com. The site has a great format: five questions on technology and policy, asked and answered (usually on tape) by technology policy wonks. I’m honored to be among the first couple of experts featured on the site, along with Markham Erickson of the Open Internet Coalition and Rob Atkinson of ITIF.
In the first three minutes of the second of the two videos I appear in, I offered some thoughts about “The Plan”:
http://www.youtube.com/v/sBRL2RfdMk4
I somehow missed this excellent ITIF paper by Robert D. Atkinson and George Ou when it came out around this time last year, but George has just dusted it off, made a couple of updates, and re-posted it over at the Digital Society blog. Worth reading. It touches on a lot of the same case studies I have been documenting in my ongoing series, “Problems in Public Utility Paradise.” In particular, it focuses on the UTOPIA and iProvo fiascos out in Utah. Here’s a key takeaway from those case studies:
The lesson learned in Utah is that projected uptake models and deployment plans don’t always come to fruition, and when that happens the consequence is failure. For UTOPIA, the project was projected to reach 35% uptake rates by February 2008, but the reality was less than 17% uptake. UTOPIA had also hoped for 17% uptake from lucrative business customers, but the reality was only 2 to 3 percent. Provo’s iProvo was hoping for 10,000 subscribers by July 2006 with the assumption that 75% of those customers would subscribe to lucrative triple play services, but the reality was 10,000 customers in late 2007 with only 17% of those customers subscribing to triple play. Many consumers were quite happy to subscribe to existing broadband cable or telecom providers. The consistent theme in Utah was an overestimation of the uptake rates and an underestimation of competition from incumbent cable operator Comcast and telecom operator Qwest, which led to consistent underperformance.
Ouch. For more details, see this old essay of mine about UTOPIA from 2008, and this piece from last September about iProvo. Not a pretty picture. As I say every time I pen a piece about the latest muni failure du jour, these case studies should serve as a cautionary tale about the dangers of grandiose, centrally planned broadband schemes. There is no such thing as a free lunch. Network-building is hard, and politicians usually aren’t that good at it.
I published an opinion piece today for CNET arguing against recent calls to reclassify broadband Internet as a “telecommunications service” under Title II of the Communications Act.
The push to do so comes as supporters of the FCC’s proposed Net Neutrality rules fear that the agency’s authority to adopt them under its so-called “ancillary jurisdiction” won’t fly in the courts. In January, the U.S. Court of Appeals for the D.C. Circuit heard arguments in Comcast’s appeal of sanctions levied against the cable company for violations of the neutrality principles (not yet adopted under a formal rulemaking). During oral arguments, the three-judge panel expressed considerable doubt about the FCC’s jurisdiction to issue the sanctions. Only the published opinion (forthcoming) will matter, of course, but anxiety is growing.
Solving the Net Neutrality jurisdiction problem with a return to Title II regulation is a staggeringly bad idea, and a counter-productive one at that. My article describes the parallel developments in “telecommunications services” and the largely unregulated “information services” (aka Title I) since the Telecommunications Act of 1996, making the point that life for consumers has been far more exciting, and has generated far more wealth, under the latter than the former.
Under Title I, in short, we’ve had the Internet revolution. Under Title II, we’ve had the decline and fall of basic wireline phone service, boom and bust in the arbitraging competitive local exchange market, massive fraud in the bloated e-Rate program, and the continued corruption of local licensing authorities holding applications hostage for legal and illegal bribes.
Should ISPs be barred under net neutrality from discriminating against illegal content? Not according to the FCC’s draft net neutrality rule, which defines efforts by ISPs to curb the “transfer of unlawful content” as reasonable network management. This exemption is meant to ensure providers have the freedom to filter or block unlawful content like malicious traffic, obscene files, and copyright-infringing data.
EFF and Public Knowledge (PK), both strong advocates of net neutrality, are not happy about the copyright infringement exemption. The groups have urged the FCC to reconsider what they describe as the “copyright loophole,” arguing that copyright filters amount to “poorly designed fishing nets.”
EFF’s and PK’s concerns about copyright filtering aren’t unreasonable. While filtering technology has come a long way over the last few years, it remains a fairly crude instrument for curbing piracy and suffers from false positives. That’s because it’s remarkably difficult to accurately distinguish between unauthorized copyrighted works and similar non-infringing files. And because filters generally flag unauthorized copies automatically, without human intervention, even when they correctly identify a copy they often disrupt legal, non-infringing uses of copyrighted material, such as fair use.
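To see why false positives are endemic, consider a deliberately simplified sketch of chunk-based matching. This is a toy, not how Audible Magic, YouTube’s Content ID, or any ISP’s actual filter works, and the names in it are hypothetical, but it illustrates the structural problem: the matcher can tell that content matches a reference work, yet it knows nothing about whether the use is licensed, criticism, or parody.

```python
# Toy content filter: flags uploads whose chunks match a reference work.
# Illustrative only -- real filters use perceptual fingerprints, not hashes.
import hashlib

CHUNK = 1024

def fingerprint(chunk: bytes) -> str:
    # Stand-in for a real perceptual audio/video fingerprint.
    return hashlib.sha256(chunk).hexdigest()[:16]

def index_work(data: bytes) -> set:
    # Fingerprint a copyrighted work chunk by chunk.
    return {fingerprint(data[i:i + CHUNK]) for i in range(0, len(data), CHUNK)}

def looks_infringing(upload: bytes, index: set, threshold: int = 1) -> bool:
    # Flag the upload if enough of its chunks match the reference index.
    hits = sum(1 for i in range(0, len(upload), CHUNK)
               if fingerprint(upload[i:i + CHUNK]) in index)
    return hits >= threshold

song = bytes(range(256)) * 40        # pretend copyrighted recording
index = index_work(song)

pirated_copy = song                  # wholesale copy: flagged, correctly
fair_use_clip = song[:2048]          # short excerpt for a review: also flagged

print(looks_infringing(pirated_copy, index))   # True
print(looks_infringing(fair_use_clip, index))  # True -- the filter cannot
                                               # see the legal context
```

The filter is doing exactly what it was built to do in both cases; the problem is that infringement turns on context the network layer never sees.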
Despite copyright filtering technology’s imperfections, however, outlawing it is the wrong approach. At its core, ISP copyright filtering represents a purely private, voluntary method of dealing with the great intellectual property challenge. This is exactly the sort of approach advocates of limited government should embrace. As Adam and Wayne argued back in 2001:
To lessen the reliance on traditional copyright protections, policymakers should ensure that government regulations don’t stand in the way of private efforts to protect intellectual property.
The way Ben Kunz puts it in a new Business Week article, “Each device contains its own widening universe of services and applications, many delivered via the Internet. They are designed to keep you wedded to a particular company’s ecosystem and set of products.”
I like Ben’s article a lot because it recognizes that “walling off” and a “widening universe” are not mutually exclusive. If only policymakers and regulators acknowledged that. They must know it, but admitting it means acknowledging their limited relevance to consumer well-being and a need to step aside. So they feign ignorance.
Many claim to worry about the rise of proprietary services (I, as you can probably tell, often doubt their sincerity), but I’ve always regarded a “Splinternet” as a good thing that means more, not less, communications wealth. I first wrote about this in Forbes in 2000, when everyone was fighting over spam, privacy, content regulation, porn and marketing to kids.
Increasing wealth means a copy-and-paste world for content across networks, and it means businesses will benefit from presence across many of tomorrow’s networks, generating more value for future generations of consumers and investors. We won’t likely talk of an “Internet” with a capital-“I” and a reverent tremble the way we do now, because what matters is not the Internet as it happens to look right now, but underlying Internet technology that can just as easily erupt everywhere else, too.
Meanwhile, new application, device and content competition within and across networks disciplines the market process and “regulates” things far better than the FCC can. Yet the FCC’s very function is to administer or artificially direct proprietary business models, which it must continue attempting to do (as it pleads for assistance in doing in the net neutrality rulemaking) if it is going to remain relevant. I described the urgency of stopping the agency’s campaign recently in “Splinternets and cyberspaces vs. net neutrality,” and also in the January 2010 comments to the FCC on net neutrality.
by Adam Thierer & Berin Szoka
We’re hoping that the Government Accountability Office (GAO) has made some sort of mistake, because it’s hard to believe its latest findings about the paperwork burden generated by Federal Communications Commission (FCC) regulatory activity. In late January, the GAO released a report on “Information Collection and Management at the Federal Communications Commission” (GAO-10-249), which examined information collection, management, and reporting practices at the FCC. The GAO noted that the FCC gathers information through 413 collection instruments, which include things like: (1) required company filings, such as the ownership of television stations; (2) applications for FCC licenses; (3) consumer complaints; (4) company financial and accounting performance; and (5) a variety of other issues, such as an annual survey of cable operators. (Note: This does not include filings and responses done pursuant to other FCC NOIs or NPRMs.)
Regardless, the FCC told the GAO that it receives nearly 385 million responses with an estimated 57 million burden hours associated with the 413 collection instruments. A “burden hour” is defined under the Paperwork Reduction Act as “the time, effort, or financial resources expended by persons to generate, maintain, or provide information to a federal agency.” And the FCC is generating 57 million of ’em! Even though we are frequently critical of the agency, these numbers are still hard to fathom. Perhaps the GAO has made some sort of mistake here. But here’s what really concerns us if they haven’t made a mistake.
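To put those figures in perspective, here is a quick back-of-envelope calculation (our arithmetic, not the GAO’s; the 2,000-hour work year is just an assumption):

```python
# Back-of-envelope math on the figures reported to the GAO (not the GAO's
# own calculation): average burden per response and total person-years.
responses = 385_000_000      # responses across the 413 collection instruments
burden_hours = 57_000_000    # estimated burden hours

minutes_per_response = burden_hours / responses * 60
person_years = burden_hours / 2_000  # assumes a 2,000-hour work year

print(f"~{minutes_per_response:.0f} minutes of paperwork per response")  # ~9
print(f"~{person_years:,.0f} person-years of burden")                    # ~28,500
```

In other words, if these figures hold up, FCC paperwork consumes something on the order of 28,500 full-time work years.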
Progress Snapshot 6.6, The Progress & Freedom Foundation (PDF)
Mobile broadband speeds (at the “core” of wireless networks) are about to skyrocket—and revolutionize what we can do online on the go (at the “edge”). Consider four recent stories:
- Networks: MobileCrunch notes that Verizon will begin offering 4G mobile broadband service (using Long Term Evolution, or LTE) “in up to 60 markets by mid-2012”—at an estimated 5-12 Mbps down and 2-5 Mbps up, LTE would be faster than most wired broadband service.
- Devices: Sprint plans to launch its first 4G phone (using WiMax, a competing standard to LTE) this summer.
- Applications: Google has finally released Google Earth for the Nexus One smartphone on T-Mobile, the first phone to run Google’s Android 2.1 operating system.
- Content: In November, Google announced that YouTube would begin offering high-definition 1080p video, including on mobile devices.
While the Nexus One may be the first Android phone with a processor powerful enough to crunch the visual awesomeness that is Google Earth, such applications will still chug along on even the best of today’s 3G wireless networks. But combine the ongoing increases in mobile device processing power made possible by Moore’s Law with similar innovation in broadband infrastructure, and everything changes: You can run hugely data-intensive apps that require real-time streaming, from driving directions with all the rich imagery of Google Earth to mobile videoconferencing to virtual world experiences that rival today’s desktop versions to streaming 1080p high-definition video (3.7+ Mbps) to… well, if I knew, I’d be in Silicon Valley launching a next-gen mobile start-up!
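A rough sanity check on those numbers (using the estimates quoted above; real-world throughput varies widely, and the 1 Mbps videoconferencing figure is just an illustrative assumption):

```python
# Rough feasibility check: can Verizon's estimated LTE downlink carry a
# 1080p YouTube stream plus a video call at the same time?
lte_down_mbps = (5.0, 12.0)   # estimated LTE downlink range cited above
hd_1080p_mbps = 3.7           # 1080p HD video stream, per the YouTube story
video_call_mbps = 1.0         # assumed mobile videoconferencing stream

for down in lte_down_mbps:
    headroom = down - (hd_1080p_mbps + video_call_mbps)
    print(f"{down:>4} Mbps link: {headroom:+.1f} Mbps of headroom")
# 5 Mbps barely fits; 12 Mbps leaves room to spare -- until the cell gets crowded.
```

Even the low end of that LTE estimate can only just carry a 1080p stream alongside a video call, which is why the congestion point below matters so much.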
This interconnection of infrastructure, devices and applications should remind us that broadband isn’t just about “big dumb pipes”—especially in the mobile environment, where bandwidth is far more scarce (even in 4G) due to spectrum constraints. Network congestion can spoil even the best devices on the best networks. Just ask users in New York City, where AT&T has apparently just stopped selling the iPhone online in order to try to relieve its over-taxed network under the staggering bandwidth demands of Williamsburg hipsters, Latter-Day Beatniks from the Village, Chelsea boys, and Upper West Side Charlotte Yorks all streaming an infinite plethora of YouTube videos and so on.
I recently wrote an op-ed for the American Legislative Exchange Council’s Inside ALEC publication. It’s decidedly non-technical, as most correspondence with legislators must be. In my dealings with those in state government, it seems that only in the last few months have many of them become aware of the FCC’s Net Neutrality proposals — or even the issue itself. I don’t blame them. State legislators are often more concerned with local issues such as closing their budget deficits or finding funding for critical government operations.
But it’s important that they also keep an eye on what’s happening in “the other Washington” (as we Washington staters like to call it), as the policies from Congress, the Administration and federal agencies trickle down to affect each and every one of us.
The text of the op-ed is after the break.