On Forbes today, I have a long article on the progress being made to build gigabit Internet testbeds in the U.S., particularly by Gig.U.
Gig.U is a consortium of research universities and their surrounding communities, created a year ago by Blair Levin, an Aspen Institute Fellow and, until recently, the principal architect of the FCC’s National Broadband Plan. Its goal is to work with private companies to build ultra high-speed broadband networks with sustainable business models.
Gig.U, Google Fiber’s Kansas City project, and the White House’s recently announced US Ignite project spring from similar origins and share similar goals. The general belief is that by building ultra high-speed broadband in selected communities, consumers, developers, network operators, and investors will get a clear sense of the true value of Internet speeds that are 100 times as fast as those available today through high-speed cable-based networks, and then go build a lot more of them.
Google Fiber, for example, announced last week that it would be offering fully symmetrical 1 Gbps connections in Kansas City, perhaps as soon as next year. (By comparison, my home broadband service from Xfinity is 10 Mbps download and considerably slower going up.)
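The scale of that gap is easy to quantify. Here is a back-of-the-envelope sketch comparing transfer times at the two speeds; the 5 GB file size and the decimal GB-to-bit conversion are illustrative assumptions on my part, not figures from the article, and protocol overhead is ignored:

```python
# Illustrative only: how long a 5 GB download takes at the 10 Mbps
# cable speed cited above vs. a 1 Gbps fiber connection.
def download_seconds(size_gigabytes: float, speed_mbps: float) -> float:
    """Return transfer time in seconds, ignoring protocol overhead."""
    size_megabits = size_gigabytes * 8 * 1000  # GB -> megabits (decimal units)
    return size_megabits / speed_mbps

cable = download_seconds(5, 10)     # 10 Mbps cable downlink
fiber = download_seconds(5, 1000)   # 1 Gbps fiber connection

print(f"10 Mbps: {cable / 60:.0f} minutes")  # ~67 minutes
print(f"1 Gbps:  {fiber:.0f} seconds")       # 40 seconds
```

The 100x speed ratio translates directly into a 100x difference in transfer time: roughly an hour versus well under a minute for the same file.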
US Ignite is encouraging public-private partnerships to build demonstration applications that could take advantage of next generation networks and near-universal adoption. It is also looking at the most obvious regulatory impediments at the federal level that make fiber deployments unnecessarily complicated, painfully slow, and unduly expensive.
I think these projects are encouraging signs of native entrepreneurship focused on solving a worrisome problem: the U.S. is nearing a dangerous stalemate in its communications infrastructure. We have the technology and scale necessary to replace much of our legacy wireline phone networks with native IP broadband. Right now, ultra high-speed broadband is technically possible by running fiber to the home. Indeed, Verizon’s FiOS network currently delivers 300 Mbps broadband and is available to some 15 million homes.
Continue reading →
On CNET today, I’ve posted a long critique of the recent report by the President’s Council of Advisors on Science and Technology (PCAST) urging the White House to reverse course on a two-year old order to free up more spectrum for mobile users.
In 2010, soon after the FCC’s National Broadband Plan raised alarms about the need for more spectrum for an explosion in mobile broadband use, President Obama issued a Memorandum ordering federal agencies to free up as much as 500 MHz. of radio frequencies currently assigned to them.
After a great deal of dawdling, the National Telecommunications and Information Administration, which oversees spectrum assignments within the federal government, issued a report earlier this year that seemed to offer progress: 95 MHz. of very attractive spectrum could in fact be cleared within the ten years called for by the White House.
But reading between the lines, it was clear that the 20 agencies involved in the plan had no serious intention of cooperating. Their cost estimates for relocation (which were simply reported by NTIA without any indication of how they’d been arrived at or even whether NTIA had been given any details) appeared to be based on an amount that would make any move economically impossible. Continue reading →
During the 1970s, I remember a bumper sticker that summed up the prevailing anti-colonial attitude that had developed during the late 1960s: “U.S. Out of North America.”
That sentiment reflects nicely my activities this week, which include three articles decrying efforts by regulators to oversee key aspects of the Internet economy. Of course their intentions—at least publicly—are always good. But even with the right idea, the unintended negative consequences always overwhelm the benefits by a wide margin.
Governments are just too slow to respond to the pace of change of innovations in information technology. Nothing will fix that. So better just to leave well enough alone and intercede only when genuine consumer harm is occurring, and provable.
The articles cover the spectrum of regulators, from state (California) to federal (FCC) to international (ITU), and a wide range of truly bad ideas: the desire of California’s Public Utilities Commission to “protect” consumers of VoIP services, the FCC’s latest effort to elbow its way into regulating broadband Internet access at the middle mile, and a proposal from European telcos to have the U.N. implement a tariff system on Internet traffic originating from the U.S.
Continue reading →
(Adapted from Bloomberg BNA Daily Report for Executives, May 16th, 2012.)
Two years ago, the Federal Communications Commission’s National Broadband Plan raised alarms about the future of mobile broadband. Given unprecedented increases in consumer demand for new devices and new services, the agency said, network operators would need far more radio frequency assigned to them, and soon. Without additional spectrum, the report noted ominously, mobile networks could grind to a halt, hitting a wall as soon as 2015.
That’s one reason President Obama used last year’s State of the Union address to renew calls for the FCC and the National Telecommunications and Information Administration (NTIA) to take bold action, and to do so quickly. The White House, after all, had set an ambitious goal of making mobile broadband available to 98 percent of all Americans by 2016. To support that objective, the president told the agencies to identify quickly an additional 500 MHz of spectrum for mobile networks.
By auctioning that spectrum to network operators, the president noted, the deficit could be reduced by nearly $10 billion. That way, not only would the Internet economy be accelerated, but taxpayers would actually save money in the process.
A good plan. So how is it working out?
Unfortunately, the short answer is: Not well. Speaking this week at the annual meeting of the mobile trade group CTIA, FCC Chairman Julius Genachowski had to acknowledge the sad truth: “the overall amount of spectrum available has not changed, except for steps we’re taking to add new spectrum on the market.” Continue reading →
Frederick Jackson Turner (1861-1932)
On Fierce Mobile IT, I’ve posted a detailed analysis of the NTIA’s recent report on government spectrum holdings in the 1755-1850 MHz. range and the possibility of freeing up some or all of it for mobile broadband users.
The report follows from a 2010 White House directive issued shortly after the FCC’s National Broadband Plan was published, in which the FCC raised the alarm about an imminent “spectrum crunch” for mobile users.
By the FCC’s estimates, mobile broadband will need an additional 300 MHz. of spectrum by 2015 and 500 MHz. by 2020, in order to satisfy increases in demand that have only amped up since the report was issued. So far, only a small amount of additional spectrum has been allocated. Increasingly, the FCC appears rudderless in efforts to supply the rest, and to do so in time. Continue reading →
On CNET today, I have a longish post on the FCC’s continued machinations over LightSquared’s and Dish Network’s respective efforts to use existing satellite spectrum to build terrestrial mobile broadband networks. Both companies plan to build 4G LTE networks; LightSquared has already spent $4 billion building out its network, which it plans to offer wholesale.
After first granting and then, a year later, revoking LightSquared’s waiver to repurpose its satellite spectrum, the agency has taken a more conservative (albeit slower) course with Dish. Yesterday, the agency initiated a Notice of Proposed Rulemaking that would, if adopted, assign flexible use rights to about 40 MHz. of MSS spectrum licensed to Dish.
Current allocations of spectrum have little to do with the technical characteristics of different bands. That existing licenses limit Dish and LightSquared to satellite applications, for example, is simply an artifact of more-or-less random carve-outs in the absurdly complicated spectrum map managed by the agency since 1934. Advances in technology make it possible to use many different bands successfully for many different purposes.
But the FCC’s legacy command-and-control model, which allocates spectrum to favor “new” services (new, that is, until they are made obsolete years or decades later) and shapes competition to the agency’s changing whims, has left a confusing and unnecessary pile-up of limitations and conditions that severely and artificially restrict how spectrum can be redeployed as technology and consumer demands change. Today, the FCC sits squarely in the middle of each of over 50,000 licenses, a huge bottleneck that is making the imminent spectrum crisis in mobile broadband even worse. Continue reading →
- Ceci c’est un meme.
On Forbes today, I look at the phenomenon of memes in the legal and economic context, using my now notorious “Best Buy” post as an example. Along the way, I talk antitrust, copyright, trademark, network effects, Robert Metcalfe and Ronald Coase.
It’s now been a month and a half since I wrote that electronics retailer Best Buy was going out of business…gradually. The post, a preview of an article and future book that I’ve been researching on-and-off for the last year, continues to have a life of its own.
Commentary about the post has appeared in online and offline publications, including The Financial Times, The Wall Street Journal, The New York Times, TechCrunch, Slashdot, MetaFilter, Reddit, The Huffington Post, The Motley Fool, and CNN. Some of these articles generated hundreds of user comments, in addition to those that appeared here at Forbes.
Continue reading →
On Forbes yesterday, I posted a detailed analysis of the successful (so far) fight to block quick passage of the Protect IP Act (PIPA) and the Stop Online Piracy Act (SOPA). (See “Who Really Stopped SOPA, and Why?”) I’m delighted that the article, despite its length, has gotten such a positive response.
As regular readers know, I’ve been following these bills closely from the beginning, and made several trips to Capitol Hill to urge lawmakers to think more carefully about some of the more half-baked provisions.
But beyond traditional advocacy–of which there was a great deal–something remarkable happened in the last several months. A new, self-organizing protest movement emerged on the Internet, using social news and social networking tools including Reddit, Tumblr, Facebook and Twitter to stage virtual teach-ins, sit-ins, boycotts, and other protests. Continue reading →
After three years of politicking, it now looks like Congress may actually give the FCC authority to conduct incentive auctions for mobile spectrum, and soon. That, at least, is what the FCC seems to think.
At CES last week, FCC Chairman Julius Genachowski largely repeated the speech he has now given three years in a row. But there was a subtle twist this time, one echoed by comments from Wireless Bureau Chief Rick Kaplan at a separate panel.
Instead of simply warning of a spectrum crunch and touting the benefits of the incentive auction idea, the Chairman took aim at a House Republican bill that would authorize the auctions but limit the agency’s “flexibility” in designing and conducting them. “My message on incentive auctions today is simple,” he said, “we need to get it done now, and we need to get it done right.” Continue reading →
I’ve written several articles in the last few weeks critical of the dangerously unprincipled turn at the Federal Communications Commission toward a quixotic, political agenda. But as I reflect more broadly on the agency’s behavior over the last few years, I find something deeper and even more disturbing is at work. The agency’s unreconstructed view of communications, embedded deep in the Communications Act and codified in every one of hundreds of color changes on the spectrum map, has become dangerously anachronistic.
The FCC is required by law to see separate communications technologies delivering specific kinds of content over incompatible channels requiring distinct bands of protected spectrum. But that world ceased to exist, and it’s not coming back. It is as if regulators from the Victorian Age were deciding the future of communications in the 21st century. The FCC is moving from rogue to steampunk.
With the unprecedented release of the staff’s draft report on the AT&T/T-Mobile merger, a turning point seems to have been reached. I wrote on CNET (see “FCC: Ready for Reform Yet?”) that the clumsy decision to release the draft report without the Commissioners having reviewed or voted on it, for a deal that had been withdrawn, was at the very least ill-timed, coming in the midst of Congressional debate on reforming the agency. Pending bills in the House and Senate, for example, are especially critical of how the agency has recently handled its reports, records, and merger reviews. And each new draft of a spectrum auction bill expresses increased concern about giving the agency “flexibility” to define conditions and terms for the auctions.
The release of the draft report, which edges the independent agency that much closer to doing the unconstitutional bidding not of Congress but the White House, won’t help the agency convince anyone that it can be trusted with any new powers. Let alone the novel authority to hold voluntary incentive auctions to free up underutilized broadcast spectrum.
Continue reading →