October 2009

Surprisingly Free Conversations

The new episode of Surprisingly Free Conversations is up, featuring Michael S. Sawyer, a fellow at the Berkeley Center for Law and Technology, discussing the impact of the DMCA on user-generated content. The discussion also turns to the principle of fair use and competing solutions for dealing with copyright infringement on user-generated content sites. You can listen to the podcast on the site or subscribe in iTunes. While you’re at it, check out our last episode, featuring TLF alum Tim Lee discussing bottom-up processes, the innovator’s dilemma, the link economy, and the future of newspapers.

Not So Fast, Cloud

October 12, 2009

The cloud won’t grow quite the way Berin notes, at least not if I can help it.

As the ongoing T-Mobile Sidekick failure shows, if you release your data to “the cloud,” you give up control. In this case, giving up control means giving up your data. (Speculation about what happened is here.)

When you combine that with the privacy consequences of delivering your data to god-knows-where, and to service providers that have heaven-knows-what data-sharing agreements with governments and corporations, the cloud looks a lot more gray.

There will always be a place for remote storage and services—indeed, they will remain an important part of the mix—but I think that everyone should ultimately have their own storage and servers. (Hey, we did it with PCs! Why not?) Our thoroughly distributed computing, storage, and processing infrastructure should be backed up to—well, not the cloud—to specific, identifiable, legally liable and responsible service providers.

Ok, I didn’t say anything last month when Jerry—albeit with some caveats—cited that FCC stat about how 88 percent of zip codes have four or more broadband providers. But now I see my friend Peter Suderman relying on the same figure over at Reason. And friends don’t let friends use FCC broadband data.

First, since a zip code is considered to be “served” by a provider if it has a single subscriber in that area, this is not a terribly helpful measure of competition, which is a function of what you can get at any given household. More importantly, the definition of “broadband” here is a blazing 200 Kbps unidirectional—or about 1/20th the average broadband connection speed in the U.K., itself the slowpoke of Europe. A third of the connections they’re calling “broadband” don’t even reach that pathetic speed bidirectional. Of the 2/3 that do manage to reach that speed both ways, almost half are slower than 2.5 Mbps in the faster direction.

Mobile companies are by far the most common “broadband” providers in their sample, with “at least some presence” in 99% of their zip codes, so at least one of those four providers is almost certainly a mobile company. It’s probably a lot more than that: in only 63% of zip codes were both cable and ADSL subscribers reported—and remember, that doesn’t even tell us whether any households actually had even the choice between a cable and an ADSL provider. So we can see how easily you get to four providers under this scheme: you just have to live in a zip code with, let’s say, an incumbent cable company offering what passes for “real” broadband in the U.S., plus even spotty coverage under a 3G network (average downstream speed 901–1,940 Kbps, depending on your provider), and a couple of conventional cellular carriers with EDGE or EVDO coverage that just squeaks over the 200 Kbps bar. Congratulations, you’re a lucky beneficiary of U.S. “broadband competition.” Woo.
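To make the measurement problem concrete, here’s a minimal simulation of a single hypothetical zip code. All the coverage numbers are invented for illustration (they are not the FCC’s data): each provider reaches only a fraction of households, yet the zip-code-level metric credits the area with every provider that has even one subscriber, while the typical household sees far fewer real choices.

```python
import random

random.seed(0)

HOUSEHOLDS = 1000  # households in one hypothetical zip code

# Hypothetical coverage fractions, made up for illustration: a cable
# incumbent, a DSL carrier with spotty coverage, and two mobile
# carriers that barely clear the 200 Kbps bar.
coverage = {
    "cable": 0.90,
    "adsl": 0.40,
    "3g_mobile": 0.25,
    "edge_mobile": 0.20,
}

# Each provider serves a random subset of households.
served_by = {
    name: {h for h in range(HOUSEHOLDS) if random.random() < frac}
    for name, frac in coverage.items()
}

# FCC-style metric: a provider "serves" the zip code if it has >= 1 subscriber.
zip_level_count = sum(1 for subs in served_by.values() if subs)

# Household-level metric: how many providers can the median household choose from?
choices = [sum(h in subs for subs in served_by.values()) for h in range(HOUSEHOLDS)]
choices.sort()
median_choice = choices[HOUSEHOLDS // 2]

print(f"Zip-code-level provider count: {zip_level_count}")
print(f"Median household's actual choices: {median_choice}")
```

Under these made-up numbers the zip code counts all four providers, while the median household can actually choose among roughly half that many—exactly the gap between “served zip codes” and competition at any given address.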

Look, I think the average person in this country understands that their broadband options are pretty crap, and there’s not much percentage in telling them to ignore their lying eyes and check out some dubious numbers. If the argument against net neutrality depends on the idea that we currently have robust competition in real broadband, well, the argument is in a lot of trouble. What I find much more compelling is the idea that, with 4G rollouts on the horizon, we may actually get adequate broadband competition in the near-to-medium term, and might want to be wary about rushing into regulatory solutions that not only assume the status quo, but could plausibly help entrench it.

Addendum: That Pew survey I cited in the previous post did ask individual home broadband subscribers how many providers they had to choose from. Obviously, that sample excludes people without any broadband access, but 21% (and 30% of rural users) said they only had a single provider, and only a quarter of those who had multiple providers said they had as many as four. Since average prices appeared to be lower the more competition was present, and assuming ceteris paribus you get higher adoption when prices are lower, this sounds likely to overstate the actual degree of choice Americans enjoy.

Class and Gov 2.0

October 12, 2009

Rose Afriyie from Feministing wants to know why, amid all the enthusiastic talk of “Gov 2.0” under Obama, we’re not hearing about the “digital divide,” about which there used to be so much tearing of hair and rending of garments:

I, for one, am a little concerned that in all this technology talk, particularly with respect to government agencies moving information online, not a word was mentioned about the Digital Divide. It’s not news that low-income people of color and women are devastatingly impacted by decreased access to technology. But as states and state agencies experience budget constraints, activists must keep an eye out to insure that these creative measures are sensitive to the needs of these communities.

Data consolidation is one thing, but how will “automated government services” impact consumers? More specifically, how much computer literacy will be needed to interact with these agencies? I’m not saying that agencies should stay in the Stone Age per se; But, before these agencies pull a George Jetson, they should assess the technological literacy of their communities through surveys or other methods. Also, they should use some of the savings from implementing these new high tech programs to invest in more free Wi-Fi hotspot locations and free technology education workshops–that run at night and provide childcare.

One reason might be that it’s hard to imagine the growth curve for Internet adoption being a whole lot steeper than it is. According to the most recent Pew survey, the percentage of adults in households with home broadband rose from 55% to 63% over the past year. As with adoption of all new technologies, lower-income households are behind—but that just means they’re lagging by a few years on the same rapid growth trajectory. For households with annual incomes under $20,000, home broadband rose from 25% of households to 35% in 2009. That’s pretty similar to the curve we saw with television adoption, and if the trend from here roughly tracks TV, we should expect something damn near ubiquity within about five years, which is how long I’d expect it would take to get the kinks worked out of all these online government services anyway. And obviously, that doesn’t count all the people who don’t have broadband at home but have some other access—via work, friends, family, libraries, or cafes. (Also, the government just pumped $4 billion into “stimulating” broadband growth, with another $3 billion in the offing—although that money is, I think unwisely, focused on building pipe to underserved but sparsely populated rural areas rather than improving service and increasing uptake in cities.) All of which is to say, it would be mindbogglingly shortsighted to hold off on rolling out Gov 2.0 services just because a target community might have low rates of Internet use today. Continue reading →
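A back-of-envelope extrapolation of those Pew figures shows why near-ubiquity on a several-year horizon is plausible. The model here is an assumption of mine, not Pew’s methodology: suppose the *unconnected* share of households shrinks by a constant factor each year, as it tends to in the tail of an S-curve, calibrated from the 55%-to-63% jump the post cites.

```python
import math

# Pew figures cited in the post: home broadband rose from 55% to 63% in one year.
start, one_year_later = 0.55, 0.63

# Assumed saturating model: the unconnected share shrinks by a constant
# factor per year. Calibrate that factor from the one observed year.
r = (1 - one_year_later) / (1 - start)  # fraction of the unconnected retained each year

for target in (0.80, 0.90, 0.95):
    # Solve (1 - one_year_later) * r**years = (1 - target) for years.
    years = math.log((1 - target) / (1 - one_year_later)) / math.log(r)
    print(f"{target:.0%} adoption in roughly {years:4.1f} more years")
```

Under this admittedly crude assumption, adoption passes 85% in about five years, broadly consistent with the TV-adoption comparison above; the point is the shape of the trajectory, not the precise year.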

This Microsoft-funded study projects that, by 2013, cloud computing will have added $800 billion in net new business revenues for the 52 countries surveyed (over 2009 levels). The growing economic importance of the cloud is likely to increase pressure for government involvement. As President Reagan said: “Government’s view of the economy could be summed up in a few short phrases: If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.”

IT Spending Forecast

On October 1 I attended a panel discussion on the use of technology to restrict the illegal transfer of copyright-protected content online. The panel talked about a new French law requiring ISPs to block users who had “three strikes” against them for illegal transfers, recent developments in watermarking and fingerprinting, and the future of fair use.

I blog further at Convergences and also supply sketches for your amusement. For it is important that you be amused.

FCC Chairman Julius Genachowski suggested at an FCC field hearing this week that the federal government might create its own “version of iTunes.” Multichannel News reports:

The chairman asked panelists to think about the value of a clearinghouse where best practices could be shared. He suggested that might be a way to spur the spin-off of public-sector apps from private sector initiatives and to prevent reinventing the wheel, rather than tapping into what is already being done. There is not a lot of shared info out there, he said.

If all we’re talking about is a clearinghouse that provides easy access to government-developed apps, Google Code or SourceForge may be a better model than iTunes—though perhaps without the instant name recognition among ordinary consumers. Like SourceForge, Google Code allows hosting and management of open source projects, including Google’s own products. iTunes, by contrast, essentially offers consumers finished apps. Also, iTunes is a stand-alone piece of software, of which the App Store is just one part, while I can’t imagine why Genachowski’s “store” would need to be anything more than a website.

Whatever the analogy, such a “store” could well be a valuable tool for sharing the benefits of software development by government employees, both with the private sector and among federal agencies as well as state, local and even foreign governments. But what, exactly, Genachowski had in mind for the store remains awfully vague: Multichannel News mentions, as examples, “applications that do everything from monitoring heart rates and blood sugar to checking for greenhouse gas levels.” If the idea ever goes anywhere, it should be based on two principles:

  1. All apps should be open source and available to all users to use as they see fit.
  2. The store should be limited to apps developed by government employees to meet the needs of government agencies.

Continue reading →

The smell of high-tech regulation is increasingly in the air these days, and many lawmakers and some activist groups now have the mobile marketplace in their regulatory crosshairs. Critics make a variety of claims about the wireless market supposedly lacking competition, choice, innovation, or reasonable pricing. Consequently, they want to wrap America’s wireless sector in a sea of red tape. Two important new studies thoroughly debunk these assertions and set the record straight regarding the state of wireless competition and innovation in the U.S. today. These reports are must reading for Washington policymakers and FCC officials who are currently contemplating regulatory action.

First, Gerald Faulhaber and Dave Farber have a new report out entitled “Innovation in the Wireless Ecosystem: A Customer-Centric Framework.”  Here’s what Faulhaber and Farber find:

the three segments of the wireless marketplace (applications, devices, and core network) have exhibited very substantial innovation and investment since its inception. Perhaps more interesting, innovation in each segment is highly dependent upon innovation in the other segments. For example, new applications depend upon both advances in device hardware capabilities and advances in spectral efficiency of the core network to provide the network capacity to serve those applications. Further, we find that the three segments of the industry are also highly competitive. There are many players in each segment, each of which aggressively seeks out customers through new technology and new business methods. The results of this competition are manifest: (i) firms are driven to innovate and invest in order to win in the competitive marketplace; (ii) new business models have emerged that give customers more choice; and (iii) firms have opened new areas such as wireless broadband and laptop wireless in order to expand their strategic options.

They continue on to address the policy issues in play here and discuss the “consumer-centric” approach they recommend that the FCC adopt: Continue reading →

I really enjoyed my Second Life appearance on “Government’s Place in Virtual Worlds and Online Communities,” which was hosted by Metanomics.  You can watch the entire segment on the Metanomics site.  But the folks at Metanomics have also posted 6 clips from the show at YouTube that highlight some of the topics we discussed.  Here’s the list of clips and the videos:

Part 1: Are the Feds about to Regulate Second Life & Virtual Worlds?

Continue reading →

I debated PK’s Art Brodsky last week about net neutrality on the international news channel, RussiaToday. Here are a few of my key points of disagreement with Art:

  1. The glittering generality of “Neutrality,” once enshrined in law for one layer of the Internet will be extended, sooner or later, to other layers. As Adam and I have warned, “the same rationale would apply equally to any circumstance in which access to a communications platform is supposedly limited to a few ‘gatekeepers.'” We’re already seeing this with fights over application neutrality and device neutrality, and calls for search neutrality are growing.
  2. Art insists that antitrust suits work too slowly. But he doesn’t address the basic question of what standard should govern network management. Should it be “neutrality uber alles,” or, if we’re going to regulate in this fashion, why shouldn’t we ask what’s good for consumers—the standard proposed by PFF’s 2005 Digital Age Communications Act (DACA)? Neutrality isn’t always best!
  3. Common carriage regulation didn’t work well for railroads (contrary to popular myth) and it worked even less well for communications media, retarding the development of new services like faxes, Internet services and cell phones. Regulating broadband providers the same way will work even more poorly because they aren’t just “big dumb pipes” providing a plain vanilla service and incapable of innovation that can benefit consumers.