March 2010

Michiko Kakutani has a very interesting essay in the New York Times entitled, “Texts Without Contexts,” which does a nice job running through the differences between Internet optimists and pessimists, a topic I’ve spent a great deal of time writing about here. (See: “Are You An Internet Optimist or Pessimist? The Great Debate over Technology’s Impact on Society.”) She surveys many of the books I’ve reviewed and discussed here before by authors such as Neil Postman, Nick Carr, Cass Sunstein, Andrew Keen, Mark Helprin, Jaron Lanier, and others. She notes:

These new books share a concern with how digital media are reshaping our political and social landscape, molding art and entertainment, even affecting the methodology of scholarship and research. They examine the consequences of the fragmentation of data that the Web produces, as news articles, novels and record albums are broken down into bits and bytes; the growing emphasis on immediacy and real-time responses; the rising tide of data and information that permeates our lives; and the emphasis that blogging and partisan political Web sites place on subjectivity.

At the same time it’s clear that technology and the mechanisms of the Web have been accelerating certain trends already percolating through our culture — including the blurring of news and entertainment, a growing polarization in national politics, a deconstructionist view of literature (which emphasizes a critic’s or reader’s interpretation of a text, rather than the text’s actual content), the prominence of postmodernism in the form of mash-ups and bricolage, and a growing cultural relativism that has been advanced on the left by multiculturalists and radical feminists, who argue that history is an adjunct of identity politics, and on the right by creationists and climate-change denialists, who suggest that science is an instrument of leftist ideologues.

It’s a great debate, and a contentious one, of course.  Anyway, go read her entire essay.

My central lament in everything I have said so far about the Federal Communications Commission’s ambitious new National Broadband Plan is that, well, it’s just too ambitious!  The agency has taken an everything-plus-the-kitchen-sink approach to the issue and the sheer scope of their imperial ambitions is breathtaking. I’ve likened it to an industrial policy for the Internet because the agency is essentially trying to centrally plan and engineer from above virtually every aspect of America’s broadband future despite its proclamation that, “Technologies, costs and consumer preferences are changing too quickly in this dynamic part of the economy to make accurate predictions.” But very little humility seems to be on display throughout the 376-page blueprint, which includes dissertations on everything from privacy to child safety issues to set-top box regulation.

And then there’s Chapter 15 on “civic engagement,” which calls for a wide variety of things to “strengthen the citizenry and its government,” and to “build a robust digital media ecosystem.” Although some of the ideas floated in the chapter are harmless enough–and some, like the call for more open and transparent government, would actually be beneficial–for the life of me I don’t understand why any of this needs to be in a plan about broadband deployment and diffusion. Particularly bizarre is the call here for Congress to create “a trust fund for digital public media,” which would fund the “production, distribution, and archiving of digital public media.” It would apparently be funded by “the revenues from a voluntary auction of spectrum licensed to public television.” (see pgs. 303-4)

Look, if the FCC wants Congress to create the equivalent of PBS on Steroids, fine. Let’s have that debate. (In fact, I thought it was a debate that the FCC was already considering as part of its “Future of Media” effort.) But why, again, is this in a broadband plan? It’s a serious stretch to claim that this is somehow crucial to the task of getting more broadband out to the masses.  Moreover, should our government really be in charge of “building a robust digital media ecosystem”?  Here are a few reasons we might want to avoid having the government in the driver’s seat when it comes to charting the future course of America’s media sector.

Couple of media clips here regarding my thoughts about the FCC’s National Broadband Plan:

Also see my essays: “5 Regulatory Hot Potatoes That Could Derail the FCC National Broadband Plan,” “Will the FCC’s National Broadband Plan Really Be Costless?” and “The Best Quote in the FCC National Broadband Report.”

In the mix of yesterday’s FCC Broadband report release and today’s FTC Privacy Roundtable and Senate hearing on expanding FTC rulemaking authority, there’s a lot going on in Washington that impacts online commerce. And we heard particularly pointed comments about the future of .com at yesterday’s 25 Years of .Com Policy Impact Forum.

A panel about the Internet and privacy highlighted how the future of .com may be less about commerce and more about commissions, particularly the Federal Trade Commission.

Kara Swisher (D: All Things Digital) and Fred Wilson (a VC at Union Square Ventures) dug deep into online privacy issues. They decried the supposed privacy abuses of online companies, particularly by Google and Facebook. And while Kara is smart and well-informed, Fred Wilson was flippant, scattered, and skin-deep with many of his assertions—including when he accused Facebook of pulling off “the greatest privacy heist in history, and they got away with it!”

He’s referring to the changes Facebook made last December to the way users control their privacy settings (NetChoice defended Facebook’s actions on our blog). Facebook made some recommended changes based on where it sees its service going. Users (like me) could change these if they wanted. Some people complained that Facebook changed the default settings, which modified how users previously set some of their preferences.

But does changing the recommended defaults while still giving users a choice constitute a “heist”? Only if a heist is defined by whether a user happens to dislike the change. There certainly wasn’t any fraud or misappropriation, or any measurable consumer harm. Still, we heard from pro-regulatory privacy groups that filed a complaint urging the FTC to unleash its enforcement hammer.

There are legitimate debates on whether Facebook’s switch in privacy settings was clear and easy enough to understand for most users. But overblown rhetoric on privacy harm is hard to square with other concerns about  breaches, ID theft, and other abuses of data. Continue reading →

Here’s my favorite line in the FCC’s National Broadband Plan:

“Technologies, costs and consumer preferences are changing too quickly in this dynamic part of the economy to make accurate predictions.” (p. 42)

I wholeheartedly agree!  But does the agency really believe what it says?  Because as I am reading through this tome, all I see is one prediction and prognostication after another.  Indeed, in the very next paragraph the agency starts making predictions about how many homes will be served by DSL vs. cable vs. fiber years from now. And the section about set-top box regulation is chock-full of techno-crystal-ball gazing regarding what the future video marketplace should look like.

Apparently the FCC thinks that it’s impossible to predict the future… except when they are the ones doing the predicting.  Oh, the hubris of it all!

Progress Snapshot 6.7, The Progress & Freedom Foundation (PDF)

This week marks a pivotal point in the history of the Internet.  Monday was the 25th anniversary of the first .COM registration—and in some ways, the beginning of the commercial Internet.  Yesterday, the Federal Communications Commission unveiled its long-awaited National Broadband Plan, which proposes ambitious subsidies to encourage broadband deployment.  On the theory that unease about online privacy may discourage broadband adoption, the Plan also calls for increased regulation of how websites collect and use data from consumers.

The debate over how to regulate online data use has gone on for over a decade, leading to today’s final “Roundtable” in the “Exploring Privacy” series held by the Federal Trade Commission over the last three months.  The stakes in this debate are high: Data is the lifeblood of online content and services, and consumers will ultimately bear the cost of restrictions on data use in the form of reduced advertising funding for, and innovation in, online content and services.

That’s why this week’s most important technology policy event may ultimately prove to be today’s Senate Commerce Committee hearing on Rep. Barney Frank’s “Wall Street Reform and Consumer Protection Act of 2009” (H.R. 4173), which narrowly passed the House in December without a single hearing and no real debate.  Although the sprawling (273,579 word) bill is mostly famous for creating a Consumer Financial Protection Agency, it would also, in just 613 words, “put the FTC on steroids,” in the words of Jim Miller, FTC Chairman from 1981 to 1985.  With vastly expanded powers, the FTC could impose sweeping new regulation touching virtually every sector of our economy.

The current FTC chairman, Jon Leibowitz, has made clear his determination to step up regulation of online data use, advertising, “blogola,” and child protection, just to name a few of the hot topics in Internet policy.  While the FTC will no doubt continue to push for increased statutory authority, such as the online privacy bill reportedly being drafted by House Commerce Internet Subcommittee Chairman Rick Boucher (mandating opt-in for data collection), Chairman Leibowitz may be able to implement most of his radical Internet regulatory agenda using the new powers conferred on his agency in a bill (H.R. 4173) few realize has anything to do with Internet policy. Continue reading →

The Federal Communications Commission released the full version of its National Broadband Plan yesterday — all 11+ megabytes of it. A quick read (!) of the 300+ page document reveals that the problem of broadband “availability” is not nearly as big as the numbers highlighted in the plan would lead one to believe. If you’re careful to read the caveats and the numbers in the plan that don’t get a lot of emphasis, the problem of people who lack access to broadband is quite manageable.

The plan states that 14 million Americans lack access to terrestrial broadband capable of delivering a download speed of 4 megabits per second (mbps). Making broadband of this speed available to all Americans would cost $24 billion more than the likely revenues from sale of the service.

(To calculate the dollar figure, the report’s authors estimated the stream of future costs and revenues from extending 4 mbps broadband to places where it does not currently exist, then “discounted” them to present values to make the costs and revenues comparable.  The $24 billion “funding gap” is thus a present discounted value.)
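The present-value arithmetic described above can be sketched in a few lines of Python. The cash flows and discount rate below are hypothetical placeholders, not figures from the plan; only the method, discounting each future year's costs and revenues back to today's dollars before comparing them, mirrors the report's approach.

```python
# Illustrative sketch of the "funding gap" calculation. All cash-flow
# figures here are made up for demonstration; they are not from the plan.

def present_value(cash_flows, discount_rate):
    """Discount a list of annual cash flows (years 1, 2, ...) to today's dollars."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

# Hypothetical per-year costs and revenues of extending service ($ billions)
costs = [10, 8, 6, 5, 5]
revenues = [1, 2, 3, 4, 5]
rate = 0.07  # assumed discount rate

# The "funding gap" is the present value of costs minus that of revenues
gap = present_value(costs, rate) - present_value(revenues, rate)
print(f"Present-value funding gap: ${gap:.1f} billion")
```

Because distant dollars are discounted more heavily, front-loaded costs weigh more than back-loaded revenues, which is why a project can show a positive gap even when nominal revenues eventually catch up to nominal costs.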

Several key assumptions drive these estimates.

First, the plan explicitly declined to include satellite when it measured availability of broadband.

Second, even if the plan’s authors wanted to include satellite, the choice of the 4 mbps benchmark also excludes all but the most expensive residential satellite broadband plans.  Perhaps more importantly, the 4 mbps benchmark also allows the plan to ignore “third generation” wireless Internet as an option for households located in places that don’t have wired Internet. 

These are important omissions, because the plan reports that 98 percent of Americans live in places that have 3G wireless Internet. On the other hand, 95 percent of Americans have access to wired broadband capable of delivering 4 mbps downloads. If we include 3G wireless Internet, only 2 percent of Americans live in places where broadband is not available, rather than 5 percent. In other words, including wireless broadband in the calculation cuts the size of the problem by more than half!  If we include satellite, the number of Americans who don’t have broadband available must be truly minuscule.

Why is 4 mbps the goal, anyway? The plan does not explain this in great detail, but it looks like 4 mbps is the goal because that’s the average speed broadband subscribers currently receive in the US. As a result, the plan picked 4 mbps as the speed experienced by the “typical” broadband user in this country. The only problem is, other figures in the plan show that 4 mbps is not the speed experienced by the “typical” US broadband user. The same graph that shows the average broadband speed is 4.1 mbps (on page 21) also shows that the median speed is 3.1 mbps. Half of broadband users have speeds above the median, and half have speeds below the median; that’s the mathematical definition of a median. When the median is 25 percent below the average, it’s simply not accurate to say that the average shows the speed that a “typical” user receives. The typical user receives a speed slower than 4 mbps.
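The mean-versus-median distinction is easy to demonstrate. The sample below is made up, deliberately chosen so that one fast connection pulls the average up to 4.1 mbps while the median stays at 3.1 mbps, matching the figures the plan reports on page 21.

```python
# Hypothetical sample of subscriber speeds (mbps), skewed by one fast outlier
speeds = sorted([1.5, 2.0, 2.5, 3.1, 3.5, 4.0, 12.1])

mean = sum(speeds) / len(speeds)   # pulled upward by the 12.1 mbps outlier
median = speeds[len(speeds) // 2]  # middle element of the odd-length sorted list

print(f"mean = {mean:.1f} mbps, median = {median} mbps")
```

In a right-skewed distribution like this, the mean always sits above the median, so a benchmark set at the mean necessarily exceeds what more than half of users actually experience.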

The 4 mbps figure is also way above the goals other nations have set for broadband; the plan shows that other countries typically seek to ensure that all citizens can connect to broadband at speeds between 0.5 and 2 mbps. A goal in that neighborhood would surely allow most 3G wireless services to count as broadband when estimating availability.

That $24 billion “funding gap” also deserves comment. That’s the amount of subsidy the plan estimates will be required to make 4 mbps broadband available to all Americans.  If you read the plan carefully, you will also find that a whopping $14 billion of that is required to bring broadband to the highest-cost two-tenths of one percent of American housing units — 250,000 homes  (page 138). That works out to $56,000 per housing unit!

One wonders whether most Americans would be willing to spend $56,000 per home to ensure that these last few folks can get broadband that’s as fast as the FCC’s broadband planners have decided they deserve. Here’s another option. A basic satellite broadband package costs about $70 per month. Giving these 250,000 expensive-to-reach households satellite broadband would only cost about $200 million a year. It would cost less than half of that if we actually expect these consumers to pay part of the cost — maybe the same $40 per month the rest of us pay in urban and suburban areas?
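The arithmetic above checks out and can be verified in a few lines. All inputs are figures quoted in the post; the $40 per month consumer contribution is the post's own assumption.

```python
# Figures from the post: 250,000 high-cost homes, $14 billion to wire them,
# ~$70/month basic satellite service, $40/month assumed consumer contribution.
homes = 250_000
wired_cost = 14e9                 # $ to wire the hardest-to-reach 0.2% of homes
per_home = wired_cost / homes     # wired build-out cost per housing unit

satellite_price = 70              # $/month, basic satellite package
consumer_share = 40               # $/month the subscriber might pay

full_subsidy = homes * satellite_price * 12                       # government pays all
partial_subsidy = homes * (satellite_price - consumer_share) * 12 # subscriber chips in

print(f"Wired cost per home:       ${per_home:,.0f}")
print(f"Full satellite subsidy:    ${full_subsidy / 1e6:.0f} million/year")
print(f"Partial satellite subsidy: ${partial_subsidy / 1e6:.0f} million/year")
```

The exact figures are $56,000 per home, $210 million per year for a full subsidy, and $90 million per year with cost sharing, consistent with the post's rounded "about $200 million" and "maybe $100 million."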

That cuts the broadband “funding gap” to $10 billion, plus maybe $100 million a year for the satellite subscriptions. If we abandon the arbitrary 4 mbps definition of “acceptable” broadband speed, so that 3G wireless counts as broadband, the gap would be maybe half that size (since more than half of the people who don’t have wired broadband available do have 3G wireless available).

And guess what — the broadband plan identifies about $15.5 billion in current subsidies that the FCC could repurpose to support broadband. In other words, the FCC has the ability to solve the broadband funding gap all by itself, without a dime of new money from taxpayers, telephone subscribers, or broadband subscribers!

I’m surprised the plan didn’t point that out; coulda made the five commissioners look like real heroes.

Over on the WashingtonWatch.com blog, I’ve laid out in the simplest terms I could what’s going on in terms of procedure with health care overhaul legislation. The post, called “What is Deeming, Anyway?“, comes in at a mere 900 words… If you’re a real public policy junkie, you might like it.

But what about the transparency oriented processes that President Obama and leaders like Speaker Pelosi promised the public? Recall that the Speaker promised to post the health care bill online for 72 hours before a vote back in September.

There was debate about whether she stuck to her promise then. And it was probably a one-time promise. It’s almost certain that she will not do so now. If she lines up the votes to pass the bill, the vote will happen. Right. Then.

What about President Obama’s promise to put health care negotiations on C-SPAN? The daylong roundtable debate on health care was an engaging illustration of what happens when you do transparent legislating. Voters got a clearer picture of where each side stands—and perhaps saw that there actually is some competence on both sides of the aisle. Some competence.

The health care negotiations going on right now are the ones that matter. This is when the most important details are being hammered out. This is when the bargaining that draws the public’s ire is happening. But I’m not seeing it on C-SPAN.

President Obama’s promise may have been naive, but naivete doesn’t excuse breaking it. The inside negotiations going on this week represent an ongoing violation of the president’s C-SPAN promise.

And there’s good reason to anticipate that the president will violate his Sunlight Before Signing promise as well. This was his promise to post bills online for five days after he receives them from Congress before signing them into law. Continue reading →

I’m livetweeting today’s final FTC Privacy Roundtable (check out the #FTCPriv hashtag on Twitter). Check out the day’s agenda or watch the webcast here. Adam Thierer and I expressed our concerns about the rush to regulation at the First Roundtable back in December—see my written comments and Adam’s summary of his remarks. David Vladeck, Director of the Bureau of Consumer Protection, offered the following summary of the Roundtable process at the kick-off this morning:

  1. Benefits & risks of technology. “March of technology has blurred and threatens to obliterate the distinction between PII [personally identifiable information] and non-PII…. It’s getting harder and harder for users to choose anonymity.”
  2. Privacy challenges raised by emerging business models. What do consumers know? Consumers are often presented with confusing and unfamiliar situations. Consumers understand little about how their information is handled.
  3. Innovation in disclosure. Industry is testing privacy icons.
  4. Privacy policies are too vague, too long, too complicated and too hard to find. We need effective ways to disclose what information is being collected and to give consumers a meaningful way to control its use. There’s no way to put the genie back in the bottle once information has been shared.

On the critical question of next steps, Vladeck claims the agency isn’t certain where it will go and plans to “sit back” and think about the detailed record before making public a set of detailed recommendations on which the public will be invited to provide input. I’d like to believe him and I hope the agency really does think long and hard about the evidence provided in this process as to the trade-offs inherent in increased regulation, the complexity of this space, and the need for a cautious approach when it comes to tinkering with the data flows that are the lifeblood, both technological and financial, of the Internet. But based on their recent public statements, I fear that Vladeck and FTC Chairman Jon Leibowitz have already made up their minds about the need for regulation, and that this process is really just paving the way for a report this summer that will call for sweeping new legislation—just as the FTC did back in its 2000 Report to Congress. Continue reading →

Here’s a brief audio clip that PFF’s new press director Mike Wendy helped me put together in which I outline some of my reservations with the Federal Communications Commission’s (FCC) just-released National Broadband Plan. It’s just 4 minutes. Just click the play button below.

[display_podcast]