Back on St. Paddy’s Day, I offered a few comments on the “funding gap” identified in the FCC’s just-released national broadband plan. Since then, the FCC has put out a notice of proposed rulemaking and notice of inquiry seeking public comment on reforms that would allow its universal service fund to subsidize broadband. The FCC has also released a 137-page technical paper that details how the staff calculated the broadband “availability gap” and funding gap.

So, now there’s more to chew on, and another round of online mastication would be timely given the open FCC proceeding.  Here are three big issues:

1. Definition of broadband

The plan announced a goal of making broadband with actual download speeds of 4 mbps available to all Americans.  In the plan, this goal appeared to be based on the actual average speed of broadband service (4 mbps), even though the median speed is just 3.1 mbps (p. 21). The technical paper, however, also projects that, based on past growth rates in broadband speed, “the median will likely be higher than 4 mbps by the end of 2010.” (p. 43)  Contrary to what I thought back in March, it appears the FCC is justifying the 4 mbps goal based on the median speed, not the average. 
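To see why the average can sit above the median here, consider a toy calculation (the subscriber speeds below are made up purely for illustration, not FCC data): a few very fast connections pull the mean up while barely moving the median.

```python
# Hypothetical subscriber speeds (illustrative only, not FCC data)
speeds_mbps = [1.5, 2, 2, 3, 3, 3.1, 4, 5, 6, 20]

mean = sum(speeds_mbps) / len(speeds_mbps)

ordered = sorted(speeds_mbps)
mid = len(ordered) // 2
median = (ordered[mid - 1] + ordered[mid]) / 2  # even count: average the middle pair

print(f"mean   = {mean:.2f} mbps")    # ~4.96 mbps -- dragged up by the 20 mbps outlier
print(f"median = {median:.2f} mbps")  # 3.05 mbps  -- the "typical" subscriber
```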

The technical report also argues that 4 mbps is necessary to run high-speed video, which a “growing portion of subscribers” (not including me) apparently use. (p. 43) So, if the broadband plan achieves its goals, every American will have the opportunity to subscribe to Internet access capable of delivering high-quality porn! Fortunately, the technical report uses a different and more productive example — streamed classroom lectures.

Reasonable people could still question whether the median is the appropriate benchmark to guide government actions intended to equalize broadband access opportunities.  The technical report includes a helpful graphic that shows the most common broadband speed users actually buy is 2 mbps, and 38 percent of all subscribers have speeds of 2 mbps or less. (p. 43) The FCC staff’s model calculates that if the goal were set at 1.5 mbps, the number of “unserved” households would fall from 7 million to 6.3 million, and the required subsidy would fall from $18.6 billion to $15.3 billion. (p. 45) 

If almost half of broadband subscribers have decided that something less than 4 mbps is perfectly adequate, that suggests 4 mbps may go far beyond what is necessary to ensure that all Americans have access to basic broadband service. So, that 4 mbps goal is still questionable.
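For what it’s worth, a bit of back-of-the-envelope arithmetic on the model figures cited above puts the threshold choice in per-household terms: raising the goal from 1.5 mbps to 4 mbps reaches roughly 700,000 additional households at a much higher cost per household added. (The sketch below uses only the numbers from the technical paper quoted above.)

```python
# Model figures from the FCC technical paper (p. 45)
unserved_4mbps, subsidy_4mbps = 7.0e6, 18.6e9      # 4 mbps goal
unserved_1_5mbps, subsidy_1_5mbps = 6.3e6, 15.3e9  # 1.5 mbps goal

per_hh_4 = subsidy_4mbps / unserved_4mbps          # ~$2,657 per unserved household
per_hh_1_5 = subsidy_1_5mbps / unserved_1_5mbps    # ~$2,429 per unserved household

# Extra subsidy per additional household reached by raising the goal to 4 mbps
marginal = (subsidy_4mbps - subsidy_1_5mbps) / (unserved_4mbps - unserved_1_5mbps)

print(f"${per_hh_4:,.0f} vs ${per_hh_1_5:,.0f}; marginal cost ~${marginal:,.0f} per household")
```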

2. Omission of 3G wireless

The 4 mbps goal allowed the FCC to ignore third generation wireless when it estimated the “availability gap.” The technical paper shows that 95 percent of households have 4 mbps broadband available. About 3 percent of households have no broadband available, while 2 percent have broadband available at speeds ranging from 384 kbps – 3 mbps. (p. 17)  That 2 percent probably includes households with slow DSL and 3G wireless.

The technical paper also revealed that it did not include service from fixed Wireless Internet Service Providers because of limitations in the available data. (p. 25) These providers serve 2 million subscribers in rural areas (p. 66), so the omission potentially accounts for a large chunk of the households considered “unserved.” No telling how many, since apparently the data aren’t available.

Back in March, I guesstimated that the 7 million household “availability gap” might overstate the size of the problem by more than half, simply because 3G wireless is available to 98 percent of American households. Looks like my guesstimate is pretty much in line with the more detailed figures in the FCC technical paper.
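Here is the rough arithmetic behind that guesstimate, as a sketch. The household total below is my own round assumption, not a figure from the technical paper.

```python
total_households = 115e6   # ASSUMPTION: roughly 115 million U.S. households circa 2010
gap_4mbps = 7e6            # FCC "availability gap" under the 4 mbps definition
coverage_3g = 0.98         # share of households with 3G wireless available

without_3g = total_households * (1 - coverage_3g)  # ~2.3 million households lack 3G
shrinkage = 1 - without_3g / gap_4mbps             # ~0.67

print(f"{without_3g/1e6:.1f}M households lack 3G; "
      f"counting 3G shrinks the gap by about {shrinkage:.0%}")
```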

3. Role of satellite

The broadband plan did not count satellite broadband when assessing availability. The technical paper (pp. 89-94) provides a much more detailed explanation of the capacity constraints the FCC staff believes will prevent satellite broadband from serving more than a couple million subscribers. (The current satellite subscriber base is approximately 900,000.)

The technical paper pointed out that satellites are expensive and take three years to build. (p. 92) To put the time frame in perspective, that’s about as long as the FCC and the Federal-State Joint Board on Universal Service have been discussing universal service subsidies for broadband. Lord knows we shouldn’t make consumers wait that long!

There is, however, something a little asymmetrical about the way the FCC staff treated satellite and other forms of broadband. The point of estimating the broadband availability gap was to determine how much of a subsidy would be required to induce the private sector to build the infrastructure to close the gap. But while the study assumed that the subsidies would call forth the requisite cable, DSL, and wireless infrastructure within some unnamed but acceptable time frame, it decided that three years is just too long to wait for satellite infrastructure to expand. So, satellite plays a minimal role in the FCC’s plan.

Yet even this minimal role has a big impact. To its credit, the technical paper calculated how satellite broadband could dramatically slash the cost of serving the most expensive 250,000 homes. It estimated (pp. 91-92) that the net present value of subsidies required to serve these homes with satellite would range between $800 million and $2 billion — compared to a $13.4 billion subsidy required to serve these homes with terrestrial broadband. (This implies an annual subsidy of $105-255 million, which is pretty close to my March 17 guesstimate of $100-200 million.)
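For the curious, the jump from the NPV range to an annual subsidy can be reproduced with a simple annuity calculation. The discount rate and satellite lifetime below are my own illustrative assumptions (the paper’s exact financial parameters aren’t restated here), but plausible values land in the same ballpark as the $105-255 million figure.

```python
npv_low, npv_high = 0.8e9, 2.0e9  # NPV of satellite subsidies (pp. 91-92)
rate, years = 0.10, 15            # ASSUMPTIONS: discount rate and satellite lifetime

# Present value of $1 per year for `years` years at `rate`
annuity_factor = (1 - (1 + rate) ** -years) / rate

annual_low = npv_low / annuity_factor    # ~$105 million per year
annual_high = npv_high / annuity_factor  # ~$263 million per year

print(f"implied annual subsidy: ${annual_low/1e6:.0f}M to ${annual_high/1e6:.0f}M")
```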

So, satellite broadband could help prevent costs from skyrocketing, even assuming it plays only the limited role envisioned in the FCC staff’s analysis.

Final reminder about tomorrow’s PFF event on “Can Government Help Save the Press?”  Again, the event will take place from 9:00 a.m. to 11:00 a.m. in the International Gateway Room, Mezzanine Level of the Ronald Reagan Building at 1300 Pennsylvania Ave., N.W. here in DC.  This event will examine the FCC’s “Future of Media” proceeding and debate what role the government should play (if any) in sustaining struggling media enterprises, “saving journalism,” or promoting more “public media” or “public interest” content.  The event includes a keynote address by Ellen P. Goodman, who is a Distinguished Visiting Scholar at the FCC and is assisting the FCC’s Future of Media team.  After Ellen Goodman brings us up to speed on where the FCC’s Future of Media process stands, we’ll hear from a diverse panel of experts that includes:

Hope to see some of you tomorrow morning at 9:00!

In this week’s episode of the Surprisingly Free Podcast, I talk to David Post, the I. Herman Stern Professor of Law at the Beasley School of Law at Temple University and author of In Search of Jefferson’s Moose: Notes on the State of Cyberspace. He discusses the general state of the internet and contrasts a decentralized Jeffersonian approach to the internet with a more centralized Hamiltonian one. He also addresses netizenship, open vs. closed systems, and online global relations.

Do check out the interview, and consider subscribing to the show on iTunes. Past guests have included James Grimmelmann on online harassment and the Google Books case, Michael Geist on ACTA, Tom Hazlett on spectrum reform, and Tyler Cowen on just about everything.

Coming up in the next few weeks we’ll have TLF’s own Adam Thierer, as well as Nick Carr, Clay Shirky, Gina Trapani, and many more great guests! So what are you waiting for? Subscribe!

Declan McCullagh of CNet News reports (“Congress May Roll Dice, Legalize Net Gambling“) that some in Congress are reconsidering the wisdom of prohibitions on Internet gambling, which we have discussed here many times before. Declan notes there’s another hearing on the issue today and Rep. Barney Frank (D-MA) will be discussing his continuing effort to allow Internet casinos to obtain licenses from and be regulated by the federal government:

Frank, who will be testifying during Wednesday’s hearing, says that because nearly all states already permit some form of traditional gambling–including lotteries, betting on horse and greyhound racing, and sports wagering — the federal government should legalize and regulate the online equivalents. Instead of a blanket legalization, his legislation would require the Treasury Department to police the industry and ensure that it takes adequate steps to identify minors and compulsive gamblers.

My TLF colleague Tom Bell has done seminal work in this field and you will definitely want to check out his recent essay, “The UnInGEn-ious Act’s Non-Impact on Internet Gambling” and his classic 1999 Cato white paper, “Internet Gambling: Popular, Inexorable, and (Eventually) Legal.”  What Tom has done better than anyone else is to show that, as is the case with almost every “market activity devoted to the pursuit of happiness,” eventually the law will adjust to accommodate these activities.  It may take some time for the law to adjust, but it will.

Incidentally, I loved this little gem of a quote that Declan included in his story from the activist group Focus on the Family, which says of this effort to legalize online gambling: Continue reading →

If you happen to belong to the DC Bar’s Computer and Telecommunications Law Section, I hope you will consider casting one of your three votes for me when you complete your ballot for the Section’s Steering Committee—which you probably received in the mail today (as I did). Ballots must be received by June 4. I used my 75 words for the following mini-bio on the ballot:

I direct the Internet policy program at The Progress & Freedom Foundation.  I practiced communications and cyber law at Latham & Watkins and Lawler Metzger Milkman & Keeney after a district clerkship and graduating Virginia Law in 2004. I have particular expertise with the FCC, FTC and ICANN, and in online privacy, advertising, e-commerce, free speech, Internet governance and satellite law. I am eager to use my panel-planning experience to help the Section do more events.

Find more on my work here. This is a volunteer position that allows lawyers interested in tech policy to give back to the legal/policy community here in DC, primarily by offering the high-quality CLE programming for which the DC Bar is so well-respected (and which is open to all).

I hope to have the opportunity to serve. While there are a number of fine candidates, I plan on casting my other two votes for NetChoice’s Braden Cox, my fellow TLF co-blogger, and Grace Koh, whom I have gotten to know through her work on privacy and other policy issues at Cox Enterprises. I’m Berin Szoka, and I endorse this ad.

I’ve complained mightily (here and here) about the agonizing technological awfulness that was, at least until recently, the website of the FCC (you know, one of the two federal agencies—the other being the FTC—that think they have the expertise necessary to regulate the Internet). My point wasn’t just that the FCC’s website made it very difficult to find and access data, but that this was a serious problem for transparency in government. I have to give the agency credit for improving many aspects of its site, though much work still remains to be done.

But then there are all the other agencies of our sprawling regulatory Leviathan! And in particular, the Securities and Exchange Commission (SEC), which processes—crudely—huge amounts of financial data. A new report from House Oversight and Government Reform Committee Ranking Member Darrell Issa released today describes just how severe the SEC’s problems are:

The Commission’s securities disclosure processes are technologically backward.  It reviews corporate filings manually, using printouts, pencils, and calculators.  It has never developed the ability to perform large-scale quantitative analysis to find fraud.  Commission staff use Google Finance, Yahoo! Finance, and other commercially-available resources to analyze corporate filings.  If the Commission had a robust database of the financial information filed by its registrants, it could automatically prioritize the thousands of tips and complaints it receives.  But no such database has ever been constructed.

Hence the biting title of the report: The SEC: Designed for Failure. Ouch! It’s really amazing how, when regulators fail to protect consumers, the default response by most in Congress is to assume that only sweeping new powers will fix the problem (which is what “financial reform” legislation would do) instead of, say, bringing the agency into the 21st century.

Similarly, there’s a move afoot to give the FTC vast new powers across the board or to protect our privacy online (from evil companies that don’t respect the privacy promises they made to consumers) with little thought given to data-driven technological solutions through user empowerment. Continue reading →

In light of the discussion draft of privacy legislation recently released by Chairman Rick Boucher (our comments here and here), PFF is holding a special “Nuts & Bolts” luncheon briefing on the technical underpinnings of the ongoing privacy policy debate on Monday, May 24, 2010, 12-2 p.m. in 2123 Rayburn House Office Building.

Our panel of distinguished experts will provide an overview of the technical mechanics of online advertising and associated concerns about data collection, and discuss challenges and opportunities for empowering privacy-sensitive consumers to manage their online privacy without breaking the advertising business model that sustains most Internet content and services. I’ll moderate a terrific panel:

To Register: Space is limited, so an RSVP is required to attend.  Please register online here. Event questions should be addressed to Adam Marcus at amarcus@pff.org.  Media inquiries should be directed to Mike Wendy at mwendy@pff.org.

Adam Thierer & I offered our initial thoughts upon first reading the discussion draft of the privacy bill introduced by Reps. Rick Boucher (D-VA) & Cliff Stearns (R-FL). In PFF’s latest TechCast, I sat down to discuss the bill and my concerns about it with PFF’s VP for Communications, Mike Wendy:

Stay tuned for more from us on this. PFF plans to file written comments, as solicited by the bill’s authors, by June 4. For more on this, check out our comments to the FTC last December on these issues.

Subscribe now to PFF’s TechCast podcast (generally 5-8 minutes) by RSS or through iTunes!

I have a lot of respect for danah boyd and have had the pleasure of interacting with her when we both served on the Harvard online child safety task force, and at other times. She’s a very gifted social media researcher. But there are three big problems with her argument that Facebook should be treated as a “utility” and regulated as such. (See: “Facebook is a Utility; Utilities Get Regulated.”)

What a Utility Is, and Isn’t

First, and most obviously, the term “utility” has a fairly well-understood meaning in the economics literature, and Facebook does not possess the same qualities:

  • A utility is usually something thought to be an “essential facility,” in that the service or network in question is essentially unique and has few (or no) good alternatives. (Regulators typically require “non-discriminatory access” for that reason.)
  • The service in question is also typically regarded as being something approximating a “life-essential” service, like water or electricity.  (Regulators typically require all to be served in a fairly uniform fashion for that reason.)
  • The service is also something that typically entails significant fixed costs and that requires us to pay good money to use. (Regulators typically impose price regulation for fear of “gouging” for that reason.)

Again, Facebook possesses none of those qualities. Continue reading →

Google has just announced that it is ending web-only sales of its unsubsidized Nexus One smartphone. The company had hoped to create a very different kind of business model for mobile phone retailing, but it just didn’t work, so it is ending the experiment.

There are a couple of reasons that it probably didn’t work, but the one thing that just about everyone is pointing back to is the difficulty of acclimating Americans to the actual cost of an unsubsidized handset. Over at Ars Technica, Peter Bright points out:

A one-off payment of $529 is hard to stomach. In many countries, we’re not accustomed to paying so much for mobile phones, as normally their true cost is hidden—we pay less up front and commit to paying a monthly fee for 12-24 months. Only those brave souls who were willing to stump up for the early termination fee would get any idea of the true cost of their handset. In a world of subsidized handsets, then, the Nexus one felt very expensive. It’s true that SIM-only contracts are cheaper than with-handset ones, but the difference rarely feels significant enough to justify buying a full-price phone—much better to pay a little bit more each month and avoid the up-front cost. Even if you do the math and work out that the Google way is cheaper, there’s still the unpleasant prospect of spending so much at once.
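Bright’s “do the math” point is easy to make concrete with a two-year total-cost sketch. Only the $529 unsubsidized price below comes from the story; the subsidized handset price and the monthly rates are hypothetical round numbers. Even when the unsubsidized route wins on total cost, all of its pain arrives on day one.

```python
months = 24

unsubsidized_phone = 529  # Nexus One web price (from the post)
sim_only_monthly = 60     # HYPOTHETICAL SIM-only plan

subsidized_phone = 179    # HYPOTHETICAL subsidized handset price
contract_monthly = 80     # HYPOTHETICAL with-handset plan

total_unsubsidized = unsubsidized_phone + sim_only_monthly * months  # $1,969
total_subsidized = subsidized_phone + contract_monthly * months      # $2,099

print(f"unsubsidized total: ${total_unsubsidized:,}  subsidized total: ${total_subsidized:,}")
```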

And Kevin C. Tofel of GigaOm concludes:

Continue reading →