
Better late than never, I’ve finally given a close read to the Notice of Inquiry issued by the FCC on June 17th. (See my earlier comments, “FCC Votes for Reclassification, Dog Bites Man.”) In one sense the contents held no surprises; the Commission’s legal counsel and Chairman Julius Genachowski had both published comments more than a month before the NOI that laid out the regulatory scheme the Commission now has in mind for broadband Internet access.

Chairman Genachowski’s “Third Way” comments proposed an option that he hoped would satisfy both extremes.  The FCC would abandon efforts to find new ways to meet its regulatory goals using “ancillary jurisdiction” under Title I (an avenue the D.C. Circuit had wounded, but hadn’t actually exterminated, in the Comcast decision), but at the same time would not go as far as some advocates urged and put broadband Internet completely under the telephone rules of Title II.


The Federal Communications Commission has an open proceeding in which it seeks advice on how to repurpose universal service subsidies for phone service in high-cost areas to subsidize broadband instead. The FCC apparently wants to subsidize broadband with a minimum download speed of 4 megabits per second (mbps) and an upload speed of 1 mbps. These are the goals proposed in the commission’s National Broadband Plan.

I’m no lawyer, but I wonder if the FCC can do this legally. Section 254 of the Telecommunications Act of 1996 lays out criteria the FCC is supposed to consider when it decides whether to provide universal service subsidies for new services in addition to phone service. One of the criteria is that the new service must be subscribed to by a “substantial majority” of residential consumers.

Sixty-five percent of Americans have broadband at home. (National Broadband Plan, p. 167)  But a minority of residential customers subscribe to broadband that meets the FCC’s 4 mbps/1 mbps definition. According to the FCC’s Omnibus Broadband Initiative technical report on the “Availability Gap” (p. 43), 48 million subscribers have download speeds of 4 mbps or higher. More subscribers – 53 million – have broadband download speeds of 3 mbps or lower. And 35 percent of Americans have no broadband at all. These figures imply that a “substantial majority” of Americans have not subscribed to broadband that meets the National Broadband Plan’s proposed definition.

Based on figures in the technical report, I calculated that approximately 59 percent of Americans subscribe to broadband with a download speed of 768 kbps or higher. Perhaps this figure qualifies as a “substantial majority,” but surely the 4 mbps/1 mbps definition does not.

A reasonable person might also question whether even 59 percent counts as a “substantial majority” for the purpose of declaring broadband a service eligible for subsidy. Surely Section 254 requires a “substantial majority” in part to ensure that consumers who have chosen not to subscribe to a service do not bear the injustice of having to subsidize the provision of that service to others. It is clear from the FCC’s figures that most of the 35 percent of American households without broadband have it available but choose not to subscribe. Therefore, subsidizing even 768 kbps broadband would force many consumers to pay universal service assessments to provide others with a subsidized service that they themselves have decided is not worth the cost.

Wait and see how the FCC addresses this issue once it starts creating a universal service program for broadband.

National Economic Council Director Lawrence Summers made a major policy speech yesterday at the New America Foundation, announcing the administration’s plan to find an additional 500 megahertz of spectrum for wireless broadband service by the end of the decade. The spectrum will come from two places: federal agencies that currently under-utilize their spectrum, and commercial users who volunteer to participate in “incentive auctions.”

In an incentive auction, the current spectrum user receives part of the proceeds in exchange for making the spectrum available for reallocation. Within the current US system of spectrum allocation, it’s about as close as we can come to allowing spectrum holders to sell their spectrum licenses to someone else who can put the spectrum to a more valuable use. 

Summers even mentioned broadcasters specifically, noting that a local television station with a few hundred million dollars of revenue may currently control spectrum worth hundreds of millions of dollars. Federal agencies would get to use some of the proceeds to adopt “state-of-the-art communications.” Presumably this would include new equipment that doesn’t use so much spectrum.

In his speech, Summers gave appropriate credit to the Federal Communications Commission, which surfaced many of these ideas in its National Broadband Plan. Even more appropriately, the former Harvard University president and academic economist assigned proper credit for the original source of the idea: 

Most of the freed-up spectrum will be auctioned off for use by mobile broadband providers. As the great law and economics scholar Ronald Coase originally pointed out, auctions ensure that spectrum is devoted to its most productive uses because it is determined by investors’ willingness to pay for it.

There are, of course, a few unanswered questions. How much of the spectrum will actually get auctioned for mobile broadband, rather than reserved for unlicensed use? Will the buyers have to use the spectrum for mobile broadband, or will the license be sufficiently broad that they could use it for other forms of personal communication that perhaps haven’t even been invented yet? Do we really have to wait ten years for this? Will the Ronald Coase Institute get any royalties for the government’s use of its namesake’s intellectual property? (Academics will recognize the joke in the last question.)

For now I’ll just say, “Bravo, Dr. Summers!”

Not surprisingly, FCC Commissioners voted 3 to 2 today to open a Notice of Inquiry on changing the classification of broadband Internet access from an “information service” under Title I of the Communications Act to “telecommunications” under Title II.  (Title II was written for telephone service, and most of its provisions pre-date the breakup of the former AT&T monopoly.)  The story has been widely reported, including posts from The Washington Post, CNET, Computerworld, and The Hill.

As CNET’s Marguerite Reardon counts it, at least 282 members of Congress have already asked the FCC not to proceed with this strategy, including 74 Democrats.

I have written extensively about why a Title II regime is a very bad idea, even before the FCC began hinting it would make this attempt.  I’ve argued that the move is on extremely shaky legal grounds, usurps the authority of Congress in ways that challenge fundamental Constitutional principles of agency law, would cause serious harm to the Internet’s vibrant ecosystem, and would undermine the Commission’s worthy goals in implementing the National Broadband Plan.  No need to repeat any of these arguments here.  Reclassification is wrong on the facts, and wrong on the law.

Today, the Federal Communications Commission (FCC) voted along party lines to adopt a Notice of Inquiry opening a new proceeding to regulate the Internet by reclassifying it under Title II of the Communications Act. FCC Chairman Julius Genachowski calls this his “Third Way” plan. In a PFF press release, I issued the following response:

In its ongoing ‘by-any-means-necessary’ quest to regulate the Internet via Net Neutrality mandates, Chairman Genachowski’s FCC continues to flout the rule of law and magically invent its own authority as it goes along. If this Chairman wants to bring the Net under his thumb and regulate broadband networks like plain-vanilla public utilities, he should ask Congress for the authority to pursue such imperial ambitions. As the law stands today, the FCC has no such authority. Indeed, the unambiguously deregulatory thrust of the Telecom Act of 1996 stands in stark contrast to Chairman Genachowski’s outdated vision for Big Government Broadband. The FCC stands on the cusp of killing one of the great deregulatory success stories of modern economic history by reviving the discredited regulatory industrial policies of the 19th Century. The revisionism about that epoch is dead wrong: Price controls and protected markets limited choice and stifled innovation. With the agency rolling back the regulatory clock in this fashion, today marks the beginning of the Internet’s “Lost Decade” of stymied investment, innovation, and job creation as all sides wage battle over the legality of reclassification and its implementation.

This is a post for all those broadband fans out there who want to talk about something today besides the Federal Communications Commission’s decision to take comments on which legal classification it should use to regulate broadband.

A recent FCC survey revealed that 80 percent of home broadband users do not know the speed of their broadband service. I can easily imagine how this statistic could be spun to “prove” that consumers are woefully uninformed and the broadband market must be plagued with “market failures” because consumers do not have even the basic information they need to make intelligent decisions.

Before we go down that road, let me explain, based on my own experience, why this is a non-issue.

I’m part of that 80 percent. I do not know the speed of my broadband service at home.  I know that when I signed up several years ago, I selected the slowest and cheapest broadband speed the provider offered.  I also know that this speed is still plenty fast for anything we need to do at home (and usually faster than the speed at my university office). I remain blissfully ignorant of the actual speed, even though it would be very easy for me to find out by looking at the materials I received when I signed up or checking the provider’s web site online.

In economic jargon, I am “rationally ignorant” of my home broadband speed. I don’t know (or remember) the speed, but to me this information is not worth the 45 seconds it would take me to find out. And that also means any FCC initiatives to “improve consumer information” or “educate” me about it will not, for me, be worth the time and money the FCC might spend on them.

If some of our Internet applications were not working in a satisfactory manner, we would probably do an online speed test, check to see what other speeds our provider offers, and check offers from competing providers. All of these steps would be easy and would require no FCC policy initiatives to facilitate (beyond making sure that the providers aren’t lying about what speeds they will provide).

I’m probably not alone.  The same survey reveals that 50 percent of Americans are satisfied with their broadband speeds, and another 41 percent are “somewhat satisfied.” So, 91 percent of consumers are more or less satisfied, even though 80 percent don’t know their speeds.

It would have been quite useful and instructive if the FCC survey had included an additional question: “Is your broadband speed adequate for the Internet applications you want to use?” And then cross-tabulate the responses with the responses on knowledge of broadband speed. Wanna bet that a substantial majority of people who do not know their speed would also have said that it is adequate?

Surely there are some broadband customers who use applications that require specific (fast) speeds, and these customers have a greater need to know what speed they’re receiving. That’s why providers tell prospective customers what speed tiers they offer. And that’s why one can find multiple web-based speed tests. This information is not hard to find if you want it.

But for some of us, it just ain’t worth it. And shame on anyone who tries to use my willful ignorance as an excuse for some new policy initiative. Rational ignorance is bliss, and I’m a bliss-ter.

The grandly-named Public Domain Archive, evidently a production of Osaka-based Digirock, Inc., offers a few MP3s of classical music and historical speeches. Thanks to a suggestion from Tyler Cowen, I’m enjoying a 1942 recording of Beethoven’s 9th even as I type. Am I breaking the law in so doing? The copyright notice posted on the Public Domain Archive, while quite charming, hardly reassures:

To the People In japan, All files open to the public on this site are certainly lawful. But, if you do not live in Japan, You might do not have to use files. You should check the law of your country.

As proves too often true for works, like this 1942 recording, that fall under the aegis of the 1909 Copyright Act, it is not easy to figure out whether the underlying work enjoys any claim to protection under U.S. law. Perhaps, after all, it was not published with the proper formalities, here, and thus fell into the public domain.

In this case, though, it looks like we can dodge those complications. U.S. copyright law affords exclusive rights only to copying, creation of derivative works, public distribution, public performance, and public display.
See 17 USC § 106. So long as I listen to an MP3 solely via streaming, without saving a copy, it is hard to see how I’ve violated any of those rights. Perhaps Digirock, Inc. has violated U.S. law by offering me the MP3, but that is no concern of mine (and probably not much of a concern to Digirock, Inc.).

That legal scenario suggests an interesting conclusion: an offshore copyright-free zone—one set up by intellectual pirates or in a stubbornly independent country—might give U.S. residents ample, free, and legal access to all sorts of copyrighted works—even ones protected under U.S. law.

[Crossposted at Agoraphilia and The Technology Liberation Front.]

We all pay “universal service” assessments on our phone bills.  It’s even broken out separately; go look. It’s probably just a matter of time before the Federal Communications Commission proposes to slap universal service assessments on broadband service to help pay for universal service subsidies for broadband service. The national broadband plan, after all, calls for “broadening” the universal service funding base.

If the commission reclassifies broadband as a “Title II” telecommunications service, this will be virtually automatic because the Telecommunications Act of 1996 says telecommunications providers must contribute toward the FCC’s universal service fund. If the commission doesn’t reclassify broadband, it could still require contributions — just like it imposed universal service assessments on VOIP without classifying VOIP as telecommunications.

After the FCC starts using universal service funds to subsidize broadband for poor people and rural households, the logic will be seductively compelling: “Broadband receives subsidies, so it’s only fair that broadband pays into the fund.”

Forget the ensuing howls about “taxing the Internet.”  I want to talk about another aspect of this.  Would imposing universal service assessments on broadband actually further the FCC’s goals in its national broadband plan?

[Photo: an Irish setter chasing its tail, by nawtydawg.]

The FCC wants to make broadband available to all Americans, regardless of where they live. Ideally, the FCC would like us all to subscribe, regardless of our income or where we live. The problem with imposing universal service assessments on broadband is that this would increase the price, leading subscribership to be lower than it would otherwise be.

This effect might be big or it might be little. But before making a decision about imposing universal service assessments on broadband, the FCC ought to know the size of the effect and how it compares to the increase in subscribership that would result from the subsidies.

To figure out how universal service assessments might affect broadband subscribership, we need to know how responsive broadband subscription is to changes in price. Economists call this the “price elasticity of demand.” The most recent study I’ve seen — and the only one cited in the FCC’s technical paper underlying the national broadband plan — estimates that the price elasticity of broadband demand was about -0.69 in 2008. That means a 1 percent increase in price would lead to a 0.69 percent decrease in subscribership. Other, earlier studies find much higher demand elasticities. But to be conservative, let’s use -0.69.

Current universal service assessments on interstate telecommunications are about 15 percent. About 66.6 million households had broadband in 2008. A 15 percent increase in the price of broadband would reduce subscribership by about 6.9 million households (15% times -0.69 times 66.6 million).

If the FCC imposed universal service assessments on broadband, it might be able to lower the rate since it would be collecting assessments from a broader base than just telephone service. Suppose the FCC could lower the assessment to 10 percent, more in line with the historical norm.  A 10 percent increase in the price of broadband would reduce subscribership by 4.6 million households (10% times -0.69 times 66.6 million).
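The arithmetic behind those two loss figures can be sketched in a few lines; this is just the back-of-the-envelope elasticity calculation described above, not a full demand model.

```python
# Back-of-the-envelope estimate of broadband subscribers lost to a
# universal service assessment, using the figures quoted above.
ELASTICITY = -0.69   # estimated price elasticity of broadband demand (2008)
HOUSEHOLDS = 66.6e6  # U.S. households with broadband in 2008

def subscriber_loss(price_increase: float) -> float:
    """Households lost for a given fractional price increase."""
    return -ELASTICITY * price_increase * HOUSEHOLDS

print(round(subscriber_loss(0.15) / 1e6, 1))  # 15% assessment -> 6.9 (million)
print(round(subscriber_loss(0.10) / 1e6, 1))  # 10% assessment -> 4.6 (million)
```

Because the calculation is linear, any other assumed assessment rate scales proportionally.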

So we’re going to reduce broadband subscribership by 4.6-6.9 million households in order to provide subsidies to increase broadband subscribership. If the funds currently spent to subsidize phone service in rural areas were spent on broadband, that would be enough money to close the “funding gap” and make broadband available to the 7 million homes the FCC says currently are unserved or underserved.

Not all of them will subscribe, so we can’t assume these subsidies will increase subscribership by 7 million. About 65 percent of Americans currently have broadband at home. If 65 percent of unserved or underserved households choose to subscribe once broadband becomes available, that would be 4.55 million new subscribers.
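Putting the gains and losses side by side makes the "wash or worse" conclusion concrete. The 65 percent take-up rate is the assumption stated above (matching current adoption), and the loss range comes from the elasticity arithmetic earlier in the post.

```python
# Rough net effect: subscribers gained from subsidized build-out vs.
# subscribers lost to a universal service assessment.
unserved_homes = 7.0e6   # homes the FCC says are unserved or underserved
takeup_rate = 0.65       # assumed take-up, matching current 65% adoption
gained = unserved_homes * takeup_rate  # ~4.55 million new subscribers

lost_low, lost_high = 4.6e6, 6.9e6     # losses from a 10% or 15% assessment
net_best = gained - lost_low           # best case: roughly a wash
net_worst = gained - lost_high         # worst case: a clear net loss
print(round(net_best / 1e6, 2), round(net_worst / 1e6, 2))  # both negative
```

Even in the best case, the gain never exceeds the loss under these assumptions.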

In short, it looks like subjecting all broadband to universal service assessments to pay for rural broadband subsidies would either be a wash or reduce subscribership on net. Paying for universal broadband service with assessments on broadband service will give the FCC a lot to do, but it won’t advance the subscribership goals of the national broadband plan. 

There are other ways to raise the money without this perverse effect. Historically, local telephone subscription has been very insensitive to price, so one option would be for the FCC to simply impose a universal service charge per phone number instead of the current percentage fee.  (Low-income households who have “Lifeline” service or use low-cost prepaid wireless plans could be charged a lower fee without sacrificing much revenue.)

Another option would be for Congress to earmark some revenues from upcoming spectrum auctions to fund universal broadband service, and reduce the universal service assessments on our phone bills accordingly.

Reasonable people can differ on whether, or by how much, the federal government should subsidize broadband where it is not currently available. But if we’re gonna do it, there’s no sense in funding it with a mechanism that reduces broadband subscription elsewhere.

Back on St. Paddy’s Day, I offered a few comments on the “funding gap” identified in the FCC’s just-released national broadband plan. Since then, the FCC has put out a notice of proposed rulemaking and notice of inquiry seeking public comment on reforms that would allow its universal service fund to subsidize broadband. The FCC has also released a 137-page technical paper that details how the staff calculated the broadband “availability gap” and funding gap.

So, now there’s more to chew on, and another round of online mastication would be timely given the open FCC proceeding.  Here are three big issues:

  1. Definition of broadband

The plan announced a goal of making broadband with actual download speeds of 4 mbps available to all Americans.  In the plan, this goal appeared to be based on the actual average speed of broadband service (4 mbps), even though the median speed is just 3.1 mbps (p. 21). The technical paper, however, also projects that, based on past growth rates in broadband speed, “the median will likely be higher than 4 mbps by the end of 2010.” (p. 43)  Contrary to what I thought back in March, it appears the FCC is justifying the 4 mbps goal based on the median speed, not the average. 

The technical report also argues that 4 mbps is necessary to run high-speed video, which a “growing portion of subscribers” (not including me) apparently use. (p. 43) So, if the broadband plan achieves its goals, every American will have the opportunity to subscribe to Internet access capable of delivering high-quality porn! Fortunately, the technical report uses a different and more productive example — streamed classroom lectures.

Reasonable people could still question whether the median is the appropriate benchmark to guide government actions intended to equalize broadband access opportunities.  The technical report includes a helpful graphic that shows the most common broadband speed users actually buy is 2 mbps, and 38 percent of all subscribers have speeds of 2 mbps or less. (p. 43) The FCC staff’s model calculates that if the goal were set at 1.5 mbps, the number of “unserved” households would fall from 7 million to 6.3 million, and the required subsidy would fall from $18.6 billion to $15.3 billion. (p. 45) 

If almost half of broadband subscribers have decided that something less than 4 mbps is perfectly adequate, that suggests 4 mbps may go far beyond what is necessary to ensure that all Americans have access to basic broadband service. So, that 4 mbps goal is still questionable.

  2. Omission of 3G wireless

The 4 mbps goal allowed the FCC to ignore third generation wireless when it estimated the “availability gap.” The technical paper shows that 95 percent of households have 4 mbps broadband available. About 3 percent of households have no broadband available, while 2 percent have broadband available at speeds ranging from 384 kbps – 3 mbps. (p. 17)  That 2 percent probably includes households with slow DSL and 3G wireless.

The technical paper also revealed that it did not include service from fixed Wireless Internet Service Providers because the necessary data were not available. (p. 25) These providers serve 2 million subscribers in rural areas (p. 66), so the omission potentially accounts for a large chunk of the households considered “unserved.” No telling how many, since apparently the data aren’t available.

Back in March, I guesstimated that the 7 million household “availability gap” might overstate the size of the problem by more than half, simply because 3G wireless is available to 98 percent of American households. Looks like my guesstimate is pretty much in line with the more detailed figures in the FCC technical paper.

  3. Role of satellite

The broadband plan did not count satellite broadband when assessing availability. The technical paper (pp. 89-94) provides a much more detailed explanation of the capacity constraints the FCC staff believes will prevent satellite broadband from serving more than a couple million subscribers. (The current satellite subscriber base is approximately 900,000.)

The technical paper pointed out that satellites are expensive and take three years to build. (p. 92) To put the time frame in perspective, that’s about as long as the FCC and the Federal-State Joint Board on Universal Service have been discussing universal service subsidies for broadband. Lord knows we shouldn’t make consumers wait that long!

There is, however, something a little asymmetrical about the way the FCC staff treated satellite and other forms of broadband. The point of estimating the broadband availability gap was to determine how much of a subsidy would be required to induce the private sector to build the infrastructure to close the gap. But while the study assumed that the subsidies would call forth the requisite cable, DSL, and wireless infrastructure within some unspecified but acceptable time frame, it decided that three years is just too long to wait for satellite infrastructure to expand. So, satellite plays a minimal role in the FCC’s plan.

Yet even this minimal role has a big impact. To its credit, the technical paper calculated how satellite broadband could dramatically slash the cost of serving the most expensive 250,000 homes. It estimated (pp. 91-92) that the net present value of subsidies required to serve these homes with satellite would range between $800 million and $2 billion — compared to a $13.4 billion subsidy required to serve these homes with terrestrial broadband. (This implies an annual subsidy of $105-255 million, which is pretty close to my March 17 guesstimate of $100-200 million.)
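Converting an NPV into an annual subsidy is just the standard annuity formula. The technical paper doesn't state the discount rate or horizon behind its annualization, so the 7 percent rate and 10-year horizon below are illustrative assumptions of mine, not the FCC's; the mechanics, though, are the same regardless of the parameters chosen.

```python
# Level annual payment whose present value over n years at discount
# rate r equals a given NPV. The FCC paper does not state its r or n;
# the values used here are illustrative assumptions only.

def annual_equivalent(npv: float, r: float, n: int) -> float:
    return npv * r / (1 - (1 + r) ** -n)

# Satellite-subsidy NPV range from the technical paper: $0.8-2.0 billion.
for npv in (0.8e9, 2.0e9):
    annual = annual_equivalent(npv, r=0.07, n=10)  # assumed 7% over 10 years
    print(f"${annual / 1e6:.0f} million per year")
```

A higher discount rate or shorter horizon raises the annual equivalent, and vice versa, which is why the implied annual range depends on assumptions the paper doesn't disclose.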

So, satellite broadband could help prevent costs from skyrocketing, even assuming it plays only the limited role envisioned in the FCC staff’s analysis.

The UK’s Daily Mail reports that Phil Bissett, a 62-year-old former gravedigger, transformed a steel casket into a street-legal single-seat automobile that does 100 mph, using the engine from his daughter’s 1972 VW. He acquired the casket — you guessed it — on eBay.

[Photo: Phil Bissett has dubbed his crazy new creation ‘Holy Smoke’.]

Now here’s where it gets interesting. The casket originally cost 1500 British pounds. He got it for just 98 pounds — about $146 at today’s exchange rate.  That’s 93 percent off!  The article doesn’t say how much he paid for the assorted spare parts from other vehicles needed to turn the casket into an automobile, nor does it explain what his daughter is doing for transportation now that the engine from her car powers his deathmobile.  Still, it’s a nice-looking little sports car, and I’ll bet it cost less and is more reliable than that fine piece of British automotive engineering I used to own, an MG Midget.

Bissett told the reporter, “I’ve learned never to go on the internet when you’ve had a drink. My friend said I’d never be able to turn it into a car but I knew I could.”

This must be what the wonks mean when they say the Internet is an “enabling technology.”

(Be sure to check out the Daily Mail link above to see the cool photos!)