Larry Downes – Technology Liberation Front
https://techliberation.com
Keeping politicians’ hands off the Net & everything else related to technology

Trust (but verify) the engineers – comments on Transatlantic digital trade
https://techliberation.com/2014/09/28/trust-but-verify-the-engineers-comments-on-transatlantic-digital-trade/
Sun, 28 Sep 2014 18:29:33 +0000

Last week, I participated in a program co-sponsored by the Progressive Policy Institute, the Lisbon Council, and the Georgetown Center for Business and Public Policy on “Growing the Transatlantic Digital Economy.”

The complete program, including keynote remarks from EU VP Neelie Kroes and U.S. Under Secretary of State Catherine A. Novelli, is available below.

My remarks reviewed worrying signs of old-style interventionist trade practices creeping into the digital economy in new guises, and urged traditional governments to stay the course (or correct it) on leaving the Internet ecosystem largely to its own organic forms of regulation and market correctives:

Vice President Kroes’s comments underscore an important reality about innovation and regulation.  Innovation, thanks to exponential technological trends including Moore’s Law and Metcalfe’s Law, gets faster and more disruptive all the time, a phenomenon my co-author and I have dubbed “Big Bang Disruption.”  Regulation, on the other hand, happens at the same pace it always has (at best).  Even the most well-intentioned regulators, and I certainly include Vice President Kroes in that list, find in retrospect that interventions aimed at heading off possible competitive problems and potential consumer harms rarely achieve their objectives and, indeed, generate more harmful unintended consequences.  This is not a failure of government.  The clock speeds of innovation and regulation are simply different, and diverging faster all the time.

The Internet economy has been governed from its inception by the engineering-driven multistakeholder process embodied in the task forces and standards groups that operate under the umbrella of the Internet Society.  Innovation, for better or for worse, is regulated more by Moore’s Law than by traditional law.  I happen to think the answer is “for better,” but I am not one of those who take that view to the extreme of arguing that there is no place for traditional governments in the digital economy.  Governments have played and continue to play an essential part in laying the legal foundations for the remarkable growth of that economy and in providing incentives, if not funding, for basic research that might not otherwise find investors.  And when genuine market failures appear, traditional regulators can and should step in to correct them as efficiently and narrowly as they can.  Sometimes this has happened.  Sometimes it has not.

Where in particular I think regulatory intervention is least effective and most dangerous is in regulating ahead of problems—in enacting what the FCC calls “prophylactic rules.”  The effort to create legally sound Open Internet regulations in the U.S. has faltered repeatedly, yet in the interim investment in both infrastructure and applications continues at a rapid pace—far outstripping the rest of the world.  The results speak for themselves.  U.S. companies dominate the digital economy, and, as Prof. Christopher Yoo has definitively demonstrated, U.S. consumers overall enjoy the best wired and mobile infrastructure in the world at competitive prices.

At the same time, those who continue to pursue interventionist regulation in this area often have hidden agendas.  Let me give three examples:

1.  As we saw earlier this month at the Internet Governance Forum, which I attended along with Vice President Kroes and 2,500 other delegates, representatives of the developing world were told by so-called consumer advocates from the U.S. and the EU that they must reject “zero rated” services, in which mobile network operators partner with service providers including Facebook, Twitter, and Wikimedia to offer their popular services to new Internet users without the usage counting against data charges.  Zero rating is an extremely popular tool for helping the two-thirds of the world’s population not currently on the Internet get connected and, likely, graduate from these services to many others.  But such services violate the “principle” of neutrality that has mutated from an engineering concept into a nearly religious conviction.  And so zero rating must be sacrificed, along with users who are too poor to join the digital economy any other way.

2.  Closer to home, we see the wildly successful Netflix service making a play to hijack the Open Internet debate into one about back-end interconnection, peering, and transit—engineering features that work so well that 99% of the interconnection agreements between networks, according to the OECD, aren’t even written down.

3.  And in Europe, there are other efforts to turn the neutrality principle on its head, using it as a hammer not to regulate ISPs but to slow the progress of leading content and service providers, including Apple, Amazon, and Google, who have what the French Digital Council and others refer to as non-neutral “platform monopolies” that must be broken up.

To me, these are in fact new faces on very old strategies: colonialism, rent-seeking, and protectionist trade warfare, respectively.  My hope is that Internet users—an increasingly powerful and independent source of regulatory discipline in the Internet economy—will see these efforts for what they truly are…and reject them resoundingly.

The more we trust (but also verify) the engineers, the faster the Internet economy will grow, both in the U.S. and in Europe, and the more our trade in digital goods and services will strengthen the ties between our traditional economies.  It’s worked brilliantly for almost two decades.  The alternatives, not so much.

Net Neutrality Returns – As Farce
https://techliberation.com/2013/09/11/net-neutrality-returns-as-farce/
Wed, 11 Sep 2013 17:20:16 +0000

Over on Forbes today, I have a very long post inspired by Monday’s oral arguments in Verizon’s challenge of the FCC’s Open Internet rules, passed in 2010.

I say “inspired” because the post has nothing to say about the oral arguments, which, in any case, I did not attend.  Mainstream journalists can’t resist the temptation to try to read into the questions asked or the mood of the judges some indication of how the decision will come out.

But as anyone who has ever worked in a court or followed appellate practice well knows, the tone of oral arguments signals nothing about a judge’s point of view.  Often, the harshest questioning is reserved for the side a judge is leaning towards supporting, perhaps because the briefs filed were inadequate.  Bad briefs create more work for the judge and her clerks.

I use the occasion of the hearing to take a fresh look at the net neutrality “debate,” which has been on-going since at least 2005, when I first started paying attention to it.  In particular, I try to disentangle the political term “net neutrality” (undefined and, indeed, not even used in the 2010 Open Internet order) from the engineering principles of packet routing.

According to advocates for government regulation of broadband access, the political argument for net neutrality regulation is simply a codification of the Internet’s design.  But regardless of whether it would even make sense to transform the FCC into the governing body of engineering protocols for the network (the Internet Society and its engineering task forces have always done a fine job, thanks very much), the reality is that the political argument has almost nothing to do with the underlying engineering.

Indeed, those most strongly advocating for more government regulation either don’t understand the engineering or intentionally mischaracterize it, or both.  That’s clear from the wide range of supposed competitive problems that have been lumped together under the banner of “net neutrality” issues over the years–almost none of which have anything to do with packet routing.

Fortunately, very little of the larger political agenda of the loose coalition of net neutrality advocates is reflected in the rules ultimately passed by a bare majority of the FCC in 2010.  Even so, those rules, limited as they were, face many challenges.

For one thing, the FCC, despite over a year of dedicated attention to the problem, could identify only four incidents that suggested any kind of market failure, only one of which (the Comcast-BitTorrent incident) was ever actually considered in detail by the Commission.  (Two of the others never even rose to the level of a complaint.)  The agency was left to regulate on the basis of “preserving” the Open Internet through what it called (nearly a dozen times) “prophylactic” rules.

Second, and of particular interest in the D.C. Circuit proceeding, Congress has never authorized the FCC to issue rules dealing with broadband Internet access.  Though many authorizing bills have circulated over the years, none has ever made it out of committee.  With no legal basis to regulate, the agency was left pointing to irrelevant provisions of the existing Communications Act–most of which the same court had already rejected in the Comcast case.  Nothing in the law has changed since Comcast, and on that basis, regardless of the merits of Internet regulation, the FCC is very likely to lose.  Which the Commission surely knew in passing the rules in 2010.

The piece ends by describing, as I did in my testimony before the House Judiciary Committee in early 2011, how the Report and Order betrays the technical reality that, from an engineering standpoint, even the supposed neutrality of packet routing is largely a sentimental myth.  The FCC identified and exempted a dozen network management technologies, practices, and protocols that it acknowledged do not follow the neutrality principle, but which are essential to effective and efficient management of the network.  There is no “neutral” Internet to preserve, and never was.

The agency was right to exempt these practices.  But the problem with the rules as written is that they could not and did not extend to future innovations that new applications and new users will certainly make as essential as today’s management techniques.

If the rules stand, network engineers, application developers, device makers and others in the vibrant, dynamic Internet ecosystem will be forced to seek permission to innovate from the FCC, which will both slow the high-speed world of Internet design to a crawl and introduce a decision maker with no technical expertise and lots of political baggage.

That of course was the kind of counter-productive and unnecessary regulatory intrusion that Internet users successfully rose up against last year when the UN’s International Telecommunication Union threatened to assert itself in basic Internet governance, or the year before that when Congress, without technical understanding of the most basic variety, tried to re-architect the Internet on behalf of media companies in the failed SOPA and PIPA legislation.

If the FCC gains a foothold in broadband access with the Open Internet rules or other efforts to gain oversight where Congress has delegated none, expect a similar reaction.  Or, in any case, hope for one.


The Media’s Sound and Fury Over NSA Surveillance
https://techliberation.com/2013/06/10/the-medias-sound-and-fury-over-nsa-surveillance/
Mon, 10 Jun 2013 13:35:59 +0000

***Cross-posted from Forbes.com***

It was, to paraphrase Yogi Berra, déjà vu all over again.  Fielding calls last week from journalists about reports the NSA had been engaged in massive and secret data mining of phone records and Internet traffic, I couldn’t help but wonder why anyone was surprised by the so-called revelations.

Not only had the surveillance been going on for years, the activity had been reported all along—at least outside the mainstream media.  The programs involved have been the subject of longstanding concern and vocal criticism by advocacy groups on both the right and the left.

For those of us who had been following the story for a decade, this was no “bombshell.”  No “leak” was required.  There was no need for an “exposé” of what had long since been exposed.

As the Cato Institute’s Julian Sanchez and others reminded us, the NSA’s surveillance activities, and many of the details breathlessly reported last week, weren’t even secret.  They come up regularly in Congress, during hearings, for example, about renewal of the USA Patriot Act and the Foreign Intelligence Surveillance Act, the principal laws that govern the activity.

In those hearings, civil libertarians (Republicans and Democrats) show up to complain about the scope of the law and its secret enforcement, and are shot down as being soft on terrorism.  The laws are renewed and even extended, and the story goes back to sleep.

But for whatever reason, the mainstream media, like the corrupt Captain Renault in “Casablanca,” collectively found itself last week “shocked, shocked” to discover widespread, warrantless electronic surveillance by the U.S. government.  Surveillance they’ve known about for years.

Let me be clear.  As one of the long-standing critics of these programs, and especially their lack of oversight and transparency, I have no objection to renewed interest in the story, even if the drama with which it is being reported smells more than a little sensational with a healthy whiff of opportunism.

In a week in which the media did little to distinguish itself, for example, The Washington Post stood out, and not in a good way.  As Ed Bott detailed in a withering post for ZDNet on Saturday, the Post substantially revised its most incendiary article, a Thursday piece that originally claimed nine major technology companies had provided direct access to their servers as part of the Prism program.

That “scoop” generated more froth than the original “revelation” that Verizon had been complying with government demands for customer call records.

Except that the Post’s sole source for its claims turned out to be a PowerPoint presentation of “dubious provenance.”  A day later, the editors had removed the most thrilling but unsubstantiated revelations about Prism from the article.  Yet in an unfortunate and baffling Orwellian twist, the paper made absolutely no mention of the “correction.”  As Bott points out, that violated not only common journalistic practice but the paper’s own revision and correction policy.

All this and much more, however, would have been in the service of a good cause–if, that is, it led to an actual debate about electronic surveillance, one we’ve needed for over a decade.

Unfortunately, it won’t.  The mainstream media will move on to the next story soon enough, whether some natural or man-made disaster.

And outside the Fourth Estate, few people will care or even notice when the scandal dies.  However they feel this week, most Americans simply aren’t informed or bothered enough about wholesale electronic surveillance to force any real accountability, let alone reform.  Those who are up in arms today might ask themselves where they were for the last decade or so, and whether their righteous indignation now is anything more than just that.

As Politico’s James Hohmann noted on Saturday, “Government snooping gets civil libertarians from both parties exercised, but this week’s revelations are likely to elicit a collective yawn from voters if past polling is any sign.”

Why so pessimistic?  I looked over what I’ve written on this topic in the past, and found the following essay, written in 2008, which appeared in slightly different form in my 2009 book, “The Laws of Disruption.”   It puts the NSA’s programs in historical context, and tries to present both the costs and benefits of how they’ve been implemented.  It points out why at least some aspects of these government activities are likely illegal, and what should be done to rein them in.

What I describe is just as scandalous as anything that came out last week, if not more so.

Yet I present it below with the sad realization that if I were writing it today–five years later–I wouldn’t need to change a single word.  Except maybe the last sentence.  And then, just maybe.

Searching Bits, Seizing Information

U.S. citizens are protected from unreasonable search and seizure of their property by their government.  In the Constitution, that right is enshrined in the Fourth Amendment, which was enacted in response to warrantless searches by British agents in the run-up to the Revolutionary War. Over the past century, the Supreme Court has increasingly seen the Fourth Amendment as a source of protection for personal space—the right to a “zone of privacy” that governments can invade only with probable cause that evidence of a crime will be revealed.

Under U.S. law, Americans have little in the way of protection of their privacy from businesses or from each other. The Fourth Amendment is an exception, albeit one that applies only to government.

But digital life has introduced new and thorny problems for Fourth Amendment law. Since the early part of the twentieth century, courts have struggled to extend the “zone of privacy” to intangible interests—a right to privacy, in other words, in one’s information. But to “search” and “seize” implies real-world actions. People and places can be searched; property can be seized.

Information, on the other hand, need not take physical form, and can be reproduced infinitely without damaging the original. Since copies of data may exist, however temporarily, on thousands of random computers, in what sense do netizens have “property” rights to their information? Does intercepting data constitute a search or a seizure or neither?

The law of electronic surveillance avoids these abstract questions by focusing instead on a suspect’s expectations. Courts reviewing challenged investigations ask simply if the suspect believed the information acquired by the government was private data and whether his expectation of privacy was reasonable.

It is not the actual search and seizure that the Fourth Amendment forbids, after all, but unreasonable search and seizure. So the legal analysis asks what, under the circumstances, is reasonable. If you are holding a loud conversation in a public place, it isn’t reasonable for you to expect privacy, and the police can take advantage of whatever information they overhear. Most people assume, on the other hand, that data files stored on the hard drive of a home computer are private and cannot be copied without a warrant.

One problem with the “reasonable expectation” test is that as technology changes, so do user expectations. The faster the Law of Disruption accelerates, the more difficult it is for courts to keep pace. Once private telephones became common, for example, the Supreme Court required law enforcement agencies to follow special procedures for the search and seizure of conversations—that is, for wiretaps. Congress passed the first wiretap law, known as Title III, in 1968. As information technology has revolutionized communications and as user expectations have evolved, the courts and Congress have been forced to revise Title III repeatedly to keep it up to date.

In 1986, the Electronic Communications Privacy Act amended Title III to include new protection for electronic communications, including e-mail and communications over cellular and other wireless technologies. A model of reasonable lawmaking, the ECPA ensured these new forms of communication were generally protected while closing a loophole for criminals who were using them to evade the police. (By 2005, 92 percent of wiretaps targeted cell phones.)

As telephone service providers multiplied and networks moved from analog to digital, a 1994 revision required carriers to build in special access for investigators to get around new features such as call forwarding. Once a Title III warrant is issued, law enforcement agents can now simply log in to the suspect’s network provider and receive real-time streams of network traffic.

Since 1968, Title III has maintained an uneasy truce between the rights of citizens to keep their communications private and the ability of law enforcement to maintain technological parity with criminals. As the digital age progresses, this balance is harder to maintain. With each cycle of Moore’s Law, criminals discover new ways to use digital technology to improve the efficiency and secrecy of their operations, including encryption, anonymous e-mail remailers, and private telephone networks. During the 2008 terrorist attacks in Mumbai, for example, co-conspirators used television reports of police activity to keep the gunmen at various sites informed, using Internet telephones that were hard to trace.

As criminals adopt new technologies, law enforcement agencies predictably call for new surveillance powers. China alone employs more than 30,000 “Internet police” to monitor online traffic, part of an apparatus sometimes known as the “Great Firewall of China.” The government apparently intercepts all Chinese-bound text messages and scans them for restricted words including “democracy,” “earthquake,” and “milk powder.”

The words are removed from the messages, and a copy of the original along with identifying information is stored on the government’s system. When Canadian human rights activists recently hacked into Chinese government networks, they discovered a cluster of message-logging computers that had recorded more than a million censored messages.
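To make the intercept-scan-strip-log pipeline described above concrete, here is a minimal sketch in Python. The word list, record fields, and log format are hypothetical illustrations of the general technique, not details of any actual system.

```python
# Minimal sketch of a keyword-filtering intercept, as described above.
# The restricted-word list, record fields, and log format are hypothetical.
import datetime
import json
import re

RESTRICTED = {"democracy", "earthquake", "milk powder"}

def filter_message(sender_id: str, text: str, log_path: str = "intercepts.log") -> str:
    """Strip restricted words from a message, logging the original if any match."""
    hits = [w for w in RESTRICTED if w in text.lower()]
    if hits:
        # Store a copy of the original message along with identifying information.
        record = {
            "sender": sender_id,
            "original": text,
            "matched": hits,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        }
        with open(log_path, "a") as log:
            log.write(json.dumps(record) + "\n")
        # Deliver the message with the restricted words removed.
        for w in hits:
            # Naive case-insensitive removal; a real filter would be subtler.
            text = re.sub(re.escape(w), "", text, flags=re.IGNORECASE)
    return text
```

Run against a message containing “earthquake,” the sketch delivers the text with the word stripped while the original, sender and all, lands in the log: the same pattern the activists found on the message-logging computers.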

Netizens, increasingly fearful that the arms race between law enforcement and criminals will claim their privacy rights as unintended victims, are caught in the middle. Those fears became palpable after the September 11, 2001, terrorist attacks and those that followed in Indonesia, London, and Madrid.  The world is now engaged in a war with no measurable objectives for winning, fought against an anonymous and technologically savvy enemy who recruits, trains, and plans assaults largely through international communication networks. Security and surveillance of all varieties are now global priorities, eroding privacy interests significantly.

The emphasis on security over privacy is likely to be felt for decades to come. Some of the loss has already been felt in the real world. To protect ourselves from future attacks, everyone can now expect more invasive surveillance of their activities, whether through massive networks of closed-circuit TV cameras in large cities or increased screening of people and luggage during air travel.

The erosion of privacy is even more severe online. Intelligence is seen as the most effective weapon in a war against terrorists. With or without authorization, law enforcement agencies around the world have been monitoring large quantities of the world’s Internet data traffic. Title III has been extended to private networks and Internet phone companies, which must now insert government access points into their networks. (The FCC has proposed adding other providers of phone service, including universities and large corporations.)

Because of difficulties in isolating electronic communications associated with a single IP address, investigators now demand the complete traffic of large segments of addresses, that is, of many users. Data mining technology is applied after the fact to search the intercepted information for the relevant evidence.

Passed soon after 9/11, the USA Patriot Act went much further. The Patriot Act abandoned many of the hard-fought controls on electronic surveillance built into Title III. New “enhanced surveillance procedures” allow any judge to authorize electronic surveillance and lower the standard for warrants to seize voice mails.

The FBI was given the power to conduct wiretaps without warrants and to issue so-called national security letters to gag network operators from revealing their forced cooperation. Under a 2006 extension, FBI officials were given the power to issue NSLs that silenced the recipient forever, backed up with a penalty of up to five years in prison.

Gone is even a hint of the Supreme Court’s long-standing admonitions that search and seizure of information should be the investigatory tool of last resort.

Despite the relaxed rules, or perhaps inspired by them, the FBI acknowledged in 2007 that it had violated Title III and the Patriot Act repeatedly, illegally searching the telephone, Internet, and financial records of an unknown number of Americans. A Justice Department investigation found that from 2002 to 2005 the bureau had issued nearly 150,000 NSLs, a number the bureau had grossly under-reported to Congress.

Many of these letters violated even the relaxed requirements of the Patriot Act. The FBI habitually requested not only a suspect’s data but also those of people with whom he maintained regular contact—his “community of interest,” as the agency called it. “How could this happen?” FBI director Robert Mueller asked himself at the 2007 Senate hearings on the report. Mueller didn’t offer an answer.

Ultimately, a federal judge declared the FBI’s use of NSLs unconstitutional on free-speech grounds, a decision that is still on appeal.

The National Security Agency, which gathers foreign intelligence, undertook an even more disturbing expansion of its electronic surveillance powers.

Since the Constitution applies only within the U.S., foreign intelligence agencies are not required to operate within the limits of Title III. Instead, their information-gathering practices are held to a much more relaxed standard specified in the Foreign Intelligence Surveillance Act. FISA allows warrantless wiretaps when intercepted communications do not include a U.S. citizen and are not conducted through U.S. networks. (The latter restriction was removed in 2008.)

Even these minimal requirements proved too restrictive for the agency. Concerned that U.S. operatives were organizing terrorist attacks electronically with overseas collaborators, President Bush authorized the NSA to bypass FISA and conduct warrantless electronic surveillance at will as long as one of the parties to the information exchange was believed to be outside the United States.

Some of the president’s staunchest allies found the NSA’s plan, dubbed the Terrorist Surveillance Program, of dubious legality. Just before the program became public in 2005, senior officials in the Justice Department refused to reauthorize it.

In a bizarre real-world game of cloak-and-dagger, presidential aides, including future attorney general Alberto Gonzales, rushed to the hospital room of then-attorney general John Ashcroft, who was seriously ill, in hopes of getting him to overrule his staff. Justice Department officials got wind of the end run and managed to get to Ashcroft first. Ashcroft, who was barely able to speak from painkillers, sided with his staff.

Many top officials, including Ashcroft and FBI director Mueller, threatened to resign over the incident. President Bush agreed to stop bypassing the FISA procedure and seek a change in the law to allow the NSA more flexibility. Congress eventually granted his request.

The NSA’s machinations were both clumsy and dangerous. Still, I confess to having considerable sympathy for those trying to obtain actionable intelligence from online activity. Post-9/11 assessments revealed embarrassing holes in the technological capabilities of most intelligence agencies worldwide. (Admittedly, they also revealed repeated failures to act on intelligence that had already been collected.) Initially at least, the public demanded tougher measures to avoid future attacks.

Keeping pace with international terror organizations and still following national laws, however, is increasingly difficult. For one thing, communications of all kinds are quickly migrating to the cheaper and more open architecture of the Internet. An unintended consequence of this change is that the nationalities of those involved in intercepted communications are increasingly difficult to determine.

E-mail addresses and instant-message IDs don’t tell you the citizenship or even the location of the sender or receiver. Even telephone numbers don’t necessarily reveal a physical location. Internet telephone services such as Skype give their customers U.S. phone numbers regardless of their actual location. Without knowing the nationality of a suspect, it is hard to know what rights she is entitled to.

The architecture of the Internet raises even more obstacles against effective surveillance. Traditional telephone calls take place over a dedicated circuit connecting the caller and the person being called, making wiretaps relatively easy to establish. Only the cooperation of the suspect’s local exchange is required.

The Internet, however, operates as a single global exchange. E-mails, voice, video, and data files—whatever is being sent is broken into small packets of data. Each packet follows its own path between connected computers, largely determined by data traffic patterns present at the time of the communication.

Data may travel around the world even if its destination is local, crossing dozens of national borders along the way. It is only on the receiving end that the packets are reassembled.
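By way of illustration, here is a minimal sketch in Python of that split-route-reassemble idea, assuming a toy network: the eight-character packet size, the shuffling as a stand-in for independent routing, and the simple sequence numbers are all illustrative simplifications, not how IP and TCP actually work.

```python
# Toy illustration of packet-switched delivery: a message is split into
# numbered packets, each of which may take its own path (simulated here by
# shuffling arrival order), and is reassembled only at the destination.
import random

def send(message: str, packet_size: int = 8) -> list[tuple[int, str]]:
    """Break a message into sequence-numbered packets."""
    chunks = [message[i:i + packet_size] for i in range(0, len(message), packet_size)]
    return list(enumerate(chunks))

def network(packets: list[tuple[int, str]]) -> list[tuple[int, str]]:
    """Each packet follows its own path, so arrival order is unpredictable."""
    arrived = packets[:]
    random.shuffle(arrived)
    return arrived

def receive(packets: list[tuple[int, str]]) -> str:
    """The receiving end reassembles the packets by sequence number."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "E-mails, voice, video, and data files"
assert receive(network(send(message))) == message
```

The point of the toy is the final assertion: only the destination ever sees the packets in order, which is why the design both improves efficiency and frustrates anyone intercepting traffic mid-route.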

This design, the genius of the Internet, improves network efficiency. It also provides a significant advantage to anyone trying to hide his activities. On the other hand, NSLs and warrantless wiretapping on the scale apparently conducted by the NSA move us frighteningly close to the “general warrant” American colonists rejected in the Fourth Amendment. They were right to revolt over the unchecked power of an executive to do what it wants, whether in the name of orderly government, tax collection, or antiterrorism.

In trying to protect its citizens against future terror attacks, the secret operations of the U.S. government abandoned core principles of the Constitution. Even with the best intentions, governments that operate in secrecy and without judicial oversight quickly descend into totalitarianism. Only the intervention of corporate whistle-blowers, conscientious government officials, courts, and a free press brought the United States back from the brink of a different kind of terrorism.

Internet businesses may be entirely supportive of government efforts to improve the technology of policing. A society governed by laws is efficient, and efficiency is good for business. At the same time, no one is immune from the pressures of anxious customers who worry that the information they provide will be quietly delivered to whichever regulator asks for it. Secret surveillance raises the level of customer paranoia, leading rational businesses to avoid countries whose practices are not transparent.

Partly in response to the NSA program, companies and network operators are increasingly routing information flow around U.S. networks, fearing that even transient communications might be subject to large-scale collection and mining operations by law enforcement agencies. But aside from using private networks and storing data offshore, routing transmissions to avoid some locations is as hard to do as forcing them through a particular network or node.

The real guarantor of privacy in our digital lives may not be the rule of law. The Fourth Amendment and its counterparts work in the physical world, after all, because tangible property cannot be searched and seized in secret. Information, however, can be intercepted and copied without anyone knowing it. You may never know when or by whom your privacy has been invaded. That is what makes electronic surveillance more dangerous than traditional investigations, as the Supreme Court realized as early as 1967.

In the uneasy balance between the right to privacy and the needs of law enforcement, the scales are increasingly held by the Law of Disruption. More devices, more users, more computing power: the sheer volume of information and the rapid evolution of how it can be exchanged have created an ocean of data. Much of it can be captured, deciphered, and analyzed only with great (that is, expensive) effort. Moore’s Law lowers the costs to communicate, raising the costs for governments interested in the content of those communications.

The kind of electronic surveillance performed by the Chinese government is outrageous in its scope, but only the clumsiness of its technical implementation exposed it. Even if governments want to know everything that happens in our digital lives, and even if the law allows them or is currently powerless to stop them, there isn’t enough technology at their disposal to do it, or at least to do it secretly.

So far.


CFAA and Prosecutorial Indiscretion
https://techliberation.com/2013/04/05/cfaa-and-prosecutorial-indiscretion/
Fri, 05 Apr 2013 20:32:50 +0000

With renewed interest in the failings of the Computer Fraud and Abuse Act and the role of prosecutorial discretion in its application in light of the tragic outcome in the Aaron Swartz case, I went back to what I wrote about the law in 2009.

Back then, the victim of both the poorly-drafted amendments to the CFAA that expanded its scope from government to private computer networks and the politically-motivated zeal of federal prosecutors reaching for something—anything—with which to punish otherwise legal but disfavored behavior was Lori Drew, a far less sympathetic defendant.

But the dangers lurking in the CFAA were just as visible in 2009 as they are today.  Those who have recently picked up the banner calling for reform of the law might ask themselves where they were back then, and why the ultimately unsuccessful Drew prosecution didn’t raise their hackles at the time.

The law was just as bad in 2009, and just as dangerously twisted by the government.  Indeed, the Drew case, as I wrote at the time, gave all the notice anyone needed of what was to come later.

Here’s the section of The Laws of Disruption from 2009 discussing CFAA:

What did Lori Drew do?

The suburban St. Louis mother, then in her late forties, was apparently unhappy about the “mean” behavior of Megan Meier, a thirteen-year-old former friend of Drew’s daughter Sarah. The Drews, along with Ashley Grills, the eighteen-year-old employee of Lori Drew’s home business, hatched a plan. They created a fake MySpace profile for a bare-chested sixteen-year-old boy named “Josh,” who would befriend Megan and encourage her to gossip about other girls. Then they would take printouts to Megan’s mother to show her what the girl was up to.

Not only was the idea stupid, it wasn’t even original—Sarah and Megan, back when they were friends, had done the same thing, creating a profile for a boy who didn’t exist as a way to talk to other boys. This time, however, the plan went awry. Megan became deeply infatuated with Josh. She pressed for his phone number. She wanted to meet him in person. The women behind his account looked for a way out.

According to Grills, “We decided to be mean to her so she would leave him alone . . . and we could get rid of the page.” After deliberating on the easiest way to end an ill-conceived hoax that was going very wrong, Grills sent an instant message to Meier: “The world would be a better place without you.”

The consequences were tragic. Meier, who was being treated for depression, took the suggestion all too literally. After an argument with her parents, who had closely monitored the relationship with Josh from the beginning, Meier went to her room and hanged herself.

Media accounts of the teen’s suicide and the subsequent revelation of who was behind “Josh” created a froth of outrage and hand-wringing. Commentators invented and then proclaimed an epidemic of “cyberbullying.”

When it became clear that the mother of one of Meier’s former friends was involved, Drew herself was subjected to death threats and vandalism. A fake MySpace page for her husband was created. On cable news and the blogosphere, Drew was instantly convicted and sentenced to hell. (“Call me vindictive,” a typical blog entry read, “but i hope that someone kills the woman who is responsible.”)

In the midst of the media storm, state attorneys in Missouri announced there would be no prosecution of Drew for the simple reason that no criminal law had been broken. Federal prosecutors weren’t so sure. They found a 1986 law, the Computer Fraud and Abuse Act, that set stiff penalties for breaking into and damaging computers.

Drew was charged under the novel theory that since the MySpace terms of service agreement prohibits posting false information in one’s profile, the creation of Josh violated Drew’s contract. Hence, she “accessed” MySpace computers without “authorization.” The creation of Josh, in other words, was a kind of hacking. The victim was not Meier (who with her parents’ permission had also violated the TOS, which requires users to be at least fourteen years old). The victim was MySpace.

Although the jury ultimately refused to convict Drew on the felony charge, they did convict her of the lesser crime of unauthorized access. Valentina Kunasz, the jury’s foreperson, made no apologies for the conviction. “It was so very childish; so very pathetic,” she told reporters after the trial. “She could have done quite a few things to stop it, and she chose not to. And I think she got kind of a rise out of doing this to another person and that bothers me, it really irks me.” Drew faces up to three years in prison and $300,000 in fines.

Legal scholars were generally in agreement that the prosecution was deeply flawed and will very likely be set aside or reversed on appeal. (N.B.  Later, it was.) First, there were gaping holes in the government’s case. For one thing, it was Grills, and not Drew, who set up the Josh account and therefore agreed to the TOS (Grills, testifying for the prosecution in exchange for immunity, admitted she never read the TOS). Drew herself was only occasionally involved in the hoax.

By a weird twist of irony, one of the few times she communicated with Meier it turned out she was talking to Meier’s mother, who told Josh he ought to be looking for friends his own age. The fateful message was sent by Grills without Drew’s knowledge, and wasn’t even sent through MySpace.

As a matter of public policy, the prosecution is even more disturbing. Even assuming Drew was bound by the TOS, these contracts are notoriously long and intentionally unreadable. Most of us, even lawyers, don’t read them.

Yet following the logic of the Drew prosecution, anyone who misrepresents some of their personal details on an online dating service has committed a federal crime. Anyone who gives a nonworking telephone number when signing up for a Web site has committed a federal crime.

Indeed, after the verdict, one social network researcher was pained to admit, “We’ve been telling our kids to lie about ID information for a long time now.”

The computer fraud law began as a protection against hackers targeting government computers. The law has never before been used in connection with the violation, willful or otherwise, of private terms of service. There’s no reason to believe Congress intended to criminalize cyberbullying in 1986 or any other time.

Supporters of the conviction argue that the real problem here was a hole in the law—the lack of a statute outlawing whatever it was Lori Drew had done.  But the decision of lawmakers not to criminalize a behavior is no reason to correct the problem in a way that undermines the very idea of law. People are often cruel to each other. Other children, adults, and even parents can and do humiliate children in the real world. No laws, in all but extreme cases, are being broken.

It’s difficult to see how this case differs in any respect other than the use of a computer and the tragic outcome.

If the conviction stands, it effectively gives every federal prosecutor a blank check to charge anyone they want with criminal behavior, subject only to their own discretion as to whether and when to use that power.

Some commentators, pleased with the result if not the process, argued that there was no cause for alarm. Prosecutors, they said, will only use this power in extreme cases.

The Drew prosecution suggests precisely the opposite. For elected prosecutors, the real temptation is to exercise discretion not when the law would otherwise let a heinous crime slip through the cracks but when passions are high and the facts (at least the version presented by the media) are the most lurid—when, in other words, an angry mob demands it.


The FCC at the Crossroads
https://techliberation.com/2013/03/14/the-fcc-at-the-crossroads/
Thu, 14 Mar 2013 14:48:11 +0000

Tuesday was a big day for the FCC.  The Senate Commerce, Science and Transportation Committee held an oversight hearing with all five Commissioners, the same day that reply comments were due on the design of eventual “incentive auctions” for over-the-air broadcast spectrum.  And the proposed merger of T-Mobile USA and MetroPCS was approved.

All this activity reflects the stark reality that the Commission stands at a crossroads.  As once-separate wired and wireless communications networks for voice, video, and data converge on the single IP standard, and as mobile users continue to demonstrate insatiable demand for bandwidth for new apps, the FCC can serve as midwife in the transition to next-generation networks.  Or, the agency can put on the blinkers and mechanically apply rules and regulations designed for a by-gone era.

FCC Chairman Julius Genachowski, for one, believes the agency is clearly on the side of the future.  In an op-ed last week in the Wall Street Journal, the Chairman took justifiable pride in the focus his agency has demonstrated in advancing America’s broadband advantage, particularly for mobile users.

Mobile broadband has clearly been a bright spot in an otherwise bleak economy.  Network providers and their investors, according to the FCC’s most recent analysis, have spent over a trillion dollars since 1996 building next-generation mobile networks, today based on 4G LTE technology.

These investments are essential for high-bandwidth smartphones and tablet devices and the remarkable ecosystem of voice, video, and data apps they have enabled.  This platform for disruptive innovation has powered a level of “creative destruction” that would do Joseph Schumpeter proud.

Mobile disruptors, however, are entirely dependent on the continued availability of new radio spectrum.  In the first five years following the 2007 introduction of the iPhone, mobile data traffic increased 20,000%.  No surprise, then, that the FCC’s 2010 National Broadband Plan conservatively estimated that mobile consumers desperately needed an additional 300 MHz of spectrum by 2015 and 500 MHz by 2020.

With nearly all usable spectrum long since allocated, the Plan acknowledged the need for creative new strategies for repurposing existing allocations to maximize the public interest.  But some current licensees, including over-the-air television broadcasters and the federal government itself, are resisting Chairman Genachowski’s efforts to keep the spectrum pipeline open and flowing.

So far, despite bold plans from the FCC for new unlicensed uses of TV “white spaces” and the passage early in 2012 of “incentive auction” legislation from Congress, almost no new spectrum has been made available for mobile consumers.  The last significant auction the agency conducted was in 2008, based on capacity freed up in the digital television transition.

The “shared” spectrum the agency has recently been touting would have to be shared with the Department of Defense and other federal agencies, which have so far stonewalled a 2010 Executive Order from President Obama to vacate their unused or underutilized allocations.  (The federal government is, by far, the largest holder of usable spectrum today, with as much as 60% of the total.)

And after over a year of on-going design, there is still no timetable for the incentive auctions.  Last week, FCC Commissioner Jessica Rosenworcel, speaking to the National Association of Broadcasters, urged her colleagues at least to pencil in some dates.  But even in the best-case scenario, it will be years before significant new spectrum comes online for mobile devices.  The statute gives the agency until 2022.

In the interim, the mobile revolution has been kept alive by creative use of secondary markets, where mobile providers have bought and sold existing licenses to optimize current allocations, and by mergers and acquisitions, which allow network operators to combine spectrum and towers to improve coverage and efficiency.  Many transactions have been approved, but others have not.  Efforts to reallocate or reassign underutilized satellite spectrum are languishing in regulatory limbo.  Local zoning bodies continue to slow or refuse permission for the installation of new equipment.  Delays are endemic.

So even as the FCC pursues its visionary long-term plan for spectrum reform, the agency must redouble efforts to encourage optimal use of existing resources.  The agency and the Department of Justice must accelerate review of secondary market transactions, and place the immediate needs of mobile users ahead of hypothetical competitive harms that have yet to emerge.

In conducting the incentive auctions, unrelated conditions and pet projects need to be kept out of the mix, and qualified bidders must not be artificially limited to advance vague policy objectives that have previously spoiled some auctions and unnecessarily depressed prices on others.

Let’s hope Congress holds Chairman Genachowski to his promise to “[keep] discussions focused on solving problems, and on facts and data….so that innovation, private investment and jobs follow.”  We badly need all three.

(A condensed version of this essay appears today in Roll Call.)


What “Big Bang Disruption” Says About Technology Policy
https://techliberation.com/2013/02/18/what-big-bang-disruption-says-about-technology-policy/
Mon, 18 Feb 2013 06:06:38 +0000

In the upcoming issue of Harvard Business Review, my colleague Paul Nunes at Accenture’s Institute for High Performance and I are publishing the first of many articles from an on-going research project on what we are calling “Big Bang Disruption.”

The project is looking at the emerging ecosystem for innovation based on disruptive technologies.  It expands on work we have done separately and now together over the last fifteen years.

Our chief finding is that the nature of innovation has changed dramatically, calling into question much of the conventional wisdom on business strategy and competition, especially in information-intensive industries–which is to say, these days, every industry.

The drivers of this new ecosystem are ever-cheaper, faster, and smaller computing devices, cloud-based virtualization, crowdsourced financing, collaborative development and marketing, and the proliferation of mobile everything.  There will soon be more smartphones sold than there are people in the world.  And before long, each of over one trillion items in commerce will be added to the network.

The result is that new innovations now enter the market cheaper, better, and more customizable than the products and services they challenge.  (For example, smartphone-based navigation apps versus standalone GPS devices.)  In the strategy literature, such innovation would be characterized as thoroughly “undisciplined.”  It shouldn’t succeed.  But it does.

So when the disruptor arrives and takes off with a bang, often after a series of low-cost, failed experiments, incumbents have no time for a competitive response.  The old rules for dealing with disruptive technologies, most famously from the work of Harvard’s Clayton Christensen, have become counter-productive.  If incumbents haven’t learned to read the new tea leaves ahead of time, it’s game over.

The HBR article doesn’t go into much depth on the policy implications of this new innovation model, but the book we are now writing will.  The answer should be obvious.

This radical new model for product and service introduction underscores the robustness of market behaviors that quickly and efficiently correct many transient examples of dominance, especially in high-tech markets.

As a general rule (though obviously not one without exceptions), the big bang phenomenon further weakens the case for regulatory intervention.  Market dominance is sustainable for ever-shorter periods of time, with little opportunity for incumbents to exploit it.

A predictable next wave of technology will likely put a quick and definitive end to any “information empires” that have formed from the last generation of technologies.

Or, at the very least, do so more quickly and more cost-effectively than alternative solutions from regulation.  The law, to paraphrase Mark Twain, will still be putting its shoes on while the big bang disruptor has spread halfway around the world.

Unfortunately, much of the contemporary literature on competition policy from legal academics is woefully ignorant of even the conventional wisdom on strategy, not to mention the engineering realities of disruptive technologies already in the market.  Looking at markets solely through the lens of legal theory is, truly, an academic exercise, one with increasingly limited real-world applications.

Indeed, we can think of many examples where legacy regulation actually makes it harder for the incumbents to adapt as quickly as necessary in order to survive the explosive arrival of a big bang disruptor.  But that is a story for another day.

Much more to come.

Related links:

  1. “Creating a ‘Politics of Abundance’ to Match Technology Innovation,” Forbes.com.
  2. “Why Best Buy is Going out of Business…Gradually,” Forbes.com.
  3. “What Makes an Idea a Meme?”, Forbes.com.
  4. “The Five Most Disruptive Technologies at CES 2013,” Forbes.com.

Toward a Technology “Watchful Waiting” Principle
https://techliberation.com/2013/01/17/toward-a-technology-watchful-waiting-principle/
Thu, 17 Jan 2013 14:55:07 +0000

When the smoke cleared and I found myself half caught-up on sleep, the information and sensory overload that was CES 2013 had ended.

There was a kind of split-personality to how I approached the event this year. Monday through Wednesday was spent in conference tracks, most of all the excellent Innovation Policy Summit put together by the Consumer Electronics Association. (Kudos again to Gary Shapiro, Michael Petricone and their team of logistics judo masters.)

The Summit has become an important annual event bringing together legislators, regulators, industry and advocates to help solidify the technology policy agenda for the coming year and, in this case, a new Congress.

I spent Thursday and Friday on the show floor, looking in particular for technologies that satisfy what I call the Law of Disruption: social, political, and economic systems change incrementally, but technology changes exponentially.

What I found, as I wrote in a long post-mortem for Forbes, is that such technologies are well-represented at CES, but are mostly found at the edges of the show–literally.

In small booths away from the mega-displays of the TV, automotive, smartphone, and computer vendors, in hospitality suites in nearby hotels, or even in sponsored and spontaneous hackathons going on around town, I found ample evidence of a new breed of innovation and innovators, whose efforts may yield nothing today or even in a year, but which could become sudden, overnight market disrupters.

Increasingly, it’s one or the other, which is saying something all by itself. For one thing, how do incumbents compete with such all-or-nothing innovations?

That, however, is a subject for another day.

For now, consider again the policy implications of such dramatic transformations. As those of us sitting in room N254 debated the finer points of software patents, IP transition, copyright reform, and the misapplication of antitrust law to fast-changing technology industries (increasingly, that means ALL industries), just a few feet away the real world was changing under our feet.

The policy conference was notably tranquil this year, without such previous hot-button topics as net neutrality, SOPA, or the lack of progress on spectrum reform to generate antagonism among the participants. But as I wrote at the conclusion of last year’s Summit, at CES, the only law that really matters is Moore’s Law. Technology gets faster, smaller, and cheaper, not just predictably but exponentially.

As a result, the contrast between what the regulators talk about and what the innovators do gets more dramatic every year, accentuating the figurative if not the literal distance between the policy Summit and the show floor. I felt as if I had moved between two worlds, one that follows a dainty 19th century wind-up clock and the other that marks time using the Pebble watch, a fully-connected new timepiece funded entirely by Kickstarter.

The lesson for policymakers is sobering, and largely ignored. Humility, caution, and a Hippocratic-like oath of first-do-no-harm are, ironically, the most useful things regulators can do if, as they repeat at shorter intervals, their true goal is to spur innovation, create jobs, and rescue American entrepreneurialism.

The new wisdom is simple, deceptively so. Don’t intervene unless and until it’s clear that there is demonstrable harm to consumers (not competitors), that there’s a remedy for the harm that doesn’t make things, if only unintentionally, worse, and that the next batch of innovations won’t solve the problem more quickly and cheaply.

Or, as they say to new interns in the Emergency Room, “Don’t just do something. Stand there.”

That’s a hard lesson to learn for those of us who think we’re actually surgical policy geniuses, only to find increasingly we’re working with blood-letting and leeches.  And no anesthesia.

In some ways, it’s the opposite of an approach that Adam Thierer calls the Technology Precautionary Principle. Instead of panicking when new technologies raise new (but likely transient) issues, first try to let Moore’s Law sort it out, until and if it becomes crystal clear that it can’t. Instead of a hasty response, opt for a delayed response. Call it the Watchful Waiting Principle.

Not as much fun as fuming, ranting, and regulating at the first sign of chaos, of course, but far more helpful.

That, in any case, is the thread of my dispatches from Vegas:

  1. “Telcos Race Toward an all-IP Future,” CNET
  2. “At CES, Companies Large and Small Bash Broken Patent System,” Forbes
  3. “FCC, Stakeholders Align on Communications Policy—For Now,” CNET
  4. “The Five Most Disruptive Technologies at CES 2013,” Forbes

Ending Transaction ‘Mission Creep’ at the FCC
https://techliberation.com/2012/12/14/ending-transaction-mission-creep-at-the-fcc/
Fri, 14 Dec 2012 16:50:23 +0000

by Larry Downes and Geoffrey A. Manne

Now that the election is over, the Federal Communications Commission is returning to the important but painfully slow business of updating its spectrum management policies for the 21st century. That includes a process the agency started in September to formalize its dangerously unstructured role in reviewing mergers and other large transactions in the communications industry.

This followed growing concern about “mission creep” at the FCC, which, in deals such as those between Comcast and NBCUniversal, AT&T and T-Mobile USA, and Verizon Wireless and SpectrumCo, has repeatedly been caught with its thumb on the scales of what is supposed to be a balance between private markets and what the Communications Act refers to as the “public interest.”

Commission reviews of private transactions are only growing more common—and more problematic. The mobile revolution is severely testing the FCC’s increasingly anachronistic approach to assigning licenses for radio frequencies in the first place, putting pressure on carriers to use mergers and other secondary market deals to obtain the bandwidth needed to satisfy exploding customer demand.

While the Department of Justice reviews these transactions under antitrust law, the FCC has the final say on the transfer of any and all spectrum licenses. Increasingly, the agency is using that limited authority to restructure communications markets, Beltway-style, elevating the appearance of increased competition over the substance of an increasingly dynamic, consumer-driven mobile market.

Given the very different speeds at which Silicon Valley and Washington operate, the expanding scope of FCC intervention is increasingly doing more harm than good.

 

Deteriorating Track Record

We’re trapped in a vicious cycle: the commission’s mismanagement of the public airwaves is creating more opportunities for the agency to insert itself into the internet ecosystem, largely to fix problems caused by the FCC in the first place. That is happening despite the fact that Congress clearly and precisely circumscribed the agency’s authority here, a key reason the internet has blossomed while heavily regulated over-the-air broadcasting and wireline telephone fade into history.

Desperate for continued relevance, the FCC can’t resist the temptation to tinker with one of the only segments of the economy that is still growing and investing. The agency, for example, fretted over Comcast’s merger with NBCUniversal for 10 months, approving it only after imposing a 30-page list of conditions, including details about which channels had to be offered in which cable packages.

Regulating-by-merger-condition has become a popular sport at the FCC, one with dangerous consequences. While it conveniently allows the agency to get around the problem of intervening where it has no authority, the result is a regulatory crazy quilt with different rules applying to different companies in different markets. Consumers, the supposed beneficiaries of this micromanagement, cannot be expected to understand the resulting chaos.

For example, Comcast also agreed to abide by an enhanced set of “net neutrality” rules even if, as appears likely, a federal appeals court throws out the FCC’s 2010 industry-wide rulemaking for exceeding the agency’s jurisdiction. As with all voluntary concessions, Comcast’s acquiescence isn’t reviewable in court.

The FCC made an even bigger hash of its review of AT&T’s proposed merger with T-Mobile. Once it became clear that the FCC was bowing to political pressure to reject the deal, the companies pulled their applications for license transfers to focus on winning over the Department of Justice first. But FCC Chairman Julius Genachowski, determined to have his say, simply released an uncirculated draft of the agency’s analysis of the deal anyway.

The report found that the combination, as initially proposed, would control too much spectrum in too many local markets. But that was only after the formula, known as the “spectrum screen,” was manipulated to reduce substantially the amount of frequency included in the denominator. Hidden in a footnote, the report noted cryptically that the reduction was being made (and explained) in an unrelated order yet to be published.

When the other order was released months later, however, it made no mention of the change. It never actually happened. With the T-Mobile deal off the table, apparently, the chairman found it more expedient to leave the screen as it was, at least until further gerrymandering proved useful. Unwittingly, Genachowski had exposed his hand in rigging a supposedly objective test applied by a supposedly independent agency.
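To see what was at stake in that footnote, consider a simplified sketch of how a screen of this kind works. Every number below is invented, and the roughly one-third trigger is an assumption for illustration; the point is only the arithmetic of the denominator:

```python
# Illustrative only: a simplified screen-style calculation with invented
# numbers, showing how trimming the denominator inflates every carrier's
# measured share and can flip a passing deal into a failing one.

def screen_share(holdings_mhz: float, suitable_mhz: float) -> float:
    """Post-merger holdings as a fraction of spectrum deemed 'suitable'."""
    return holdings_mhz / suitable_mhz

POST_MERGER_HOLDINGS = 120.0  # MHz, hypothetical
TRIGGER = 1 / 3               # assumed screen trigger, roughly one-third

for suitable in (400.0, 300.0):  # denominator before and after the trim
    share = screen_share(POST_MERGER_HOLDINGS, suitable)
    verdict = "fails" if share > TRIGGER else "passes"
    print(f"{suitable:.0f} MHz deemed suitable -> "
          f"{share:.0%} share, {verdict} the screen")
```

Shrink the pool of spectrum counted as suitable and the same holdings suddenly look like excessive concentration, with no change in the underlying market.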

 

Leave it to the Experts

This amateurish behavior, unfortunately, is increasingly the norm at the FCC. Politics aside, part of the problem is that while federal antitrust regulators enforce statutes under a long line of interpretive case law, the FCC’s review of license transfers is governed by an undefined and largely untested public interest standard.

Now the commission is asking interested parties how, if at all, it needs to formalize its transaction review process, particularly the spectrum screen calculation it blatantly manipulated in the AT&T/T-Mobile review. It’s even asking whether it should re-impose a rigid cap on the amount of spectrum any one carrier can license, a bludgeon of a regulatory tool the agency wisely abandoned in 2003.

We have a better idea. Do away with easily gamed formulas and proxies that have no scientific relevance. Instead, review transactions in the broader context of a dynamic broadband ecosystem that is disciplined not only by inter-carrier competition, but increasingly by device makers, operating system providers, app makers and ultimately by consumers.

Every user with an iPhone 5 knows perfectly well how complex and competitive the mobile marketplace has become. It’s now time for the government to abandon its 19th century toolkit and look at actual data—data that the FCC already collects and dutifully reports, then ignores when political expediency beckons.

Thanks to the FCC’s endemic misadventures in spectrum management, we can expect more, not fewer, mergers—necessitating more, not fewer, commission reviews. Rather than expanding the agency’s unstructured approach to transaction reviews, we should be reining it in. As the FCC embarks on its analysis of T-Mobile’s takeover of MetroPCS and Sprint’s acquisition by SoftBank, it’s time to put an end to dangerous mission creep at the FCC.

That, at least, would better serve the public interest.

(Reprinted, with permission, from Bloomberg BNA Daily Report for Executives, Dec. 6, 2012.  Our recent paper on FCC transaction review can be found at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2163169.)

Latest WCIT Leak Makes Explicit Russian Desire to Overturn ICANN https://techliberation.com/2012/11/18/latest-wcit-leak-makes-explicit-russian-desire-to-overturn-icann/ https://techliberation.com/2012/11/18/latest-wcit-leak-makes-explicit-russian-desire-to-overturn-icann/#comments Sun, 18 Nov 2012 20:26:40 +0000 http://techliberation.com/?p=42823

On Friday evening, I posted on CNET a detailed analysis of the most recent proposal to surface from the secretive upcoming World Conference on International Telecommunications, WCIT-12. The conference will discuss updates to a 1988 UN treaty administered by the International Telecommunication Union, and throughout the year there have been reports that both governmental and non-governmental members of the ITU have been trying to use the rewrite to put the ITU squarely in the Internet business.

The Russian Federation’s proposal, which was submitted to the ITU on Nov. 13th, would explicitly bring “IP-based Networks” under the auspices of the ITU, and in particular would substantially if not completely change the role of ICANN in overseeing domain names and IP addresses.

According to the proposal, “Member States shall have the sovereign right to manage the Internet within their national territory, as well as to manage national Internet domain names.”  And a second revision, also aimed straight at the heart of today’s multi-stakeholder process, reads:  “Member States shall have equal rights in the international allocation of Internet addressing and identification resources.”

Of course the Russian Federation, along with other repressive governments, uses every opportunity to gain control over the free flow of information, and sees the Internet as its most formidable enemy. Earlier this year, Prime Minister Vladimir Putin told ITU Secretary-General Hamadoun Touré that Russia was keen on the idea of “establishing international control over the Internet using the monitoring and supervisory capability of the International Telecommunications Union.”

As I point out in the CNET piece, the ITU’s claims that WCIT has nothing to do with Internet governance and that the agency itself has no stake in expanding its jurisdiction ring more hollow all the time. Days after receiving the Russian proposal, the ITU wrote in a post on its blog that “There have not been any proposals calling for a change from the bottom-up multistakeholder model of Internet governance to an ITU-controlled model.”

This would appear to be an outright lie, and also a contradiction of an earlier acknowledgment by Dr. Touré. In a September interview, Touré told Bloomberg BNA that “Internet Governance as we know it today” concerns only “Domain Names and addresses. These are issues that we’re not talking about at all,” Touré said. “We’re not pushing that, we don’t need to.”

The BNA article continues:

Touré, expanding on his emailed remarks, told BNA that the proposals that appear to involve the ITU in internet numbering and addressing were preliminary and subject to change.

‘These are preliminary proposals,’ he said, ‘and I suspect that someone else will bring another counterproposal to this, we will analyze it and say yes, this is going beyond, and we’ll stop it.’

Another tidbit from the BNA interview that now seems ironic:

Touré disagreed with the suggestion that numerous proposals to add a new section 3.5 to the ITRs might have the effect of expanding the treaty to internet governance.

‘That is telecommunication numbering,’ he said, something that preceded the internet. Some people, Touré added, will hijack a country code and open a phone line for pornography. ‘These are the types of things we are talking about, and they came before the internet.’

I haven’t seen all of the proposals, of course, which are technically secret.   But the Russian proposal’s most outrageous amendments are contained in a proposed new section 3A, which is titled, “IP-based Networks.”

There’s more on the ITU’s subterfuge in Friday’s CNET piece, as well as these earlier posts:

1.  “Why is the UN Trying to Take Over the Internet?” Forbes.com, Aug 9, 2012.

2.  “UN Agency Reassures:  We Just Want to Break the Internet, Not Take it Over,” Forbes.com, Oct. 1, 2012.

California Joins States Insulating VoIP Providers from Local Public Utility Regulators https://techliberation.com/2012/09/30/california-joins-states-insulating-voip-providers-from-local-public-utility-regulators/ https://techliberation.com/2012/09/30/california-joins-states-insulating-voip-providers-from-local-public-utility-regulators/#comments Sun, 30 Sep 2012 22:35:47 +0000 http://techliberation.com/?p=42467

On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from any new regulation of Voice over Internet Protocol or other IP-based services without the legislature’s authorization.

California now joins over twenty states that have enacted similar legislation.

The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February. It passed both houses of the California legislature with wide bipartisan majorities.

California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.

Whatever the cost-benefit calculus of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice or Apple’s FaceTime.

Historically, as I argued in a Forbes piece last month, the imposition of public utility status on a service provider has been an extreme response to an extreme situation—a monopoly provider, unlikely to have competition because of the high cost of building  and operating competing infrastructure (so-called “natural monopoly”), offering a service that is indispensable to everyday life.

Service providers meeting that definition are transformed by PUC oversight into entities that are much closer to government agencies than private companies.  The PUC sets and modifies the utility’s pricing in excruciating detail.  PUC approval is required for each and every change or improvement to the utility’s asset base, or to add new services or retire obsolete offerings.

In exchange for offering service to all residents, utilities in turn are granted eminent domain and rights of way to lay and maintain pipes, wires and other infrastructure.

VoIP services may resemble traditional switched telephone networks, but they have none of the features of a traditional public utility.  Most do not even charge for basic service, nor do they rely on their own dedicated infrastructure.  Indeed, the reason VoIP is so much cheaper to offer than traditional telephony is that it can take advantage of the existing and ever-improving Internet as its delivery mechanism.

Because entry is cheap, VoIP providers have no monopoly, natural or otherwise.  In California, according to the FCC, residents have their choice of over 125 providers—more than enough competition to ensure market discipline.

Nor would residents be in any way helped by interposing a regulator to review and pre-approve each and every change to a VoIP provider’s service offerings. Rather, the lightning-fast evolution of Internet services provides perhaps the worst mismatch possible for the deliberative, public processes of a local PUC.

Software developers don’t need eminent domain.

But the most serious mismatch between PUCs and VoIP providers is that there is little inherently local about VoIP offerings.  Where a case can be made for local oversight of public utilities operating extensive–even pervasive–local infrastructure, it’s hard to see what expertise a local PUC brings to the table in supervising a national or even international VoIP service.

On the other hand, it’s not hard to imagine the chaos and uncertainty VoIP providers and their customers would face if they had to satisfy fifty different state PUCs, not to mention municipal regulators and regulators in other countries.

In most cases that would mean dealing with regulators on a daily basis, on every minor aspect of a service offering.  In the typical PUC relationship, the regulator becomes the true customer and the residents mere “rate-payers” or even just “meters.”

Public utilities are not known for their constant innovation, and for good reason.

Whatever oversight VoIP providers require, local PUCs are clearly the wrong choice.  It’s no surprise, then, that SB 1161 was endorsed by major Silicon Valley trade groups, including TechNet, TechAmerica, and the Silicon Valley Leadership Group.

The law is a win for California residents and California businesses—both high-tech and otherwise.

Links

  1. “Government Control of Net is Always a Bad Idea,” CNET News.com, June 4, 2012.
  2. “Memo to Jerry Brown: Sign SB 1161 for all Internet users,” CNET News.com, August 30, 2012.
  3. “The Madness of Regulating VoIP as a Public Utility,” Forbes.com, Sept. 10, 2012.
  4. “Brown Endorses Hands off Stance on Internet Calls,” The San Francisco Chronicle, Sept. 28, 2012.
What Google Fiber, Gig.U, and US Ignite say about Regulatory Waste and Overload https://techliberation.com/2012/08/06/what-google-fiber-gig-u-and-us-ignite-say-about-regulatory-waste-and-overload/ https://techliberation.com/2012/08/06/what-google-fiber-gig-u-and-us-ignite-say-about-regulatory-waste-and-overload/#comments Tue, 07 Aug 2012 00:30:21 +0000 http://techliberation.com/?p=41894

On Forbes today, I have a long article on the progress being made to build gigabit Internet testbeds in the U.S., particularly by Gig.U.

Gig.U is a consortium of research universities and their surrounding communities created a year ago by Blair Levin, an Aspen Institute Fellow and, until recently, the principal architect of the FCC’s National Broadband Plan. Its goal is to work with private companies to build ultra high-speed broadband networks with sustainable business models.

Gig.U, along with Google Fiber’s Kansas City project and the White House’s recently-announced US Ignite project, spring from similar origins and have similar goals.  Their general belief is that by building ultra high-speed broadband in selected communities, consumers, developers, network operators and investors will get a clear sense of the true value of Internet speeds that are 100 times as fast as those available today through high-speed cable-based networks.  And then go build a lot more of them.

Google Fiber, for example, announced last week that it would be offering fully-symmetrical 1 Gbps connections in Kansas City, perhaps as soon as next year.  (By comparison, my home broadband service from Xfinity is 10 Mbps download and considerably slower going up.)
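That 100x gap translates directly into waiting time. A quick sketch, assuming a hypothetical 5 GB download and ignoring protocol overhead and real-world throughput variation:

```python
# Rough arithmetic on the speed gap. File size is an assumption; overhead
# and real-world throughput variation are ignored.

FILE_GB = 5.0  # hypothetical download, e.g., an HD movie

for label, mbps in (("1 Gbps fiber", 1000.0), ("10 Mbps cable", 10.0)):
    seconds = FILE_GB * 8_000 / mbps  # gigabytes -> megabits -> seconds
    print(f"{label}: {seconds / 60:.1f} minutes for a {FILE_GB:.0f} GB file")
```

Roughly 40 seconds versus more than an hour, for the same file.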

US Ignite is encouraging public-private partnerships to build demonstration applications that could take advantage of next generation networks and near-universal adoption.  It is also looking at the most obvious regulatory impediments at the federal level that make fiber deployments unnecessarily complicated, painfully slow, and unduly expensive.

I think these projects are encouraging signs of native entrepreneurship focused on solving a worrisome problem:  the U.S. is nearing a dangerous stalemate in its communications infrastructure.  We have the technology and scale necessary to replace much of our legacy wireline phone networks with native IP broadband.  Right now, ultra high-speed broadband is technically possible by running fiber to the home.  Indeed, Verizon’s FiOS network currently delivers 300 Mbps broadband and is available to some 15 million homes.

But the kinds of visionary applications in smart grid, classroom-free education, advanced telemedicine, high-definition video, mobile backhaul and true teleworking that would make full use of a fiber network don’t really exist yet.  Consumers (and many businesses) aren’t demanding these speeds, and Wall Street isn’t especially interested in building ahead of demand.  There’s already plenty of dark fiber deployed, the legacy of earlier speculation that so far hasn’t paid off.

So the hope is that by deploying fiber to showcase communities and encouraging the development of demonstration applications, entrepreneurs and investors will get inspired to build next generation networks.

Let’s hope they’re right.

What interests me personally about the projects, however, is what they expose about regulatory disincentives that unnecessarily and perhaps fatally retard private investment in next-generation infrastructure.  In the Forbes piece, I note almost a dozen examples from the Google Fiber development agreement where Kansas City voluntarily waived permits, fees, and plodding processes that would otherwise delay the project.  As well, in several key areas the city actually commits to cooperate and collaborate with Google Fiber to expedite and promote the project.

As Levin notes, Kansas City isn’t offering any funding or general tax breaks to Google Fiber.  But the regulatory concessions, which implicitly acknowledge the heavy burden imposed on those who want to deploy new privately-funded infrastructure (many of them the legacy of the early days of cable TV deployments), may still be enough to “change the math,” as Levin puts it, making otherwise unprofitable investments justifiable after all.

Just removing some of the regulatory debris, in other words, might itself be enough to break the stalemate that makes building next generation IP networks unprofitable today.

The regulatory cost puts a heavy thumb on the side of the scale that discourages investment. Indeed, as fellow Forbes contributor Elise Ackerman pointed out last week, Google has explicitly said that part of what made Kansas City attractive was the lack of excessive infrastructure regulation, and the willingness and ability of the city to waive or otherwise expedite the requirements that were on the books. (Despite the city’s promises to bend over backwards for the project, she notes, there have still been expensive regulatory delays that promoted no public values.)

Particularly painful to me was testimony by Google Vice President Milo Medin, who explained why none of the California-based proposals ever had a real chance.  “Many fine California city proposals for the Google Fiber project were ultimately passed over,” he told Congress, “in part because of the regulatory complexity here brought about by [the California Environmental Quality Act] and other rules. Other states have equivalent processes in place to protect the environment without causing such harm to business processes, and therefore create incentives for new services to be deployed there instead.”

Ouch.

This is a crucial insight.  Our next-generation communications infrastructure will surely come, when it does come, from private investment.  The National Broadband Plan estimated it would take $350 billion to get 100 Mbps Internet to 100 million Americans through a combination of fiber, cable, satellite and high-speed mobile networks.  Mindful of reality, however, the plan didn’t even bother to consider the possibility of full or even significant taxpayer funding to reach that goal.
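The scale of that estimate is easier to grasp per capita; a two-line sketch of the arithmetic:

```python
# Per-person arithmetic on the National Broadband Plan's estimate.
TOTAL_COST = 350e9  # $350 billion, per the plan
PEOPLE = 100e6      # 100 million Americans reached at 100 Mbps
print(f"${TOTAL_COST / PEOPLE:,.0f} per person reached")  # -> $3,500
```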

Of course, nationwide fiber and mobile deployments by network operators including Verizon and AT&T can’t rely on gimmicks like Google Fiber’s hugely successful competition, where 1,100 communities applied to become a test site.  Nor can they, like Gig.U, cherry-pick research university towns, which have the most attractive demographics and density to start with.  Nor can they simply call themselves start-ups and negotiate the kind of freedom from regulation that Google and Gig.U’s membership can.

Large-scale network operators need to build, if not everywhere, then to an awful lot of somewheres. That’s a political reality of their size and operating model, as well as the multi-layer regulatory environment in which they must operate. And it’s a necessity of meeting the ambitious goal of near-universal high-speed broadband access, and of many of the applications that would use it.

Unlike South Korea, we aren’t geographically small, with a largely urban population living in just a few cities. We don’t have a largely-nationalized and taxpayer-subsidized communications infrastructure. On a per-person basis, deploying broadband in the U.S. is much harder, more complicated, and more expensive than it is in many competing nations in the global economy.

Under the current regulatory and economic climate, large-scale fiber deployment has all but stopped for now.  Given the long lead-time for new construction, we need to find ways to restart it.

So everyone who agrees that universal broadband is a critical element in U.S. competitiveness in the next decade or so ought to look closely at the lessons, intended or otherwise, of the various testbed projects.  They are exposing in painful detail a dangerous and useless legacy of multi-level regulation and bureaucratic inefficiency that makes essential private infrastructure investment economically impossible.

Don’t get me wrong.  The demonstration projects and testbeds are great.  Google Fiber, Gig.U, and US Ignite are all valuable efforts.  But if we want to overcome our “strategic bandwidth deficit,” we’ll need something more fundamental than high-profile projects and demonstration applications.  Most of all, we need a serious housecleaning of legacy regulation at the federal, state, and local level.

Regulatory reform might not be as sexy as gigabit Internet demonstrations, but the latter ultimately won’t make much difference without the former.  Time to break out the heavy demolition equipment—for both.

The Feds Play the Spectrum Shell Game https://techliberation.com/2012/07/31/the-feds-play-the-spectrum-shell-game/ https://techliberation.com/2012/07/31/the-feds-play-the-spectrum-shell-game/#comments Tue, 31 Jul 2012 15:33:00 +0000 http://techliberation.com/?p=41847

On CNET today, I’ve posted a long critique of the recent report by the President’s Council of Advisors on Science and Technology (PCAST) urging the White House to reverse course on a two-year old order to free up more spectrum for mobile users.

In 2010, soon after the FCC’s National Broadband Plan raised alarms about the need for more spectrum for an explosion in mobile broadband use, President Obama issued a Memorandum ordering federal agencies to free up as much as 500 MHz. of radio frequencies currently assigned to them.

After a great deal of dawdling, the National Telecommunications and Information Administration, which oversees spectrum assignments within the federal government, issued a report earlier this year that seemed to offer progress: 95 MHz. of very attractive spectrum could in fact be cleared in the ten years called for by the White House.

But reading between the lines, it was clear that the 20 agencies involved in the plan had no serious intention of cooperating. Their cost estimates for relocation (which were simply reported by NTIA without any indication of how they’d been arrived at or even whether NTIA had been given any details) appeared to be based on an amount that would make any move economically impossible.

And the NTIA’s suggestion that some of the bands could be “shared” sounded appealing until the details revealed that the feds would place impossible conditions on that sharing.

In the end, the NTIA report was 200 pages of classic smoke-and-mirrors from an entrenched bureaucracy that is expert at avoiding change.

The PCAST report seemed to throw in its cards and accept the political reality that actual spectrum clearing in the federal bands would never happen. Instead, the President’s advisors doubled down on “sharing,” and called for a new “Spectrum Access System” that would be based on sharing technologies it admitted don’t exist yet.

SAS might be a better system in the long-term, but current technical and political limitations make such a system impractical. I argue in the piece that the NTIA and PCAST reports are just providing cover for federal agencies, notably the DoD and Justice, to avoid actually having to follow the President’s order and take aggressive steps to free up spectrum that is needed now. Whether this is intentional or not I leave to more savvy tea-leaf readers.

Everyone Out of the Internet! https://techliberation.com/2012/06/08/everyone-out-of-the-internet/ https://techliberation.com/2012/06/08/everyone-out-of-the-internet/#comments Fri, 08 Jun 2012 18:14:54 +0000 http://techliberation.com/?p=41375

I remember a bumper sticker from the 1970s that summed up the prevailing anti-colonial attitude that had developed during the late 1960s: “U.S. Out of North America.”

That sentiment reflects nicely my activities this week, which include three articles decrying efforts by regulators to oversee key aspects of the Internet economy.  Of course their intentions—at least publicly—are always good.  But even with the right idea, the unintended negative consequences always overwhelm the benefits by a wide margin.

Governments are just too slow to respond to the pace of change of innovations in information technology.  Nothing will fix that.  So better just to leave well enough alone and intercede only when genuine consumer harm is occurring.  And provable.

The articles cover the spectrum from state (California), federal (FCC) and international (ITU) regulators and a wide range of truly bad ideas, from the desire of California’s Public Utilities Commission to “protect” consumers of VoIP services, to the FCC’s latest effort to elbow its way into regulating broadband Internet access at the middle mile, to a proposal from European telcos to have the U.N. implement a tariff system on Internet traffic originating from the U.S.

 Here they are:

  1. “Government Control of the Net is Always a Bad Idea” (CNET) – http://news.cnet.com/8301-13578_3-57446383-38/government-control-of-net-is-always-a-bad-idea/?tag=mncol;cnetRiver
  2. “The FCC Noses Under the Broadband Internet Tent” (Forbes) – http://www.forbes.com/sites/larrydownes/2012/06/06/the-fcc-noses-under-the-broadband-internet-tent/
  3. “U.N. Could Tax U.S.-based Websites, Leaked Docs Show” (CNET) – http://news.cnet.com/8301-1009_3-57449375-83/u.n-could-tax-u.s.-based-web-sites-leaked-docs-show/?tag=mncol;topStories

That third one, by the way, was written with CNET’s Chief Political Correspondent Declan McCullagh.  It represents a genuine scoop, based on leaked documents posted by TLFers Jerry Brito and Eli Dourado on WCITLeaks.org!

To Reach 98% Access for Mobile Broadband, Take the FCC Out of Equation https://techliberation.com/2012/05/28/to-reach-98-access-for-mobile-broadband-take-the-fcc-out-of-equation/ https://techliberation.com/2012/05/28/to-reach-98-access-for-mobile-broadband-take-the-fcc-out-of-equation/#comments Tue, 29 May 2012 01:05:22 +0000 http://techliberation.com/?p=41270

 (Adapted from Bloomberg BNA Daily Report for Executives, May 16th, 2012.)

Two years ago, the Federal Communications Commission’s National Broadband Plan raised alarms about the future of mobile broadband. Given unprecedented increases in consumer demand for new devices and new services, the agency said, network operators would need far more radio frequency assigned to them, and soon. Without additional spectrum, the report noted ominously, mobile networks could grind to a halt, hitting a wall as soon as 2015.

That’s one reason President Obama used last year’s State of the Union address to renew calls for the FCC and the National Telecommunications and Information Administration (NTIA) to take bold action, and to do so quickly. The White House, after all, had set an ambitious goal of making mobile broadband available to 98 percent of all Americans by 2016. To support that objective, the president told the agencies to identify quickly an additional 500 MHz of spectrum for mobile networks.

By auctioning that spectrum to network operators, the president noted, the deficit could be reduced by nearly $10 billion. That way, the Internet economy could not only be accelerated, but taxpayers would actually save money in the process.

A good plan. So how is it working out?

Unfortunately, the short answer is:  Not well.  Speaking this week at the annual meeting of the mobile trade group CTIA, FCC Chairman Julius Genachowski had to acknowledge the sad truth:  “the overall amount of spectrum available has not changed, except for steps we’re taking to add new spectrum on the market.”

The tortured grammar (how can “steps we’re taking to add new spectrum” constitute an exception to the statement that the amount of available spectrum “has not changed”?) belies the reality here—all the FCC Chairman can do is promise more spectrum sometime in the vague future.  For now, the FCC and the NTIA have put almost no new spectrum into actual use.  Instead,  the two agencies have piled up a depressing list of delays, scandals, and wasted opportunities. Consider just a few:

  • NTIA’s long-overdue report on freeing up government spectrum identified nearly 100 MHz of frequencies that could be reallocated for mobile broadband. But the 20 agencies involved in the study demanded 10 years and nearly $18 billion to vacate the spectrum—and insisted on moving to frequencies that are already assigned to other public or private license holders. An available 20 MHz of unassigned frequency, left over from the 2009 conversion to digital TV, was actually added to the government’s supply when it was set aside this year for a dedicated public safety network.
  • After years of wrangling with Congress, the FCC finally won limited authority to hold “voluntary incentive auctions” for spectrum currently licensed to over-the-air television broadcasters. But those auctions will take years to complete, and a decided lack of enthusiasm among broadcasters doesn’t bode well for the outcome. As for reducing the deficit, the agency has reserved the right to disqualify bidders it believes already hold more spectrum than the agency thinks best to stimulate competition, even without any measurable signs of market failure. (Voice, data, and text prices continue to decline, according to the FCC’s own data.)
  • LightSquared’s efforts to reallocate satellite spectrum for use in a competitive new mobile broadband network were crippled—perhaps fatally—by concerns raised by the Department of Defense and others over potential interference with some global positioning system (GPS) devices. Initial permission to proceed was swiftly revoked—after the company had invested billions. The FCC’s procedural blunders in the LightSquared case ignited a political scandal that continues to distract the agency. A similar effort by Dish Networks is now being put through the full set of administrative hurdles, delayed at least until after the election.
  • Transactions in the secondary spectrum markets—long the only real source of supply for mobile network operators–have received an increasingly frosty reception. Last year, AT&T’s planned merger with T-Mobile USA was scuttled on the basis of dubious antitrust concerns the FCC backed up with data that was clumsily rigged by agency staff.  Now, the agency has expanded its review of Verizon’s efforts to buy spectrum from a consortium of cable companies—spectrum that currently isn’t being used for anything.
  • After the FCC mandated data roaming agreements even for carriers who hold spectrum in the same markets, Sprint announced it would stop serving customers with its own network in two metropolitan areas, piggybacking instead on AT&T’s brand-new LTE facilities. Sprint’s move underscores concerns that mandatory roaming will reduce incentives for carriers to invest in infrastructure. According to the FCC, mobile industry investments have reached nearly 15 percent of total revenue in recent years. Of the leading providers, only Sprint decreased its investments during the recession.

Not an impressive showing, to say the least.  Meanwhile, in the real world, demand for mobile broadband continues to mushroom. Network usage has increased as much as 8,000%  since 2007, when Apple’s iPhone first hit the market. It was followed by an explosion of new devices, operating systems, and software apps from a cottage industry of developers large and small. This remarkable ecosystem is driving lightning-fast adoption of mobile services, especially bandwidth-intense video apps.
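For a sense of what that headline figure implies year over year, here is the compound-growth arithmetic, assuming the increase spans roughly the five years since the iPhone’s debut:

```python
# What "as much as 8,000%" growth implies as a compound annual rate.
# The five-year span is an assumption for illustration.

GROWTH_FACTOR = 1 + 8000 / 100          # an 8,000% increase is 81x total usage
YEARS = 5                               # assumed span, 2007 to roughly 2012
annual = GROWTH_FACTOR ** (1 / YEARS)   # implied compound annual multiplier
print(f"{GROWTH_FACTOR:.0f}x overall -> about {annual:.1f}x per year "
      f"({annual - 1:.0%} annual growth)")
```

Usage that multiplies by roughly two and a half every year is the kind of demand curve no auction calendar can keep up with.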

The mobile broadband ecosystem is one of the few bright spots in the sour economy, creating jobs and generating tax revenues. Makers of tablet computers, for example, expect to sell over 100 million units this year alone. Tablet users, by the way, already rely on the wildly popular devices for 15 percent of their TV viewing, raising the demand for high-bandwidth video services on existing mobile broadband networks.

Spectrum is the principal fuel of these fast-growing mobile applications. So FCC Chairman Julius Genachowski is right to repeatedly emphasize the catastrophic consequences of an imminent “spectrum crunch.”  The FCC is leading the chorus of doomsayers who believe that without more spectrum—and soon—our  mobile revolution will never reach its full economic, educational, and social potential.

But the government has done nothing to head off that disaster. Instead, the FCC, the NTIA, and the Obama administration continue to make policy choices that do little to get more spectrum into the system. If anything, we’re moving backwards.

Many of these decisions appear to be driven by short-term political imperatives, overriding the worthy goal of making mobile broadband available to all Americans as quickly as possible. The AT&T/T-Mobile deal, for example, was killed simply because the FCC didn’t like the idea of taking even a failing carrier out of the competitive equation. Yet AT&T had committed, as a condition of approval, to deploy mobile broadband to 95 percent of all Americans—nearly meeting the president’s goal in a single stroke.

This is nothing new. The FCC has a very long and very messy history of using its spectrum management powers to shape emerging markets, and to pick winners and losers among new technologies, applications, and providers. Its guiding principle for nearly 100 years has been the so-called “public interest” standard—an undefined and highly malleable policy tool the FCC employs like a bludgeon.

The era of micromanaging the airwaves by federal fiat must now end once and for all. For the first time in a century of federal stewardship, there is almost no inventory of usable spectrum. It has all been allocated to some 50,000 public and private license holders, each the one-time favorite of the FCC. Our spectrum frontier has closed. And it wouldn’t have closed so soon if the FCC hadn’t remained so determined to manage a 21st century resource as if it were still the 19th century.

Technology may come to our rescue, at least in part. Hardware and software for sharing spectrum, switching frequencies, and maximizing the technical properties of different bandwidths continue to be part of the innovation agenda of the mobile industry. But it is unlikely these developments will be enough to keep spectrum supply even slightly ahead of unbridled consumer demand. Many of these technologies, in any case, still require FCC approval to be deployed. That means even more delays.

Saving the mobile ecosystem–and making way for the next generation of mobile innovation–demands a bold new strategy. For starters, it is time to stage an intervention for  federal agencies hoarding spectrum. Private licensees who no longer need the spectrum they have must be able to sell their rights quickly in a working market, and be prodded when needed to do so. Buyers need the freedom to repurpose spectrum to new uses.

Also, we need to increase incentives for network operators to continue investing in better and more efficient infrastructure, not throw cold water on them in the name of a vague and largely undefined public interest.   The number of competitors isn’t what matters.  It’s the ability of consumers to get what they want at prices that, at least up until now, continue to decline.

In short, we need to take the FCC out of the middle of every transaction and each innovation, slowing Silicon Valley-paced markets down to Washington speed.

With the appetite of mobile consumers growing more voracious, it is long past time for Congress to take a cold, sober look at our obsolete system for spectrum management and the antiquated agency that can’t stop fussing over it. We need a new system, if not a new FCC. That’s the only way to keep the mobile frontier booming, let alone meet the admirable goal of providing a homestead there for every American.

The Closing of the Spectrum Frontier https://techliberation.com/2012/04/19/the-closing-of-the-spectrum-frontier/ https://techliberation.com/2012/04/19/the-closing-of-the-spectrum-frontier/#respond Fri, 20 Apr 2012 02:22:32 +0000 http://techliberation.com/?p=40908

Frederick Jackson Turner (1861-1932)

On Fierce Mobile IT, I’ve posted a detailed analysis of the NTIA’s recent report on government spectrum holdings in the 1755-1850 MHz. range and the possibility of freeing up some or all of it for mobile broadband users.

The report follows from a 2010 White House directive issued shortly after the FCC’s National Broadband Plan was published, in which the FCC raised the alarm of an imminent “spectrum crunch” for mobile users.

By the FCC’s estimates, mobile broadband will need an additional 300 MHz. of spectrum by 2015 and 500 MHz. by 2020, in order to satisfy increases in demand that have only amped up since the report was issued.  So far, only a small amount of additional spectrum has been allocated.  Increasingly, the FCC appears rudderless in efforts to supply the rest, and to do so in time.

It’s not entirely their fault. At the core of the problem, the FCC is simply not constituted to resolve this increasingly urgent crisis. That’s because, as I write in the article, the management of radio frequencies has entered new and uncharted territory.

For the first time since the FCC and its predecessor agencies began licensing spectrum nearly 100 years ago, there is no unassigned spectrum available, or at least none of which current technology can make effective use.

The spectrum frontier is now closed. But the FCC, as created by Congress, is an agency that functions, when it functions at all, only on the frontier.

So it’s worth remembering what happened a hundred years earlier, when a young historian named Frederick Jackson Turner showed up at the 1893 annual meeting of the American Historical Association to present his paper on “The Significance of the Frontier in American History.”

The meeting took place that year on the grounds of the World’s Columbian Exposition in Chicago. The weather was unspeakably hot, and Turner’s talk was poorly attended. (The President of the AHA, Henry Adams, was in attendance but appears never to have heard Turner’s talk or read the paper—he was, as he recounted in his autobiography, “The Education of Henry Adams,” meditating in the Hall of Turbines, having a nervous breakdown.) But the paper has had an outsized and long-lasting impact, launching the field of western or frontier history.

Turner’s thesis was simple and unassailable.  Citing census data that showed there was no longer a recognizable line of American territory beyond which there was no settlement, Turner declared that by 1890 the frontier had “closed.”  The era of seemingly endless supplies of readily-available cheap land, dispensed for free or for nominal cost by the federal government, had come to an end.

For Turner, the history of the west was the history of the American experience.  And the defining feature of American life—shaping its laws, customs, culture and economy–had disappeared.  A new phase, with new rules, was beginning.

 

The FCC Only Functions, When it Functions at All, on the Frontier

Our problem, at least, is equally easy to describe.  The FCC, as created by Congress, is an agency that only functions, when it functions at all, on the frontier.

All the talk of “spectrum crunch” boils down to a simple but devastating fact:  it’s no longer possible to add capacity to existing mobile networks by assigning them unused ranges of radio frequencies.  While technology continues to expand the definition of “usable” frequencies, demand for mobile broadband is increasing faster than our ability to create new supply.

We need more spectrum.  And the only way to put more spectrum to use for the insatiable demands of mobile consumers is to reallocate spectrum that has already been licensed to someone else.

In the American west, reallocation of land was easy.  Land grants were given with full legal title, and holders were under no lasting obligation to use their land for any specific purpose or in any particular way.

The various acts of Congress that authorized the grants were intended to foster important social values—populating the frontier, developing agriculture, compensating freed slaves, building the railroads.  But those intentions were never translated into the kind of limited estates that plagued modern Europe after the feudal age came to an end.  (For a good example of the mischief a conditional estate can cause hundreds of years later, watch “Downton Abbey.”  Watch it even if you don’t want to see an example of inflexible estate law.)

Speculators sold to farmers, farmers to ranchers, ranchers to railroads and miners and oil drillers, and from there to developers of towns and other permanent settlements.  The market established the transfer price, and the government stood behind the change of title and its enforcement, where necessary.  Which was rarely.

So the closing of the western frontier, while it changed the nature of settlement in the American west, never threatened to bring future development to a screeching halt.

 

Reallocation Options are Few and Far Between

Unfortunately, spectrum licensing has never followed a property model, even though one was first proposed by Ronald Coase as early as 1959. Under the FCC’s command-and-control model, spectrum assignments have historically been made to foster new technologies or new applications the FCC deems likely to advance national interests. Spectrum has been licensed, usually at no or nominal cost to the licensee, for particular uses, with special conditions (often unrelated) attached.

In theory, of course, the FCC could begin revoking the licenses of public and private users who aren’t using the spectrum they already have, or who aren’t using it effectively or, to use the legal term of art, “in the public interest.”  Legally and politically, however, revoking (or even refusing to renew) licenses is a non-starter.

Consequently, the most disastrous side-effect of the “public interest” approach to licensing has been that when old technologies grow obsolete, there is no efficient way to reclaim the spectrum for new or more valuable uses.  The FCC must by law approve any transfer of an existing license on the secondary market, slowing the process at best and creating an opportunity to introduce new criteria and new conditions for the transfer at worst.

Even when the agency approves a transfer, the limitations on use and the conditions attached to the original license apply in full force to the new user. That means that specific ranges of spectrum more-or-less arbitrarily set aside for a particular application remain forever set aside for that application, unless and until the FCC undertakes a rulemaking to reassign them.

That also takes time and effort, and offers the chance for new regulatory mischief. (Only since 1999 has the FCC had the power, under limited circumstances, to grant flexible use licenses. The power cannot be applied retroactively to existing licenses.)

With the spectrum frontier closed, mobile broadband providers must find additional capacity from existing license holders.  But because of the use restrictions and conditions, the universe of potential acquisition targets immediately and drastically shrinks to those making similar use of their licenses–that is, to current competitors.

So it’s no surprise that since 2005, as mobile use has exploded with the advent of 2G, 3G, and now 4G networks, the FCC has been called upon to approve over a dozen significant transfers within the mobile industry, including Sprint/Nextel, Verizon/Alltel, and Sprint Nextel/Clearwire.  Indeed, expanding capacity through merger seemed to be the agency’s preferred solution, and the one that required the least amount of time and effort.

But with the rejection last year of AT&T’s proposed merger with T-Mobile USA, the FCC has signaled that it no longer sees such transactions as a preferred or perhaps even potential avenue for acquiring additional capacity.  At least not for AT&T–and perhaps as well for Verizon, which is currently fighting to acquire unused spectrum held by a consortium of cable providers.

What other avenues are left?  With the approval of “voluntary incentive auction” legislation earlier this year, the FCC can now begin the process of gently coercing over-the-air television broadcasters to give up some or all of their licensed capacity in exchange for a share of the proceeds of any auctions the agency conducts to repurpose that spectrum for mobile broadband.

(Broadcast television seems the obvious place to start freeing up spectrum.  With the transition to digital TV, every station was given a 6 MHz. allocation in the 700 MHz. range.  But over-the-air viewership has collapsed to as few as 10% of homes in favor of cable and fiber systems, which today reach nearly every home in the country and offer far greater selection and services.  Many local broadcasters remain in business largely through the regulatory arbitrage of the FCC’s retransmission consent and must-carry rules.)

Those auctions will likely take years to complete, however, and the agency and Congress have already fallen out over how and how much the agency can “shape” the outcomes of these future auctions by disqualifying bidders who the agency feels already have too high a concentration of existing licenses.

And it’s far from clear that the broadcasters will be in any hurry to sign up, or that enough of them will to make the auctions worthwhile.  Participation is, at least so far, entirely voluntary.  Just getting Congress to agree to give the FCC even limited new auction authority took years.

There’s also the possibility of reassigning other kinds of spectrum to mobile use—increasing the pool of usable spectrum allocated to mobile, in other words.  That option, however, has also failed to produce results.  For example, the FCC initially gave start-up LightSquared a waiver that would allow it to repurpose unused spectrum allocated for satellite use for a new satellite and terrestrial-based LTE network.

But after concerns were raised by the Department of Defense and the GPS device industry about possible interference, the waiver was revoked and the company now stands on the brink of bankruptcy.  (Allegations of political favoritism in the granting of the waiver are holding up the nominations of two FCC commissioners.)

So when Dish Networks recently asked for a similar waiver, the agency traded speed and flexibility for the relative safety of  full process.  The FCC has now published a formal Notice of Proposed Rulemaking to evaluate the request.  If the rulemaking is approved, Dish will be able to repurpose satellite spectrum for a terrestrial mobile broadband network (possibly a wholesale network, rather than a new competitor).  That, of course, will take time.  And given enough time, anything can and will happen.

Finally, there’s the potential to free up unused or underutilized spectrum currently licensed to the federal government, one of the largest holders of usable spectrum and a notoriously poor manager of this valuable resource.

That was the subject of the NTIA’s recent report, which seemed to suggest that the high-priority 1755-1850 MHz. range (internationally targeted for mobile users) could be cleared of government users within ten years—some in five years, and in some cases, with possible sharing of public and private use during a transitional phase.

But as I point out in the article, the details behind that encouraging headline suggest rather that some if not all of the twenty agencies who currently hold some 1,300 assignments in this band are in no hurry to vacate it.  Having paid nothing for their allocations and with no option to get future auction proceeds earmarked to their agency, the feds have little incentive to do so.  (NTIA can’t make them do much of anything.)  The offer to share may in fact be a stalling tactic to ensure they never actually have to vacate the frequencies.

 

What’s Left?  Perhaps Nothing, at Least as Far as the FCC is Concerned

The color-coded map of current assignments is so complicated it can’t actually be read at all except on very large screens.  There are currently some 50,000 active licenses.  The agency still doesn’t even have a working inventory of them.  This is the legacy of the FCC’s command-and-control approach to spectrum allocation over nearly 100 years.

Almost everyone agrees that even with advances in hardware and software that make spectrum usage and sharing more efficient, large quantities of additional spectrum must be allocated soon if we want to keep the mobile ecosystem healthy and the mobile revolution in full and glorious swing.

With the closing of the spectrum frontier, the easy solutions have all been extinguished.  And the century-long licensing regime, which tolerated tremendous inefficiency and waste when spectrum was cheap, has left the FCC, the NTIA, the mobile industry and consumers dangerously hamstrung in finding alternative methods to meet demand.  Existing spectrum, by and large, can’t be repurposed even when everyone involved wants to do so and where the market would easily catalyze mutually-beneficial transactions.

Given the law as it stands and the FCC’s current policy choices, carriers can’t get spectrum from outside the mobile industry, nor can they get it from their competitors.  They can’t get it from the government, and may not be allowed to participate in future auctions of spectrum agonizingly pried loose from broadcasters who aren’t using what they have cost-effectively—assuming those auctions ever take place.  They also can’t put up more towers and antennae to make better use of what they have, thanks to the foot-dragging and NIMBY policies of local zoning authorities.

And even when network operators do get more usable spectrum, it comes burdened with inflexible use limits and unrelated conditions that attach like barnacles at every stage of the process—from assignment to auction to transfer—and which require regular reporting, oversight, and supervision by the FCC.

 

A New Approach to Spectrum Management–Following an Old Model that Worked

The frontier system for spectrum management is hopelessly and dangerously broken.  It cannot be repaired.  For the mobile broadband economy to continue its remarkable development (one bright spot throughout the sour economy), Congress and the FCC must transition quickly to a new model that makes sense in a world without a spectrum frontier.

That model would look much more like the 19th century system of federal land management than the FCC’s legacy command-and-control system. The new approach would start by taking the FCC out of the middle of every transaction, and leave it to the market to determine the best and highest use of our limited range of usable frequencies. It would treat licenses as transferable property, just like federal land grants in the 18th and 19th centuries.

It would leave it to the market—with the legal system as backup—to work out problems of interference, just as the common law courts have stood as backup for land disputes.

And it would deal with any genuine problems of over-concentration (that is, those that cause demonstrable harm to consumers) through modern principles of antitrust applied by the Department of Justice, not the squishy and undefined “public interest” non-standard of the FCC.  It would correct problems once it was clear the market had failed to do so, not short-circuit the market at the first hint of theoretical trouble.  (Hello, net neutrality rules.)

That’s the system, according to Frederick Jackson Turner, that formed American culture and values, shaped American law and provided the fuel to create the engine of capitalism.

For starters.

 

LightSquared and Dish: What Would Coase Do? https://techliberation.com/2012/03/22/lightsquared-and-dish-what-would-coase-do/ https://techliberation.com/2012/03/22/lightsquared-and-dish-what-would-coase-do/#comments Thu, 22 Mar 2012 19:47:39 +0000 http://techliberation.com/?p=40445

On CNET today, I have a longish post on the FCC’s continued machinations over LightSquared’s and Dish Networks’ respective efforts to use existing satellite spectrum to build terrestrial mobile broadband networks. Both companies plan to build 4G LTE networks; LightSquared has already spent $4 billion in build-out for its network, which it plans to offer wholesale.

After first granting and then, a year later, revoking LightSquared’s waiver to repurpose its satellite spectrum, the agency has taken a more conservative (albeit slower) course with Dish.  Yesterday, the agency initiated a Notice of Proposed Rulemaking that would, if adopted, assign flexible use rights to about 40 MHz of MSS spectrum licensed to Dish.

Current allocations of spectrum have little to do with the technical characteristics of different bands.  That existing licenses limit Dish and LightSquared to satellite applications, for example, is simply an artifact of more-or-less random carve-outs to the absurdly complicated spectrum map managed by the agency since 1934.  Advances in technology make it possible to use many different bands successfully for many different purposes.

But the FCC’s command-and-control model, which allocates spectrum to favor “new” services (new, that is, until they are made obsolete in later years or decades) and shapes competition to the agency’s changing whims, has left a legacy of confusing and unnecessary limitations and conditions that severely and artificially restrict the ways in which spectrum can be redeployed as technology and consumer demands change.  Today, the FCC sits squarely in the middle of each of more than 50,000 licenses, a huge bottleneck that is making the imminent spectrum crisis in mobile broadband even worse.

Even with the best of intentions, the agency can’t possibly continue to micromanage the map.  And, as the LightSquared and Dish stories demonstrate yet again, the risk of agency capture and political pressure often means the agency doesn’t do the right thing even when it does act.

Who would be the more efficient and neutral regulator?  According to Nobel Prize-winning economist Ronald Coase’s seminal 1959 article, “The Federal Communications Commission,” the answer is the market.  In his trademark straightforward, common-sense style, Coase elegantly dismantles the idea that scarce spectrum resources demand a non-market solution of government management.

For one thing, Coase demonstrates how screwed up the system already was over fifty years ago.  There’s little doubt that the problems he describes have only gotten worse with time and increased demand on the airwaves by insatiable consumers.

Instead, Coase proposed to treat spectrum like any other industry input–as property.  The FCC, he said, should auction spectrum rights to the highest bidder, without licenses, conditions, or limitations on use, and then stand back.  (He acknowledged the risk of antitrust problems, but, as in any industry, such problems could be addressed by antitrust regulators and not the FCC.)  Spectrum rights would efficiently change hands when new applications and devices created higher-value uses.

Potential interference problems–such as those raised by GPS device manufacturers in the case of LightSquared–would be resolved precisely as they are in other property contexts.  Without an FCC to run to, the parties would be forced to negotiate against a backdrop of established liability rules and a safety net of potential litigation.  Indeed, LightSquared and GPS offer a classic example of Coase’s later work demonstrating that, regardless of how property rights are initially allocated, liability rules ensure the parties will bargain to the most socially efficient solution to interference, at least where the costs of bargaining are low.
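
To make the bargaining logic concrete, here is a stylized worked example.  The notation and every figure are mine, invented purely to illustrate the theorem; none of it comes from the actual dispute:

```latex
% A stylized Coasean bargain over interference (all figures hypothetical).
% Let C_F be LightSquared's cost to filter its own signal, and C_G the
% cost for GPS makers to redesign their receivers:
\[
C_F = 30, \qquad C_G = 10.
\]
% The efficient fix is the cheaper one: redesign the receivers.
% If GPS holds the right to be free of interference, LightSquared pays
% the GPS makers some price p to do the redesign, where
\[
C_G = 10 \;<\; p \;<\; C_F = 30,
\]
% so both sides come out ahead. If LightSquared holds the right instead,
% the GPS makers simply spend C_G = 10 themselves. Either way the
% receivers get fixed: the initial allocation of the right changes who
% pays, not the efficient outcome.
```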

Of course we’ll never know whether the socially optimal solution here is for LightSquared to keep its signal out of GPS receivers or for device manufacturers to change their designs to stay out of LightSquared’s bands.  The heavy hand of the regulator has foreclosed a market solution, or even an attempt at negotiations.

Instead, we have the disaster of the FCC’s decision in January 2011 to grant a conditional waiver to LightSquared and then, last month, to revoke it indefinitely.  Meanwhile, LightSquared spent $4 billion on infrastructure it may never use, and lost its CEO and key customers including Sprint.  No one is happy, and no one can reasonably argue this was an optimal outcome, or even close to one.

For Dish, the NPRM will ensure a more orderly process, but at the cost of months, or perhaps longer, of delay before Dish can begin building its terrestrial network.  And in the interim, all sorts of irrelevant issues may interfere with the orderly (and expensive) resolution.

When Coase proposed a property model for spectrum in 1959, the idea was considered too radical.  Congress and the FCC have, slowly but surely, taken pieces of the proposal to heart, introducing auctions (but not property rights) in the 1990s.  Yesterday’s NPRM takes a small step toward more flexible use licenses, but this may be too little reform too late.  We have all the evidence we need that micromanagement of spectrum can’t possibly keep up with the pace of innovation.  Time to try a new, fifty-year-old approach.

When an Idea Becomes a Meme, and Why https://techliberation.com/2012/02/21/when-an-idea-become-a-meme-and-why/ https://techliberation.com/2012/02/21/when-an-idea-become-a-meme-and-why/#comments Wed, 22 Feb 2012 02:28:06 +0000 http://techliberation.com/?p=40195

Ceci est un mème.  (“This is a meme.”)

On Forbes today, I look at the phenomenon of memes in the legal and economic context, using my now notorious “Best Buy” post as an example. Along the way, I talk antitrust, copyright, trademark, network effects, Robert Metcalfe and Ronald Coase.

It’s now been a month and a half since I wrote that electronics retailer Best Buy was going out of business…gradually.  The post, a preview of an article and future book that I’ve been researching on-and-off for the last year, continues to have a life of its own.

Commentary about the post has appeared in online and offline publications, including The Financial Times, The Wall Street Journal, The New York Times, TechCrunch, Slashdot, MetaFilter, Reddit, The Huffington Post, The Motley Fool, and CNN. Some of these articles generated hundreds of user comments, in addition to those that appeared here at Forbes.

(I was also interviewed by a variety of news sources, including TechCrunch’s Andrew Keen.)

Today, the original post hit another milestone, passing 2.9 million page views.

Watching the article move through the Internet, I’ve gotten a first-hand lesson in how network effects can generate real value.

Network effects are an economic phenomenon in which certain goods and services experience increasing returns to scale.  That means the more users a particular product or service has, the more valuable the product becomes and the more rapidly its overall value increases.  A barrel of oil, like many commodity goods, does not experience network effects – only one person can own it at a time, and once it’s been burned, it’s gone.

In sharp contrast, networked goods increase in value as they are consumed.  Indeed, the more they are used, the faster the increase–generating a kind of momentum or gravitational pull.  As Robert Metcalfe, founder of 3Com and co-inventor of Ethernet, explained it, the value of a network can be plotted as the square of the number of connected users or devices—a curve that rises ever more steeply until most everything that can be connected already is.  George Gilder called that formula “Metcalfe’s Law.”
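
To make the formula concrete, here is a minimal sketch in Python; the user counts are arbitrary illustrations, not data about any real network:

```python
# A minimal sketch of Metcalfe's Law: network value grows roughly as
# the square of the number of connected users, so value per user keeps
# rising as the network grows. The user counts below are arbitrary.

def metcalfe_value(users: int) -> int:
    # Value proportional to n^2 (constant of proportionality set to 1).
    return users * users

for n in (10, 100, 1_000, 10_000):
    value = metcalfe_value(n)
    print(f"{n:>6} users -> relative value {value:>12,} ({value // n:,} per user)")
```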

Since information can be used simultaneously by everyone and never gets used up, nearly all information products can be the beneficiaries of network effects.  Standards are the obvious example.  TCP/IP, the basic protocol that governs interactions between computers connected to the Internet, started out humbly as an information exchange standard for government and research university users.  But in part because it was non-proprietary and therefore free for anyone to use without permission or licensing fees, it spread from public to private sector users, slowly at first but over time at accelerating rates.

Gradually, then suddenly, TCP/IP became, in effect, a least common denominator standard by which otherwise incompatible systems could share information.  As momentum grew, TCP/IP and related protocols overtook and replaced better-marketed and more robust standards, including IBM’s SNA and DEC’s DECnet.  These proprietary standards, artificially limited to the devices of a particular manufacturer, couldn’t spread as quickly or as smoothly as TCP/IP.

From computing applications, Internet standards spread even faster, taking over switched telephone networks (Voice over IP), television (over-the-top services such as YouTube and Hulu), radio (Pandora, Spotify)—you name it.

Today the TCP/IP family of protocols, still free-of-charge, is the de facto global standard for information exchange, the lynchpin of the Internet revolution.  The standards continue to improve, thanks to the largely-voluntary efforts of The Internet Society and its virtual engineering task forces.  They’re the best example I know of network effects in action, and they’ve created both a platform and a blueprint for other networked goods that make use of the standards.

Beyond standards, network effects are natural features of other information products including software.  Since the marginal cost of a copy is low (essentially free in the post-media days of Web-based distribution and cloud services), establishing market share can happen at relatively low cost.  Once a piece of software—Microsoft Windows, AOL Instant Messenger in the old days, Facebook and Twitter more recently—starts ramping up the curve, it gains considerable momentum, which may be all it takes to beat out a rival or displace an older leader.  At saturation, a software product becomes, in essence, the standard.

From a legal standpoint, unfortunately, market saturation begins to resemble an illegal monopoly, especially when viewed through the lens of industrial age ideas about markets and competition.  (That, of course, is the lens that even 21st century regulators still use.)  But what legal academics, notably Columbia’s Tim Wu, misunderstand about this phenomenon is that such products enjoy a relatively short life-cycle of dominance.  These “information empires,” as Wu calls them, are short-lived, but not, as Wu argues, because regulators cut them down.

Even without government intervention, information products are replaced at accelerating speeds by new disruptors relying on new (or greatly improved) technologies, themselves the beneficiaries of network effects.  The actual need for legal intervention is rare.  Panicked interference with the natural cycle, on the other hand, results in unintended consequences that damage emerging markets rather than correcting them.  Distracted by lingering antitrust battles at home and abroad, Microsoft lost momentum in the last decade.  No consumer benefited from that “remedy.”

For more, see “What Makes an Idea a Meme?” on Forbes.

After Action Report on SOPA: Disrupting Advocacy https://techliberation.com/2012/01/26/after-action-report-on-sopa-disrupting-advocacy/ https://techliberation.com/2012/01/26/after-action-report-on-sopa-disrupting-advocacy/#comments Thu, 26 Jan 2012 17:42:02 +0000 http://techliberation.com/?p=40000

On Forbes yesterday, I posted a detailed analysis of the successful (so far) fight to block quick passage of the Protect-IP Act (PIPA) and the Stop Online Piracy Act (SOPA). (See “Who Really Stopped SOPA, and Why?”)  I’m delighted that the article, despite its length, has gotten such a positive response.

As regular readers know, I’ve been following these bills closely from the beginning, and made several trips to Capitol Hill to urge lawmakers to think more carefully about some of the more half-baked provisions.

But beyond traditional advocacy–of which there was a great deal–something remarkable happened in the last several months. A new, self-organizing protest movement emerged on the Internet, using social news and social networking tools including Reddit, Tumblr, Facebook and Twitter to stage virtual teach-ins, sit-ins, boycotts, and other protests.

The article describes the political philosophy and origins of this movement, which I called “bitroots” activism. I warn both fans and detractors about the dangers facing this new global political force as it navigates the delicate transition from single-issue protest to a sustainable voice in shaping technology law and regulation.

But so far, at least, supporters of PIPA and SOPA won’t even acknowledge the existence of this third front, dismissing it as a stunt perpetrated by a few large technology companies.  That response not only misses the point, but makes clear the need for new forms of political dialogue over technology issues in the first place.

As someone who spends time both in Silicon Valley and inside the Beltway, I’ve long been concerned about the lack of informed conversations between innovators and regulators, especially as the two come increasingly into conflict as their worlds move closer together. (That was the central theme of The Laws of Disruption, now available practically for free on Amazon!)

Now that the bitroots movement has coalesced, I can’t wait to see where it goes next. I have high hopes for this new awareness and activism, and for their intuitive understanding that the innovations that enable them are their best weapons for changing the political dialogue. Who knows? They may even wind up disrupting traditional forms of advocacy.

On Incentive Auctions, the FCC Reaps what it Sowed https://techliberation.com/2012/01/16/on-incentive-auctions-the-fcc-reaps-what-it-sowed/ https://techliberation.com/2012/01/16/on-incentive-auctions-the-fcc-reaps-what-it-sowed/#comments Tue, 17 Jan 2012 01:35:16 +0000 http://techliberation.com/?p=39803

After three years of politicking, it now looks like Congress may actually give the FCC authority to conduct incentive auctions for mobile spectrum, and soon.  That, at least, is what the FCC seems to think.

At CES last week, FCC Chairman Julius Genachowski largely repeated the speech he has now given three years in a row.  But there was a subtle twist this time, one echoed by comments from Wireless Bureau Chief Rick Kaplan at a separate panel.

Instead of simply warning of a spectrum crunch and touting the benefits of the incentive auction idea, the Chairman took aim at a House Republican bill that would authorize the auctions but limit the agency’s “flexibility” in designing and conducting them. “My message on incentive auctions today is simple,” he said, “we need to get it done now, and we need to get it done right.”

By “done right,” Genachowski means without meaningful limits on how the agency constructs or oversees the auctions.  The Chairman’s attitude now seems to be if the FCC can’t have complete freedom, it would rather not have incentive auctions at all.  That’s a strange stance given the energy the FCC has expended making the case that such auctions are critical for mobile broadband users.

What’s the fight about?  The House bill would prohibit the agency from introducing bidder qualifications based on external factors, such as current spectrum holdings.  The FCC could not, in other words, directly or indirectly exclude carriers who already have significant spectrum licenses.  The agency would also be limited in its ability to attach special conditions to new licenses issued as part of particular auctions.  An amendment by Rep. Marsha Blackburn (R-Tenn.) that was approved last month would specifically forbid special net neutrality conditions.

This may sound like an inside-the-beltway spat, but the stakes are in fact quite high, going right to the core of what role the FCC should play in 21st century communications.  For the Chairman, these limits rise to the level of an existential crisis, casting doubt on the agency’s very nature as an expert regulator.  Congress should, he argued, authorize the auctions and let the agency’s staff of legal, economic and technical experts decide how best to organize them.  Tying the FCC’s hands by statute, he said, is “a mistake”:

because it preempts an expert agency process that’s fact-based, data-driven and informed by a broad range of economists, technologists and stakeholders on an open record. The proposals on the table to restrict the FCC’s flexibility in its area of technical expertise would be a significant departure from precedent.

 Spectrum- and auction-related issues pose hard questions.  I believe they should be answered based on the evidence, on an open record, as close as possible to the time when they need to be made.

House leaders see it very differently.  They see an agency that badly bungled the recent 700 MHz auctions—the last major auctions the FCC has conducted.   As a pre-condition to bidding, for example, Google demanded “open access” conditions, which the FCC belatedly agreed to add.  Instead of answering “hard” questions based on “facts” and “data” in an open record, the agency simply gave in to pressure from a late and well-connected bidder.

There was no expertise applied here.  And the result, as I’ve noted elsewhere, was that bids for the C block (where the open access conditions were applied) were discounted to the tune of billions of dollars that would otherwise have gone to the Treasury.

Verizon won the auction, but now faces uncertain application of the conditions, which differ materially from the open Internet rules the agency passed last year in the net neutrality rulemaking.  Meanwhile, the mobile marketplace is a very different place than it was when Google first stepped in, dominated by device  and operating system providers and proprietary app stores that didn’t even exist in 2008.

Larger bidders, meanwhile, wary of the vaguely-defined new conditions, shifted to the A and B blocks, pushing out smaller carriers.  Precisely the opposite of the result the agency intended in designing the auctions in the first place.

Politically-driven choices on how the D block should be licensed for public safety turned out even worse.  That auction could not find a bidder willing to live with the FCC’s conditions.  The spectrum sits unused, even as public safety still has no interoperable network more than a decade after 9/11.

If that’s what an “expert” agency does with its “flexibility,” then it’s no wonder House leaders are skeptical.   “Flexibility” should mean maximizing revenues and ensuring that limited and critical spectrum assets are licensed to those who can put them to the best and highest use.  Not trying to stack the deck in favor of some bidders–and still getting it wrong.

Nothing has changed.  The agency still seems determined to use its auction authority to shape mobile broadband competition in its own sclerotic image.  It wants to create a competitive market among carriers even as competition is increasingly driven by other players in the mobile ecosystem.  It wants a return to the failed practice of unbundling to create an abundance of phantom competitors who have no assets and no understanding of communications, created by financial engineers who recognize a good regulatory arbitrage when they see one.

Not so, says the Chairman.  Our view of the market is deeply analytical, the result of thorough technical and economic analysis conducted by the bureaus.  His evidence?  The agency’s annual competition reports.  Or so he told CEA’s Gary Shapiro following his speech, when asked for proof that the agency understands the markets with which it tinkers.

But the competition reports are hardly models of lucid analysis.  They are constrained by the bureaus’ crabbed view of the market, a view required by the statutory requirements that generate the reports.  They continue to emphasize obsolete proxies for measuring competition, including HHIs and the spectrum screen, even as actual data on market conditions is relegated to the back of the report.  For the last two years, the mobile competition report pointedly refused to say whether the agency thought the market was competitive or not.
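
For readers who haven’t run into it, the HHI at least is simple enough to compute in a few lines.  A minimal sketch in Python, with market shares invented purely for illustration (the function name and figures are mine, not the FCC’s):

```python
# A minimal sketch of the Herfindahl-Hirschman Index (HHI), one of the
# concentration proxies the reports lean on: the sum of the squared
# market shares of every firm, expressed in percentage points.
# The shares below are invented; they describe no real market.

def hhi(shares_pct: list[float]) -> float:
    return sum(s * s for s in shares_pct)

market = [35.0, 30.0, 20.0, 15.0]   # four hypothetical carriers
print(hhi(market))                  # 2750.0
# Under the 2010 DOJ/FTC guidelines, an HHI above 2500 labels a market
# "highly concentrated" regardless of prices, output, or investment,
# which is exactly the complaint in the text above.
```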

Yet the agency deliberately forfeited even the limited value of the competition reports by rejecting out-of-hand the AT&T/T-Mobile USA deal.  Rather than focusing on declining prices for voice, text, and data over the last ten years, or the regulatory constraints that make mergers necessary to expand coverage and service (both amply documented in the reports), the staff report on the T-Mobile deal largely swallowed the simplistic mantra of opponents of the deal that taking out one “national” carrier was per se anti-competitive.  The report’s principal objection seemed to be that any horizontal merger of two companies would result in one fewer competitor.  True, but irrelevant.

There was no sign of an expert regulator at work here; nothing to suggest an analysis that was “fact-based, data-driven and informed by a broad range of economists, technologists and stakeholders.”  The analysis started with a conclusion and worked backwards.  And when even the old formulas didn’t come out right, at least in the case of the spectrum screen, the books were simply cooked until they did.

Well, that’s all water under the bridge in 2012.  “This is an incredibly fast-moving space,” the Chairman said of the need for flexibility, “and any policy that pre-judges or predicts the future runs a great risk of unintended and unfortunate consequences.”

That’s a good point.  But it’s also a perfect description of last year’s Net Neutrality rulemaking.  During a year of proceedings, the FCC turned up next to no evidence of an actual problem, let alone a market failure.  Still, the agency stuck doggedly to its first principles, insisting after-the-fact that “prophylactic” rules limiting network management technologies of the future were essential to maintaining a “level playing field.”  Never mind that the playing field showed no signs of imbalance, or that it continued to evolve dramatically (iPhone, iPad, Android and Verizon’s LTE introduction, for starters) as deliberations dragged on in a regulatory vacuum.

One “unintended and unfortunate consequence” of that and similar missteps has already become clear—Congress doesn’t trust the Chairman to follow the law.

Which is, I suspect, the main reason incentive auction authority hasn’t yet passed, even though nearly everyone agrees it’s the best short-term solution to a spectrum crisis of the government’s own making.  And why, when it does come, there are likely to be plenty of strings attached.

Which is too bad.  Because, if the FCC really acted as the expert agency it is chartered to be, Genachowski would be right about the value of flexibility.

The FCC Goes Steampunk https://techliberation.com/2011/12/13/the-fcc-goes-steampunk/ https://techliberation.com/2011/12/13/the-fcc-goes-steampunk/#comments Wed, 14 Dec 2011 02:13:11 +0000 http://techliberation.com/?p=39357

I’ve written several articles in the last few weeks critical of the dangerously unprincipled turn at the Federal Communications Commission toward a quixotic, political agenda.  But as I reflect more broadly on the agency’s behavior over the last few years, I find something deeper and even more disturbing is at work.  The agency’s unreconstructed view of communications, embedded deep in the Communications Act and codified in every one of hundreds of color changes on the spectrum map, has become dangerously anachronistic.

The FCC is required by law to see separate communications technologies delivering specific kinds of content over incompatible channels requiring distinct bands of protected spectrum.  But that world has ceased to exist, and it’s not coming back.  It is as if regulators from the Victorian Age were deciding the future of communications in the 21st century.  The FCC is moving from rogue to steampunk.

With the unprecedented release of the staff’s draft report on the AT&T/T-Mobile merger, a turning point seems to have been reached.  I wrote on CNET  (see “FCC:  Ready for Reform Yet?”) that the clumsy decision to release the draft report without the Commissioners having reviewed or voted on it, for a deal that had been withdrawn, was at the very least ill-timed, coming in the midst of Congressional debate on reforming the agency.  Pending bills in the House and Senate, for example, are especially critical of how the agency has recently handled its reports, records, and merger reviews.  And each new draft of a spectrum auction bill expresses increased concern about giving the agency “flexibility” to define conditions and terms for the auctions.

The release of the draft report, which edges the independent agency that much closer to doing the unconstitutional bidding not of Congress but of the White House, won’t help the agency convince anyone that it can be trusted with any new powers.   Let alone the novel authority to hold voluntary incentive auctions to free up underutilized broadcast spectrum.

What is the Spectrum Screen Really Screening, Anyway?

One particularly disturbing feature of the report was what appears to be a calculated jury-rigging of the spectrum screen, as I wrote in an op-ed for The Hill.  (See “FCC Plays Fast and Loose with the Law…Again”)  For the first time since introducing the test as a way to simplify merger review, the draft report lowers the amount of spectrum the agency believes is available for mobile use, even as technology continues to make more spectrum usable.  The lower total added 82 markets in which the screen would have been triggered, though the staff report in any case never actually performs the analysis of any local market.

The rationale for the adjustment is hidden in a non-public draft of an order on the transfer of Qualcomm’s FLO-TV licenses to AT&T, an order that is only now just circulating among the Commissioners.   Indeed, the Qualcomm order was only circulated a day before the T-Mobile report was released to the public and (in unredacted form) to  the DoJ.

(Keeping draft documents private is the normal course of business at the agency—the T-Mobile report being the rare and disturbing exception of releasing a report before even the Commissioners have reviewed or voted on it, here in obvious hopes of influencing the Justice Department’s antitrust litigation).

In the draft Qualcomm order, according to a footnote in the draft T-Mobile report, agency staff propose a first-time-ever reduction in the total amount of usable spectrum that forms the basis of the screen.  (Under the test, if the total spectrum of the combined entity in a market is less than a third of the usable spectrum, the market is presumed competitive and no analysis is required.)

For purposes of the T-Mobile analysis, the unexplained reduction is assumed to be acceptable to the Commission and applied to calculations of spectrum concentration in each of the local Cellular Market Areas.  (The calculation also assumes AT&T has the pending Qualcomm spectrum.)  Notably, without the reduction the number of local markets in which the screen would be triggered goes down by a third.
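
The mechanics of the screen, and why lowering the denominator matters so much, fit in a few lines of code.  A minimal sketch, with all MHz figures invented purely for illustration (they are not the FCC’s actual totals):

```python
# A minimal sketch of the one-third spectrum screen described above,
# and of why shrinking the "total usable spectrum" denominator flips
# markets from presumptively competitive to flagged. All figures are
# hypothetical.

def screen_triggered(holdings_mhz: float, total_usable_mhz: float) -> bool:
    # Triggered when the combined entity would hold more than one third
    # of the spectrum deemed usable in the market.
    return holdings_mhz > total_usable_mhz / 3

holdings = 95.0                     # hypothetical combined holdings
for total in (300.0, 270.0):        # before and after a lowered total
    print(total, screen_triggered(holdings, total))
# 300.0 False  (threshold 100 MHz: presumed competitive, no review)
# 270.0 True   (threshold 90 MHz: same holdings now trigger the screen)
```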

Asked in a press conference today about the curious manipulation, FCC Chairman Genachowski refused to comment.

The spectrum screen, by the way, never made much sense.  Its gross oversimplification of total usable spectrum, for one thing, hides a ridiculous assumption that all bands of usable spectrum are equally usable, defying the most basic physics of mobile communications.  With a wink to the apples-and-oranges nature of different bands, since 2004 the agency has decided, with little rhyme or reason, to increase the total amount of “usable” spectrum by including some new bands and not others.

The manipulation of the spectrum screen’s coefficients, in fact, has no rationale other than to fast-track some preferred mergers and create regulatory headaches for others.  In truth, a screen that counted all spectrum actually being used for mobile communications, and counted it equally, would suggest that Sprint, in combination with its subsidiary Clearwire, is the only dangerously monopolistic holder of spectrum assets.  As Chart 38 of the FCC’s 15th Annual Mobile Competition Report suggests, Sprint and Clearwire hold more “spectrum” than any other carrier—enough to trigger the screen in most if not all CMAs.  That is, if it were all counted.

That isn’t necessarily the right outcome either.  Much of Clearwire’s spectrum sits in bands above 1 GHz, and, at least for now, those bands are usable but not as attractive for mobile communications as other, lower bands.

As the Mobile Competition Report notes, “these different technical characteristics provide relative advantages for the deployment of spectrum in different frequency bands under certain circumstances. For instance, there is general consensus that the more favorable propagation characteristics of lower frequency spectrum allow for better coverage across larger geographic areas and inside buildings, while higher frequency spectrum may be well suited for adding capacity.”

So not all spectrum is equal after all.  What, then, is the point or usefulness of the screen?  And what of this unmentioned judo move in the staff report, which suddenly changed the point of the screen from a device that simplified merger review to a conclusive presumption against a finding of “public interest”?  The original point of the screen was to quickly eliminate competitive markets that don’t require detailed analysis.  In the AT&T/T-Mobile staff report, for the first time, it’s used to reject a proposed transaction if too many markets (how many is not indicated) are triggered that would require that analysis.

But why continue to compare apples and oranges for any purpose, when the real data on CMA competition is readily available?  The only answer can be that the analysis wouldn’t yield the result that the agency had in mind when it started its review.  For in painstaking detail, the 15th Mobile Competition Report also demonstrates that adoption is up, usage is off the charts, prices for voice, data, and text continue to plummet, investments in infrastructure continue at a dramatic pace despite the economy, and new sources of competitive discipline are proliferating, in the form of device manufacturers, mobile O/S providers, app developers, and inter-modal competitors.  For starters.

To conclude that AT&T’s interest in T-Mobile’s spectrum and physical infrastructure—an effort to overcome the failure of the FCC and local regulators to provide alternative spectrum or to allow infrastructure investments to proceed at an even faster pace—isn’t in the public interest requires the staff to ignore every piece of data the same staff, in another part of the space-time continuum, collected and published.  But so long as HHIs and spectrum concentration are manipulated and relied on to foreclose real analysis, it all makes sense.

A Rogue Agency Slips into Steampunk

That is largely the point of Geoff Manne’s detailed critique of the substance of the report posted here at TLF, and of my own ridiculously long post on Forbes.  (See “A Strategic Plan for the FCC.”)

The Forbes piece tries to put the staff report into the context of on-going calls for agency reform that were working their way through Congress even before the release.  In it, I conclude that the real problem is that even with the significant changes of the Telecommunications Act of 1996, the agency is still operating in a stovepipe model, where different communications technologies (cable, cellular, wire, satellite, “local”) are still regulated separately, with different bureaus and in many cases different regulations.

The model assumes that audio and video programming are different from data communications, offered by different industries using incompatible, single-purpose technologies.  A television is not a phone or a radio or a computer.  Broadcast is only for programming, cellular only for voice, satellites only for industrial use.  Cable is an inconveniently novel form of pay television, and data communications are only for large corporations with mainframe computers.

Those siloed regulations are further fragmented by attaching special regulatory conditions to individual license transfers and individual bands of spectrum as part of auctions.  Dozens of unrelated and seemingly random requirements were added to the Comcast-NBC Universal deal, for example.  At the last minute the agency added an eccentric version of the net neutrality rules to the 2008 auction for 700 MHz spectrum, but only for the C block.

The agency continues to operate under an anachronistic view that distinct technologies support distinct forms of communications (radio, TV, cable, data).  But the world has shifted dramatically under its feet since 1996.  The convergence of nearly all networks to the Internet’s single, non-proprietary model of packet-switched digital networks operating under the TCP/IP protocols has been nothing short of a revolution in communications.  But it’s a revolution the agency sat out.  It has no idea what role it ought to play in the post-apocalyptic world; nor has Congress given it one.

As different kinds of communications technologies have all (or nearly all) converged on IP, communications applications have blurred beyond the ability to distinguish them.  Voice communications are now offered over data networks, data is flowing over the wires, TV is everywhere, and mobile devices that were unimaginable in 1996 now do everything.

Quite simply, the mismatch between the agency’s structure and the reality of a single digital, virtual network treating all content as bits regardless of the technology or the source that transports it has left the agency unable to cope or to regulate rationally.  Consider some of the paradoxes the agency has been forced to wrestle with in recent years:

  • Is Voice over IP to be regulated as a traditional voice service, with barnacled requirements for Universal Service contributions and 911 service applied and, if so, how?
  • Is TV on the Internet, delivered using any and every possible technology including wireless, fiber, copper, and cable, subject to the same Victorian standards of decency as broadcast TV, itself now entirely digital?
  • Is the public interest served when mobile providers combine spectrum and infrastructure assets, largely to overcome the agency’s own paralysis in moving the deeply fractured spectrum map into even the 20th century and the incompetent and corrupt local zoning agencies that hold up applications for new towers and antennae until the proper tribute is rendered?

In the face of these paradoxes, the FCC has become ungrounded; a victim of its own governing statute, which in many respects requires it to remain anachronistic.  Left without clear guidance from Congress on how or whether to regulate what applications (that’s really all we have now—applications, independent of technology), the agency increasingly improvises.

It’s like the wonderful genre known as “steampunk,” where modern technology is projected anachronistically into the past, exploring what life would have been like if the 19th century had robots, flight, information processing, and modern armaments, all powered by the steam engine.  (Steampunk has also become a popular design genre, including some functioning devices wrapped in steampunk elements, as in the photo below.)

[Photo: A Steampunk Computer]

It’s cute on film, but applied to the real world it’s simply dangerous.  The FCC is required by law to keep its head in the sand with respect both to the realities of digital technology and the economics of the modern communications ecosystem.  Yet its natural desire to regulate something leaves the Commission flailing wildly in the dark for a foothold for its ancient regulatory structure in a world it doesn’t inhabit.

The Open Internet Notice of Proposed Rulemaking, for example, asked helplessly in over 80 separate paragraphs for education and updates on the nature of the revolution spurred by the deployment of broadband Internet. (“We seek more detailed comment on the technological capabilities available today, as offered for sale and as actually deployed in providers’ networks.”)  Of course it had to ask these questions – the agency never regulated broadband.  Under the 1996 Act, as the 2005 Brand X case emphasizes, it never could.

Consider just a few of the absurd counterfactuals to which the agency’s steampunk policies have led it in just the last few years (more examples greatly appreciated, by the way):

  • Broadband isn’t being deployed  in a “reasonable and timely fashion” (2011 Section 706 Broadband Report)
  • The mobile communications market is not “effectively competitive” (14th and 15th Mobile Competition Report)
  • High concentrations of customers and spectrum, calculated using rigged HHIs and spectrum screens, are sufficient to raise presumptive antitrust concerns regardless of actual competitive conditions and consumer welfare (AT&T/T-Mobile draft memo)
  • Spectrum suitable for mobile use is decreasing (AT&T/Qualcomm memo)
  • Despite a lack of any examples, broadband providers  “potentially face at least three types of incentives to reduce the current openness of the Internet” (Open Internet order)
  • Encouraging competition and protecting consumer choice “cannot be achieved by preventing only those practices that are demonstrably anticompetitive or harmful to consumers.” (Open Internet order)
  • The agency “expect[s] the costs of compliance with our prophylactic rules to be small” (Open Internet order)
  • Absent a mandatory data roaming regime for mobile broadband, “there will be a significant risk that fewer consumers would have nationwide access to competitive mobile broadband services….”  (Data Roaming order).

Not that there isn’t considerable expertise within the agency, and glimmers of understanding that manage to escape in whiffs from the steam pipes.  The 2010 National Broadband Plan, developed with a great deal of both internal and external agency expertise, does an admirable job of describing the current state of the broadband environment in the U.S.  More impressive, the later chapters predict with considerable vision the application areas that will drive the next decade of broadband deployment and use, including education, employment, health care and the smart grid.

The NBP, unfortunately, is the exception.  More and more of the agency’s reports, orders, and decisions instead bury the expertise, forcing ridiculous conclusions through an implausible lens of nostalgia and distortion.  The agency’s statutorily mandated hold on a never-realistic glorious communications past is increasingly threatening the health of the real communications ecosystem–an even more glorious (largely because unregulated) communications present.

I Love it When a Plan Comes Together

The FCC’s steampunk mentality is threatening to wreak havoc on the natural evolution of the Internet revolution.  It’s also turning the FCC from a respected and Constitutionally-required “independent” agency that answers to Congress and not the White House into a partisan monster, pursuing an agenda that’s light on facts and heavy on the politics of the administration and favored participants in the Internet ecosystem.  The agency relies on clichés and unexamined mantras rather than data—even its own data.  Mergers are bad, edge providers are good, and the agency doesn’t acknowledge that many of the genuine market failures that do exist are creatures of its own stovepipes.

As I note in the long Forbes piece, there was a simple, elegant way to avoid the steampunk phenomenon –an alternative that would have saved the FCC from increased obsolescence and the rest of us from its increasingly bizarre and disruptive regulatory behavior.   And it came from within the walls of FCC headquarters.

In 1999, in the midst of the first great Web boom, then-chairman William Kennard (a Democratic appointee) had a vision for the future of communications that has proven to be entirely accurate.  Kennard created a short, straightforward “strategic plan” for the agency that emphasized breaking down the silos.  It also took a realistic view of the agency’s need and ability to regulate an IP world, encouraging future Chairmen to get out of the way of a revolution that would provide far more benefit to consumers if left to police itself than with an FCC trying to play constant catch-up.

Kennard also proposed dramatic reform of spectrum policy, recognizing as is now obvious that imprinting the agency’s stovepiped model for communications like a tattoo on the radio waves was unnecessarily limiting the uses and usefulness of mobile technology, creating artificial scarcity and, eventually, a crisis.

In just a few pages of the report, the strategic plan lays out an alternative, including flexible allocations that wouldn’t require FCC permission to change uses, market-based mechanisms to ensure allocations moved easily to better and higher uses (no lingering conditions), even the creation of a spectrum inventory (still waiting).  The plan called for incentive systems for spectrum reallocation, an interoperable public safety network, and expanded use of unlicensed spectrum.  All reforms that we’re still violently agreeing need to be made.

We’ve arrived, unfortunately, at precisely the future Kennard hoped to avoid.  And we’re still moving, at accelerating speeds, in precisely the wrong direction.  Instead of working to ease spectrum restrictions and leave the “ecosystem” (the FCC’s own term) to otherwise police itself, recent NPRMs and NOIs suggest an agency determined to leverage its limited broadband authority into as many aspects of the converged world as possible.  As the Free State Foundation’s Seth Cooper recently wrote,  today’s FCC has developed a “proclivity to import legacy regulations into today’s IP world when doing so makes little or no sense.”

Fun’s fun.  I like my steampunk as well as anybody.  But I’d prefer to see it on a mobile broadband device, or over Netflix streamed through my IP-enabled television or game console.  Or anywhere else other than at the FCC.

Stop the Stop Online Piracy Act! https://techliberation.com/2011/11/01/stop-the-stop-online-piracy-act/ https://techliberation.com/2011/11/01/stop-the-stop-online-piracy-act/#comments Tue, 01 Nov 2011 17:31:55 +0000 http://techliberation.com/?p=38900

For CNET today, I have a long analysis and commentary on the “Stop Online Piracy Act,” introduced last week in the House. The bill is advertised as the House’s version of the Senate’s Protect-IP Act, which was voted out of Committee in May.

It’s very hard to find much positive to say about the House version. While there’s considerable evidence its drafters heard the criticisms of engineers, legal academics, entrepreneurs and venture capitalists, their response was unfortunate.

Engineers pointed out, for example, that court orders requiring individual ISPs to remove or redirect domain name requests would be a futile and dangerous way to block access to “rogue” websites. Truly rogue sites can easily relocate to another domain, or simply have users access them by their IP address, bypassing DNS altogether.

There are millions of DNS servers, according to Verisign, so getting all of them to make the change would be impossible without splintering the system. And redirecting DNS requests is in some sense introducing a bug into the system, one that is inconsistent with upcoming security measures (notably DNSSEC) aimed at protecting users from being hijacked.
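
The engineers’ point about bypassing DNS is easy to demonstrate.  A minimal sketch in Python; the helper names are mine, and the domain and IP are standard documentation placeholders, not real sites:

```python
# Why DNS-based blocking is easy to bypass: a site whose domain name
# is blocked or redirected can still be reached directly by its IP
# address, since the connection itself never consults DNS.
import socket

def reachable_by_name(host: str, port: int = 80) -> bool:
    try:
        # This lookup is the step a DNS-blocking order interferes with.
        addr = socket.gethostbyname(host)
        socket.create_connection((addr, port), timeout=3).close()
        return True
    except OSError:
        return False

def reachable_by_ip(ip: str, port: int = 80) -> bool:
    try:
        # No DNS lookup happens here at all.
        socket.create_connection((ip, port), timeout=3).close()
        return True
    except OSError:
        return False

# The name and address below are documentation placeholders. Even if
# every resolver refused to answer for the name, the second call would
# work unchanged for anyone who knows the address.
print(reachable_by_name("rogue-example.invalid"))
print(reachable_by_ip("192.0.2.1"))
```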

But all the drafters of SOPA seemed to have heard was the part about “futile.” Their response has been to make the DNS provisions vaguer and more open-ended, in hopes that whatever mechanisms the rogue sites come up with to evade the law will also be illegal.  Blocking is now extended not just to “parasite” sites but to a “portion thereof,” for example.

And the Attorney General can now apply for injunctive relief against any “entity” that provides “a product or service designed or marketed for the circumvention or bypassing of measures” taken in response to an earlier court order.

Similar efforts are found throughout SOPA, particularly in the felony streaming provision, and the private right of action (or what the bill calls the “market-based system”) for private enforcement of copyright and trademark abuses.  Where clarity isn’t possible, the drafters have opted for vagueness, open-ended definitions, and hedges.  Even the term “including” is defined, to be clear that it means “including but not limited to.”

The point of the criticism of Protect-IP was instead that it is impossible to regulate technology that is changing so quickly, and that any effort to do so would only prove obsolete on arrival.  As previous efforts from CAN-SPAM to ECPA and back make clear, you cannot future-proof legislation aimed at specific features of emerging technologies.

That, unfortunately, is exactly what SOPA tries to do.  And beyond making the legislation clumsy and imprecise, the intentional vagueness greatly increases the potential for unintended consequences.  I describe several unintentionally dangerous examples from SOPA in the CNET piece; other analysts have done the same elsewhere.

Two good things I found in the 79-page draft:

1.  The failure of Protect-IP to define “nonauthoritative domain name server” has been addressed.  That term is now defined, and the definition looks correct to me.

2.  SOPA recognizes, at least, the better approach to solving the problem of foreign websites that blatantly violate copyright and trademark.  Near the back, Section 205 calls on the State and Commerce Departments to make enforcement of existing international law and treaties regarding information products and services a priority.  This includes the assignment of new attachés dedicated to information products.

Had SOPA started and ended with this provision, there would be little basis to fault its drafters.  If the problem SOPA is attempting to solve, after all, is the scourge of foreign websites that distribute movies, music, and counterfeit goods without a license (often pretending to be legitimate), then surely the solution is one of foreign and trade policy, not micromanaging Internet protocols.

Instead, we have a bill that treats all U.S. consumers as guilty until proven innocent, and hands Hollywood the keys to the inner workings of the Internet.  Just what they’ve always wanted.

Net Neutrality goes to Court…Again https://techliberation.com/2011/10/04/net-neutrality-goes-to-court-again/ https://techliberation.com/2011/10/04/net-neutrality-goes-to-court-again/#comments Tue, 04 Oct 2011 15:52:56 +0000 http://techliberation.com/?p=38525

On NPR’s Marketplace this morning, I talk about net neutrality litigation with host John Moe.

Nearly a year after the FCC passed controversial new “Open Internet” rules by a 3-2 vote, the White House finally gave approval for the rules to be published last week, unleashing lawsuits from both supporters and detractors.

The supporters don’t have any hope or expectation of getting a court to make the rules more comprehensive.  So why sue?  When lawsuits challenging federal regulations are filed in multiple appellate courts, a lottery determines which court hears a consolidated appeal.

So lawsuits by net neutrality supporters are a procedural gimmick, an effort to take cases challenging the FCC’s authority out of the D.C. Circuit Court of Appeals, which has already made clear the FCC has no legal basis here.

But Verizon’s lawsuit challenges the rules as material changes to existing licenses for spectrum, a challenge that is exclusive to the D.C. Circuit.  If the D.C. Circuit agrees that the rules can be challenged under that provision of the law, then the case stays in D.C.

Beyond the procedure, the substance of Verizon’s challenge will be formidable.  In the 2010 Comcast case, the court eviscerated the FCC’s argument that various provisions of the Communications Act give them the authority to police broadband providers.

And the Open Internet order largely repeated those arguments, a sure sign that the agency really doesn’t expect to win here.  The December vote was largely symbolic, fulfilling an Obama campaign promise to codify net neutrality and moving the noisy and messy proceeding from the agency to the courts.

The real issue here is convergence.  In 1996, when the Communications Act was last overhauled, the commercial use of the Internet was in its infancy.  Broadcast TV, radio, telephone, cable, mobile and data services were still separate technologies, and the 1996 Act gave the FCC separate and different authority over each.  For the Internet, the agency got next to no authority.

In the 15 years since President Clinton signed the 1996 act, of course, the world of communications has been revolutionized by the Internet and broadband.  The FCC’s traditional regulatory subjects have largely converged onto the TCP/IP protocol, generating a flowering of innovation and new devices and services.  Cable providers are phone companies, phone companies are content providers, and computer companies such as Apple and Google are, well, everything.

Consumers are living in a golden age of communications, but the agency has been left with little to oversee.  Wireline voice has become an unprofitable and shrinking business as consumers cut the cord.  The audience for broadcast TV is getting older and smaller at a rapid pace.  This term, the Supreme Court is likely to slap the agency again for its Victorian sensibilities with regard to TV and radio content censorship.

Perhaps Congress will someday decide that broadband services require the kind of oversight and micromanagement the FCC once had over traditional forms of communication.  Then again, wiser heads may take note of the success of the Internet in a world without much regulation, and decide to leave well enough alone.  Perhaps a great overhaul of communications law will clear the decks altogether, creating a single body of law for the converged industries.

Who can say?  But in the meantime, the FCC can’t simply grant itself new authority to regulate.  Regardless of the sincerity of its belief that “prophylactic” rules to preserve the Open Internet are important, federal agencies can’t regulate without Congressional authorization.  Whether in the courts or in Congress itself, the net neutrality rules will be struck down, first and foremost because the FCC had no power to enact them.

Net neutrality: Doing the Numbers https://techliberation.com/2011/09/26/net-neutrality-doing-the-numbers/ https://techliberation.com/2011/09/26/net-neutrality-doing-the-numbers/#comments Mon, 26 Sep 2011 19:18:14 +0000 http://techliberation.com/?p=38426

For Forbes this morning, I reflect on the publication late last week of the FCC’s “Open Internet” or net neutrality rules and their impact on spectrum auctions past and future.  Hint:  not good.

An important study last year by Prof. Faulhaber and Prof. Farber, former chief economist and chief technologist, respectively, for the FCC, found that the last-minute imposition of net neutrality limits on the 700 MHz “C” block in the FCC’s 2008 auction reduced the winning bid by 60%–costing the Treasury a few billion dollars.

Yet the FCC maintained in the December Report and Order approving similar rules for all broadband providers that the cost impact of these “prophylactic” rules would be minimal, because, after all, they simply endorse practices most providers already follow.  (And the need for the new rules, then, came from where?)

In response to oral and written questions directed at the agency by Congress over the course of the last ten months (while the White House mysteriously held up publication of the new rules), the agency maintained with a straight face that a detailed cost-benefit analysis of the new rules was part of the rulemaking.  But the Chairman seems unable to identify a single paragraph in the majority’s 200-page report where that analysis can be found.

But perhaps bidders in the 2008 auction misjudged the potential negative impact of the conditions on their ability to make the best use of the C block.  Perhaps a 60% reduction in bid price was an overreaction to the neutrality limits.  Perhaps, but not likely.  Already, Verizon, which won the auction and is using the spectrum for its state-of-the-art 4G LTE service, has been hit with a truly frivolous complaint from Free Press over the blocking of tethering software that lets Android phones share their network connection with other devices.

And there were rumblings earlier this year in WIRED that curated app stores would also violate the “no blocking” provision in the C block auction (provisions, recall, that were added at the request of Google as a condition of its participating in the auction).  If that were true, then Verizon could never offer an iPhone on the LTE network.  A definite and pointless limit to the value of the C block…for consumers most of all.

These seem like complaints unlikely to go anywhere, but then again who knows?  Even prevailing in FCC adjudications costs time and money, and creates uncertainty.  Investors don’t like that.  And the new net neutrality rules make complaining even easier, as I noted earlier this year.

So the impact of the net neutrality rules, should they survive Congressional and legal challenges, will be to reduce incentives for broadband carriers to continue investing in their networks.  It won’t stop them, obviously.  But it will surely slow them down.  By how much?  Well, as much as 60%, apparently.  And given that the major facilities-based carriers spend around $20 billion a year in network investments, even a few percentage points of uncertainty translate into real losses.
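
A back-of-the-envelope illustration, with the percentage chosen purely hypothetically:

```latex
% If regulatory uncertainty shaves even 5% (a purely hypothetical
% figure) off roughly $20 billion in annual network investment:
\[
0.05 \times \$20\ \text{billion} = \$1\ \text{billion per year in forgone investment.}
\]
```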

Balanced out by which benefits, exactly?  Oh right–these are “prophylactic” rules.  So the benefits aren’t knowable.  Until the future.  Maybe.

If reduced investment weren’t a bad enough result, there’s a deeper and more disturbing lesson in last year’s Net Neutrality free-for-all.  The FCC, an “expert agency,” has become increasingly political.  Its experts are being run over by operatives inside and outside the agency with an agenda that lies outside the agency’s expertise, trumping the traditional independent values of weighing costs and benefits and of applying scarce resources to their best and highest use.

That may be one reason Congress has yet to move forward with pending legislation granting the agency authority to conduct Voluntary Incentive Auctions, and why the draft legislation tries to curb the flexibility the agency has if it does get the new authority.

Flexibility, after all, cost the taxpayers a small fortune in the 2008 auction.  And it led to conditions being placed on the license that aren’t helping anyone, and which may keep consumers from getting what all but a few loudmouths genuinely value.

A rulemaking whose goal was to “preserve” the Open Internet may wind up having the opposite result.  The joke, unfortunately, is on mobile users.

iPhone, Android and the Rest: at the Mercy of Local Zoning Boards https://techliberation.com/2011/09/08/iphone-android-and-the-rest-at-the-mercy-of-local-zoning-boards/ https://techliberation.com/2011/09/08/iphone-android-and-the-rest-at-the-mercy-of-local-zoning-boards/#comments Thu, 08 Sep 2011 16:09:55 +0000 http://techliberation.com/?p=38268

For CNET this morning, I have a long article reviewing the sad recent history of how local governments determine the quality of mobile services.

As it turns out, the correlation is deeply negative.  In places with the highest levels of user complaints (San Francisco, Washington, D.C.), endless delays or outright denials of applications to add towers and other sites, as well as new and upgraded equipment, are also the norm.  Who’d have thought?

Despite a late 2009 ruling by the FCC that put a modest “shot clock” on local governments to approve or deny applications, data from CTIA and PCIA included in recent comments on the FCC’s Broadband Acceleration NOI suggests the clock has had little to no effect.  This is in part because the few courts that have been asked to enforce it have demurred or refused.

Much of the dithering by local zoning boards is unprincipled and pointless, a sign not so much of legitimate concerns over safety and aesthetics as of incompetence, corruption, and the insidious influence of outside “consultants” whose fees are often levied against the applicant, adding insult to injury.

For example, in El Cerrito, CA, about a mile from my house, officials sat for two years on an application to site a tower disguised as a tree at a Boy Scout camp, then passed a two-year moratorium on any new facilities. (I know that camp well–it sits in the midst of a giant chain of parks that runs the ridgeline of the Berkeley Hills, thick with invasive, non-native trees that have an unfortunate tendency to explode during fire season.) In Berkeley, CA, where I live, even applications to collocate new antennae on existing towers require a full review and hearing.

Other city and county boards simply delay or deny, or introduce bizarre requirements, including that any new equipment must be shown to benefit only residents of the jurisdiction.

The “shot clock” rule also banned a common practice among many communities of denying any application for new equipment if an existing mobile provider already served the area.  Yes, that’s right.  With all the hand-wringing and crocodile tears over mobile competition and the danger of the AT&T/T-Mobile merger, many parts of the U.S. prohibit new competitors from entering.

Some communities are still enforcing that rule, and the few court cases that have interpreted the FCC ruling haven’t always embraced it.

Why does this matter?  There are two principal inputs to a cellular network that determine quality of service for customers:  spectrum and cell sites.  Both are under the thumb of government control and constraint.  (Geoff Manne’s recent rant on spectrum is well worth reviewing.)  Over the last five years, the four major providers have invested billions in new infrastructure, and would have invested more, as the FCC acknowledges, were it not for the interference of local governments.   In 2009 alone, over $20 billion was invested, representing 13% of total industry revenue.

 

[Chart: Capital Expenditure by Carrier. Source: Federal Communications Commission]

If service is poor in some parts of the country, we have only ourselves to blame. But as one commenter on my article put it, it’s so much more fun to blame the device or the carrier.

Or, far less funny, to take a “principled” stand on behalf of competition by blocking a merger designed in part to evade these increasingly dangerous roadblocks.

Why Silicon Valley should fear U.S. v AT&T https://techliberation.com/2011/09/01/why-silicon-valley-should-fear-u-s-v-att/ https://techliberation.com/2011/09/01/why-silicon-valley-should-fear-u-s-v-att/#comments Thu, 01 Sep 2011 07:51:23 +0000 http://techliberation.com/?p=38218

On Forbes this morning, I argue that the Department of Justice’s effort to block the AT&T/T-Mobile merger signals a dangerous turn in antitrust enforcement.

While President Obama promised during his campaign to “reinvigorate” antitrust, few expected the agency would turn its attention with such laser-like precision to the technology sector, one of the few bright spots in the economy. But as Comcast, Google, Intel, Oracle and now AT&T can testify, the agency seems determined to make its mark on the digital economy. If only it had the slightest idea how that economy actually worked, and why it works so well.

Silicon Valley should take careful note of the dark turn in the agency’s view of what constitutes competitive harm. But if experience is any guide, it probably won’t. The tech community believes that if it ignores Washington, Washington isn’t really there, and it explains away contrary evidence as random catastrophe, as unpredictable as an earthquake in Virginia.

Regardless of how this case resolves itself, that’s increasingly a dangerous attitude for entrepreneurs, venture capitalists, and tech leaders.  It’s morning in Palo Alto.  But is anyone awake?

A Few Edits to Protect IP https://techliberation.com/2011/08/17/a-few-edits-to-protect-ip/ https://techliberation.com/2011/08/17/a-few-edits-to-protect-ip/#comments Wed, 17 Aug 2011 17:04:30 +0000 http://techliberation.com/?p=38093

For CNET this morning, I offer five crucial corrections to the Protect IP Act, which was passed out of committee in the Senate back in May.

Yesterday, Rep. Bob Goodlatte, co-chair of the Congressional Internet Caucus, told a Silicon Valley audience that the House was working on its own version and would introduce it in the next few weeks.

Protect IP would extend efforts to combat copyright infringement and trademark abuse online, especially by websites registered outside the U.S.

Since Goodlatte promised the new bill would be “quite different” from the Senate version, I thought it a good time to get out my red pen and start crossing off the worst mistakes in policy and in drafting in Protect IP.

The full details are in the article, but in brief, here’s what I hope the House does in its version:

  1. Drop provisions that tamper with the domain name system (DNS) in an effort to block U.S. access to banned sites.
  2. Drop provisions that tamper with search engines, indices, and any other linkage to banned sites.
  3. Remove a private right of action that would allow copyright and trademark holders to obtain court orders banning ad networks and financial transaction processors from doing business with banned sites.
  4. Scale back current enforcement abuses by the Department of Homeland Security under the existing PRO-IP Act of 2008.
  5. Focus the vague and overinclusive definition of the kind of websites that can be banned, limiting it to truly criminal enterprises.

As I’ve written elsewhere, the Senate version was in some ways even worse than last year’s COICA bill. It would impose significant costs on innocent Internet users, with no corresponding benefits to anyone, including rightsholders.

The best thing the House could do would be to ignore this dud and work instead on reforming the broken copyright system. That would do the most to correct the imbalance between endless copyright terms and a shrinking public domain, eliminating much of the incentive for infringement that exists today.

But short of that, I hope at least that the most dangerous provisions are removed.

Net neutrality: the disaster that keeps on giving https://techliberation.com/2011/07/29/net-neutrality-the-disaster-that-keeps-on-giving/ https://techliberation.com/2011/07/29/net-neutrality-the-disaster-that-keeps-on-giving/#comments Fri, 29 Jul 2011 15:46:25 +0000 http://techliberation.com/?p=37933

On CNET this morning, I argue that delay in approving FCC authority for voluntary incentive auctions is largely the fault of last year’s embarrassing net neutrality rulemaking.

While most of the public advocates and many of the industry participants have moved on to other proxy battles (which for most was all net neutrality ever was), Congress has remained steadfast in expressing its great displeasure with the Commission and how it conducted itself for most of 2010.

In the teeth of strong and often bi-partisan opposition, the Commission granted itself new jurisdiction over broadband Internet on Christmas Eve last year.  Understandably, many in Congress are outraged by Chairman Julius Genachowski’s chutzpah.

So now the equation is simple:  while the Open Internet rules remain on the books, Congress is unlikely to give the Chairman any new powers.

House Oversight Committee Chairman Darrell Issa has made the connection explicit, telling reporters in April that incentive auction authority will not come while net neutrality hangs in the air.  There’s plenty of indirect evidence as well.

The linkage came even more sharply into focus as I was writing the article. On Tuesday, Illinois Senator Mark Kirk offered an amendment to Sen. Reid’s budget proposal that would have prohibited the FCC from attaching neutrality conditions to incentive auctions. On Wednesday, Sen. Dean Heller wrote a second letter to the Chairman, this one signed by several of his colleagues, encouraging the Commission to follow President Obama’s advice and weigh the costs and benefits of the Open Internet rules before implementing them.

Yesterday, key House Committee chairmen initiated an investigation into the process of the rulemaking, raising allegations of improper collusion between the White House and the agency, and of a too-cozy relationship between some advocacy groups and members of the Commission.

All this for rules that have yet to take effect, and which face formidable legal challenges.

The Chairman needs a political solution to a problem largely of his own creation. But so far, there’s been little indication that either the FCC or the White House understands the nature of the challenge. This year, we’ve had a steady drumbeat of spectrum crisis-mongering, backed up by logical policy and economic arguments in favor of the VIAs.

While some well-respected economists aren’t convinced VIAs are the best solution to a long history of spectrum mismanagement, for the most part the business case has been made.  But the FCC keeps making it anyway.

Meanwhile, the net neutrality problem isn’t going away.  Mobile users are enjoying their endless wireless Woodstock summer, marching exuberantly toward oblivion, faster and in greater numbers all the time.

Silicon Valley better save us.  Because the FCC, good intentions aside, isn’t even working on the right problem.

 

FCC Mobile Competition Report Is One Green Light for AT&T/T-Mobile Deal https://techliberation.com/2011/07/12/fcc-mobile-competition-report-is-one-green-light-for-attt-mobile-deal/ https://techliberation.com/2011/07/12/fcc-mobile-competition-report-is-one-green-light-for-attt-mobile-deal/#comments Tue, 12 Jul 2011 20:37:26 +0000 http://techliberation.com/?p=37802

By Larry Downes & Geoffrey A. Manne. Published in BNA’s Daily Report for Executives.

The FCC published in June its annual report on the state of competition in the mobile services marketplace. Under ordinary circumstances, this 300-plus page tome would sit quietly on the shelf, since, like last year’s report, it ‘‘makes no formal finding as to whether there is, or is not, effective competition in the industry.’’

But these are not ordinary circumstances. Thanks to innovations including new smartphones and tablet computers, application (app) stores and the mania for games such as ‘‘Angry Birds,’’ the mobile industry is perhaps the only sector of the economy where consumer demand is growing explosively.

Meanwhile, the pending merger between AT&T and T-Mobile USA, valued at more than $39 billion, has the potential to accelerate development of the mobile ecosystem. All eyes, including many in Congress, are on the FCC and the Department of Justice. Their review of the deal could take the rest of the year. So the FCC’s refusal to make a definitive finding on the competitive state of the industry has left analysts poring through the report, reading the tea leaves for clues as to how the FCC will evaluate the proposed merger.

Make no mistake: this is some seriously expensive tea. If the deal is rejected, AT&T is reported to have agreed to pay T-Mobile $3 billion in cash for its troubles. Some competitors, notably Sprint, have declared full-scale war, marshaling an army of interest groups and friendly journalists.

But the deal makes good economic sense for consumers. Most important, T-Mobile’s spectrum assets will allow AT&T to roll out a second national 4G LTE (Long Term Evolution) network to compete with Verizon’s, and expand service to rural customers. (Currently, only 38 percent of rural customers have three or more choices for mobile broadband.)

More to the point, the government has no legal basis for turning down the deal based on its antitrust review. Under the law, the FCC must approve AT&T’s bid to buy T-Mobile USA unless the agency can prove the transaction is not ‘‘in the public interest.’’ While the FCC’s public interest standard is famously undefined, the agency typically balances the benefits of the deal against potential harm to consumers. If the benefits outweigh the harms, the Commission must approve.

The benefits are there, and the harms are few. Though the FCC refuses to acknowledge it explicitly, the report’s impressive detail amply supports what everyone already knows: falling prices, improved quality, dynamic competition and unflagging innovation have led to a golden age of mobile services. Indeed, the three main themes of the report all support AT&T’s contention that competition will thrive and the public’s interests will be well served by combining with T-Mobile.

1. Mobile Service: Rare Bright Spot in Recession

Demand for mobile services is soaring. The FCC reports 274 million mobile subscribers in 2009, up almost 5 percent from the previous year. The number of mobile internet subscribers, the fastest-growing category, doubled between 2008 and 2009. By late 2010, 41 percent of new mobile phone purchases were for smartphones. More than 9 billion apps had been downloaded by the end of 2010.

Despite poor economic conditions elsewhere, new infrastructure investment continues at a frenzied clip. Between 1999 and 2009, industrywide investment exceeded $213 billion. In 2009 alone, investments topped $20 billion—almost 15 percent of total industry revenue. Of the leading providers, only Sprint decreased its investments in recent years.

Yet unlike virtually every other commodity, prices for mobile services continue to decline across the board, hardly a sign of flagging competition. The price of mobile voice services, the FCC reports, has ‘‘declined dramatically over the past 17 years,’’ falling 9 percent from 2008 to 2009 alone. (The average price for a voice minute is now 4 cents in the U.S., compared with 16 cents in Western Europe.) Text prices fell 25 percent in 2009. And the price per megabyte of data traffic fell roughly sevenfold between 2008 and 2010, from $1.21 to 17 cents.
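
That last figure is easy to sanity-check. A quick sketch in Python, whose only inputs are the two prices and the two-year span just quoted:

    # Sanity check on the data-pricing decline quoted above.
    p_2008, p_2010 = 1.21, 0.17  # price per megabyte of mobile data, in dollars
    ratio = p_2008 / p_2010  # ~7.1x -- the "sevenfold" drop
    annual_decline = 1 - (p_2010 / p_2008) ** 0.5  # compound rate over two years
    print(f"fell {ratio:.1f}x; ~{annual_decline:.0%} per year")

The arithmetic checks out: a sevenfold drop over two years works out to prices falling roughly 63 percent per year, compounded.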

2. Mobile Competition Is Robust and Dynamic

The FCC, recognizing the dynamism of the mobile services industry, is moving away from simplistic tools the agency once used to evaluate industry competitiveness. The report repeatedly de-emphasizes the Herfindahl-Hirschman Index, or HHI concentration index, which tends to understate competition. The report also downplays the value of ‘‘spectrum screens’’ that once limited a single provider to one-third of the total spectrum in a given market.
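
For readers who haven’t met it, the HHI is simply the sum of the squared market shares of every firm in a market, measured in percentage points, running from near 0 (perfect fragmentation) to 10,000 (monopoly). A minimal sketch in Python, with hypothetical shares chosen purely for illustration (not data from the report), shows how mechanical the index is:

    def hhi(shares_pct):
        """Herfindahl-Hirschman Index: sum of squared market shares, in points."""
        return sum(s * s for s in shares_pct)

    # Five equal competitors -- roughly the choice the report says nearly 90%
    # of consumers have for mobile voice (shares are hypothetical):
    print(hhi([20, 20, 20, 20, 20]))  # 2000
    # Monopoly, for scale:
    print(hhi([100]))  # 10000

Even a market of five equal rivals scores 2,000, inside the ‘‘moderately concentrated’’ band (1,500 to 2,500) of the 2010 DOJ/FTC merger guidelines. A static index that flags a five-provider market as concentrated is a poor proxy for the dynamic competition the report describes.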

Now, the commission says, its evaluation is based on real-world conditions, and looks at competition mostly at the local level. That makes sense. ‘‘Consumers generally search for service providers in the local areas where they live, work, and travel,’’ according to the report, ‘‘and are unlikely to search for providers that do not serve their local areas.’’

Looking at all 172 local markets individually, the FCC found ample evidence of vibrant competition. For mobile voice services, for example, nearly 90 percent of consumers have a choice of five or more providers. In 2010, almost 68 percent of U.S. consumers had four or more mobile broadband providers to choose from, a significant increase over 2009.

Competition between different kinds of wireless service (cellular, PCS, WiFi, and WiMax) is also increasing, and a wider range of the radio spectrum is now being included in the FCC’s analysis. Competition between mobile and traditional wireline service is growing in significance. More and more consumers are even ‘‘cutting the cord:’’ By the beginning of 2010, 25 percent of all households had no wireline service, up from 2 percent in 2003.

And competition within the mobile services marketplace, the Commission recognizes, is increasingly being driven not by the carriers but by new devices, applications and services. From 2008-2009, the FCC found that 38 percent of those who had switched carriers did so because it was the only way to obtain the particular handset that they wanted.

There are dozens of handsets to choose from, and no dominant provider among smartphone operating systems or device manufacturers. New entrants can and do thrive: handsets running Google’s Android operating system rose from 5 percent of the total market at the end of 2009 to almost 20 percent by mid-2010.

3. If There Is a Problem, It Is Government

As consumers continue to embrace new mobile technologies and services, pressure is building on existing networks and the limited radio spectrum available to them. The risk of future network overload is serious—the one dark cloud hanging over the mobile industry’s abundant sunshine. According to the report, ‘‘mobile broadband growth is likely to outpace the ability of technology and network improvements to keep up by an estimated factor of three.’’

The FCC sees a ‘‘spectrum deficit’’ of 300 megahertz within five years. But the FCC and Congress have made little progress over the last two years to free up underutilized spectrum in both public and private hands. Auctions for available spectrum in the valuable 700 Mhz. band are tied up in political fights over a public safety network. Spectrum held by over-the-air television broadcasters is idling as Congress debates ‘‘incentive’’ auctions that would share proceeds between the broadcasters and the government.

Improving coverage by modifying or adding cell towers, the commission finds, is subject to considerable delay at the local level. Of 3,300 zoning applications for wireless facilities pending in 2009, nearly 25 percent had been idling for more than a year. Some had been languishing for more than three years, despite an FCC requirement that applications be decided within 150 days at the most.

Combining the spectrum assets of AT&T and T-Mobile would go a long way toward limiting the potentially catastrophic effect of ‘‘spectrum deficit.’’ AT&T plans to move T-Mobile 3G customers to its existing network and integrate T-Mobile’s existing physical infrastructure, improving 3G service and freeing up valuable spectrum to launch a new nationwide 4G LTE network. As the report notes, T-Mobile had no plans to ever launch true 4G service and, given its limited spectrum holdings, probably never could.

As part of its public interest analysis, the FCC will have to take these and other regulatory constraints to heart.

To Reality . . . and Beyond!

Reading the entire report, it’s clear that the FCC recognizes, as it must, that, even with the exit of T-Mobile from the U.S. market, mobile services would be anything but a ‘‘duopoly’’—either at the national level or at the local level, which is where it counts.

Competition is being driven by multiple local competitors, competing technologies, and handset and software providers. Federal, state and local governments all play an active role in overseeing the industry, which even the FCC now sees as the only serious constraint on future growth.

In Silicon Valley, if not inside the Beltway, consumers are understood to be the real drivers of the mobile services ecosystem—the true market-makers. Maybe that’s why the report found that the vast majority of U.S. consumers report being ‘‘very satisfied’’ with their mobile service.

It is a relief to see the FCC looking carefully at real data and coming to realistic conclusions, as it does throughout the report. Let’s hope reality continues its reign during the long AT&T/T-Mobile review and beyond, as this dynamic industry continues to evolve.

Reproduced with permission from Daily Report for Executives, July 11, 2011. Copyright 2011 The Bureau of National Affairs, Inc. (800-372-1033) www.bna.com.

Brown v EMA and net neutrality? https://techliberation.com/2011/06/28/brown-v-ema-and-net-neutrality/ https://techliberation.com/2011/06/28/brown-v-ema-and-net-neutrality/#comments Tue, 28 Jun 2011 05:10:22 +0000 http://techliberation.com/?p=37539

John Perry Barlow famously said that in cyberspace, the First Amendment is just a local ordinance.  That’s still true, of course, and worth remembering.  But at least today there is good news in the shire.  The local ordinance still applies with full force, if only locally.

As I write in CNET this evening (see “Video Games Given Full First Amendment Protection“), the U.S. Supreme Court issued a strong and clear opinion today nullifying California’s 2005 law prohibiting the sale or rental to minors of what the state deemed “violent video games.”

The 7-2 decision in Brown v. EMA follows last week’s decision in Sorrell, which also addressed the role of the First Amendment in the digital economy. Sorrell dealt with a Vermont law that banned the mining of pharmacy prescription records for marketing purposes. That activity, the Court held, was also protected speech.

The CNET article is quite long (duh), and I’ll let it speak for itself.  There is also excellent commentary on both decisions from Adam Thierer and Berin Szoka here at the Technology Liberation Front.  Adam and Berin submitted an amicus brief in the EMA case that closely tracked the Court’s opinion, which in fact quoted from another amicus brief from the Cato Institute.  Berin also contributed a brief in the Sorrell case, again on the winning side.

Perhaps the most interesting commentary on today’s decision, however, comes from Prof. Susan Crawford.  Prof. Crawford’s blog on EMA notes that an important feature of the majority decision (written by Justice Scalia and joined by Justices Kennedy, Ginsburg, Sotomayor and Kagan) is what she calls the “absolute” view it takes of speech.  Crawford writes of Scalia’s opinion:

“Whether government regulation applies to creating, distributing, or consuming speech makes no difference,” he says in response to Justice Alito’s attempt to say that sale/rental is different from “creation” or “possession” of particular speech.

That view is absolute in the sense that it does not distinguish between different stages of the supply chain of information provisioning.  The “speaker,” for First Amendment purposes, is not only the author of the content, but also distributors, retailers, and consumers.  Each is equally protected by the First Amendment’s prohibition on government interference, whether that interference is a ban on certain content (violent video games) or a requirement to promote it (must-carry rules for cable).

Why does this matter? Though I have written and testified extensively about the FCC’s December 2010 “Open Internet” order, I have so far avoided discussing a possible First Amendment challenge. Frankly, I hadn’t initially thought it the strongest available argument against the legality of the rules.

But Prof. Crawford, a strong advocate for “net neutrality” in general, reads EMA as adding support to such an argument:

Today’s opinion may further strengthen the carriers’ arguments that any nondiscrimination requirement imposed on them should be struck down.  Although a nondiscrimination requirement arguably promotes speech rather than proscribes it, the long-ago Turner case on “must-carry” obligations for cable already suggested that the valence of the requirement doesn’t really matter.

If challengers to the Open Internet order (a list that today grew to include the State of Virginia, now waiting in the wings to file suit) can convince a court that rules requiring nondiscriminatory treatment of packets effectively compel carriers to speak, the rules would be seen as content-based. Under EMA and last year’s decision in Stevens, such a rule could fail a First Amendment challenge.

It’s an interesting argument, to say the least.  I think I’ll give it a little more thought.

Spectrum Reform Now! https://techliberation.com/2011/06/12/spectrum-reform-now/ https://techliberation.com/2011/06/12/spectrum-reform-now/#comments Sun, 12 Jun 2011 23:59:56 +0000 http://techliberation.com/?p=37303

Last week the Senate Commerce Committee passed–with deep bi-partisan support–the Public Safety Spectrum and Wireless Innovation Act.

The bill, co-sponsored by Committee Chairman Jay Rockefeller and Ranking Member Kay Bailey Hutchison, is a comprehensive effort to resolve several long-standing stalemates and impending crises having to do with one of the most critical 21st century resources: radio spectrum.

My analysis of the bill appears today on CNET. See “Spectrum reform, public safety network move forward in Senate.”

The proposed legislation is impressive in scope; it offers new and in some cases novel solutions to more than half-a-dozen spectrum-related problems, including:

  1. Voluntary incentive auctions – The bill authorizes the FCC to coordinate “voluntary incentive auctions” (VIA) of under-utilized spectrum from over-the-air TV broadcasters to better uses, including mobile broadband. Broadcasters giving up some or all of their licensed spectrum would share the proceeds with the government. The FCC has been asking for this authority for two years.

  2. Public safety network – The bill would break the logjam over the long-desired nationwide interoperable public safety network. It would create a new non-profit public-private partnership to build the network, with an outright grant of the D-block of 700 Mhz. spectrum. (That block, freed up as part of the 2009 transition to digital TV, has sat idle since a failed auction in 2008.) Financing for the build-out would come from proceeds of the VIAs. The public safety network has been in limbo since it was first proposed soon after 9/11. (The proposed bill is S. 911.)

  3. Spectrum inventory – The FCC would be required to complete a comprehensive inventory of existing licenses (which, amazingly, doesn’t exist) within 180 days. President Obama ordered the agency to complete the inventory over a year ago, but so far only a “baseline” inventory has been created.

  4. Secondary markets – The FCC would be required to begin a rulemaking to review current limits to secondary spectrum markets that interfere with liquidity, in the hopes of making them more robust. (VIAs could take years to organize and conduct.)

  5. Public spectrum – The National Telecommunications and Information Administration would be required to identify significant blocks of underutilized federal spectrum allocations and make them available for auction by the FCC.

  6. Spectrum innovation – The National Science Foundation and other grant-making agencies would be required to accelerate research grants for new technologies that would make spectrum use more efficient.

  7. Repacking – While the FCC can’t require broadcasters to participate in VIAs, it can force them to move to nearby channels if doing so would free up more valuable blocks of spectrum for auction. A fund would be created to compensate stations for the disruption of switching channels.

The range of issues that S. 911 deals with suggests the breadth of the current spectrum crisis. Here it is in a nutshell. Radio frequencies are a limited public resource. Up until recently, however, there’s been more than enough to go around. Following the advice of Nobel Prize-winning economist Ronald A. Coase, the FCC has used auctions to find the best and highest use for this resource, generating significant revenue in the process.

But the digital age has changed the dynamics of spectrum. Mobile uses are exploding, as are mobile devices, mobile applications, mobile users and mobile everything else. Moore’s Law is rapidly overtaking FCC law once again. Existing wireless networks are groaning under the strain of volume that has increased 8000% since the launch of the iPhone.
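
To put that 8000% in perspective: it means traffic is now roughly 81 times its pre-iPhone level. A quick sketch in Python (the four-year window, from the mid-2007 iPhone launch to mid-2011, is my assumption; the post doesn’t date the figure precisely):

    # An 8000% increase means traffic is 81x its starting level (1 + 80).
    growth = 1 + 8000 / 100  # 81x
    years = 4  # mid-2007 iPhone launch to mid-2011 (assumed)
    cagr = growth ** (1 / years) - 1
    print(f"{growth:.0f}x overall; ~{cagr:.0%} compound annual growth")

Traffic roughly tripling every year, in other words, while a major spectrum reallocation can take the better part of a decade.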

Last year’s National Broadband Plan, for example, predicted that 300 Mhz. of additional spectrum would need to be found in the next five years to keep mobile broadband on track.

But the government’s current processes of finding and allocating more spectrum are simply too slow to keep pace with the current wave of technological innovation. It will get worse as 3G moves to 4G and from there–well, who knows? All we can safely predict is that the “G”s will keep coming, and arrive faster all the time. So radical re-thinking of spectrum management is urgent. We need serious spectrum policy reform, and we need it yesterday.

Part of the solution will come from technology itself, including innovation to make more efficient use of existing allocations, expanding the range of usable spectrum for more uses, capabilities to dynamically share spectrum and rebalance loads, and so on. There are impressive developments in these and other strategies for coping with the potential of spectrum exhaustion, but no one can say with confidence that the solutions will outpace the problems.

The bigger issue underlying spectrum exhaustion is the glacial pace with which current regulatory systems work to rebalance allocations.

Once a license is granted, the licensee can largely rely on keeping it indefinitely. If they operate in a stable or shrinking market (such as over-the-air broadcast, which the Consumer Electronics Association said recently has shrunk to only 8% of U.S. households), there’s no incentive to optimize the property, which, for the licensee, is a sunk cost.

Given the limits of secondary markets, there’s also little incentive to find more efficient uses of the allocation and free up spectrum that is no longer needed for its licensed purpose. Indeed, even for operators who want to exit the market in part or in whole, use limitations on existing allocations make transfer through secondary markets cumbersome if not impossible.

Even if the FCC unblocks these markets, game theory problems may constrain the effectiveness of either the VIAs or the secondary markets.

Federal users, of course, feel no competitive threat to optimize their allocations, and fall back to the conversation-ending “national defense” excuse whenever the possibility emerges of giving up some of the frequencies they are warehousing.

And then there are state and local authorities, who also share jurisdiction over communications. Limits on cell tower construction, use, and other technical improvements aren’t addressed in the proposed legislation. But they are equally to blame for the crisis mentality.

S. 911 is a good start toward removing some of the institutional barriers that limit our flexibility in rebalancing spectrum needs and spectrum allocations. But it’s only a start. If the information revolution is to continue uninterrupted, we need a lot more improvements.

And soon.
