May 2013

“. . . the cooperative process envisioned by the National Broadband Plan is at risk of shifting to the traditionally contentious band plan process that has delayed spectrum auctions in the past.”

The National Broadband Plan proposed a new way to reassign reallocated spectrum. The Plan noted that, “Contentious spectrum proceedings can be time-consuming, sometimes taking many years to resolve, and incurring significant opportunity costs.” It proposed “shifting [this] contentious process to a cooperative one” to “accelerate productive use of encumbered spectrum” by “motivating existing licensees to voluntarily clear spectrum through incentive auctions.” Congress implemented this recommendation through legislation requiring the FCC to transition additional broadcast spectrum to mobile use through a voluntary incentive auction process rather than traditional FCC mandates.

Among other things, the FCC’s Notice of Proposed Rulemaking initiating the broadcast incentive auction proceeding proposed a “lead” band plan approach and several alternative options, including the “down from 51” approach. An overwhelming majority of broadcasters, wireless providers, equipment manufacturers, and consumer groups rejected the “lead” approach and endorsed the alternative “down from 51” approach. This remarkably broad consensus on the basic approach to the band plan promised to meet the goals of the National Broadband Plan by accelerating the proceeding and motivating voluntary participation in the auction.

That promise was broken when the FCC’s Wireless Bureau unilaterally decided to issue a Public Notice seeking additional comment on a variation of the FCC’s “lead” proposal as well as a TDD approach to the band plan. The Bureau issued this notice over the objection of FCC Commissioner Ajit Pai, who issued a separate statement expressing his concern that seeking comment on additional approaches to the band plan when there is a “growing consensus” in favor of the “down from 51” approach could unnecessarily delay the incentive auction. This statement “peeved” Harold Feld, Senior Vice President at Public Knowledge, who declared that there is no consensus and that the “down from 51” plan would be a “disaster.” As a result, the cooperative process envisioned by the National Broadband Plan is at risk of shifting to the traditionally contentious band plan process that has delayed spectrum auctions in the past. Continue reading →

In mid-April, the Federal Trade Commission (FTC) requested comments regarding “the consumer privacy and security issues posed by the growing connectivity of consumer devices, such as cars, appliances, and medical devices” or the so-called “Internet of Things.” This is in anticipation of a November 21 public workshop that the FTC will be hosting on the same issue.

These issues are finally starting to catch the attention of the public and policymakers alike with the rise of wearable computing, remote home automation and monitoring technologies, smart grids, autonomous vehicles and intelligent traffic systems, and so on. The Internet of Things represents the next great wave of Internet innovation, but it also represents the next great battleground in the field of Internet policy.

I filed comments with the FTC today in this proceeding and made a few simple points about why the agency should proceed cautiously here. A summary of my filing follows. Continue reading →

This week at CTIA 2013, FCC Commissioner Jessica Rosenworcel presented ten ideas for spectrum policy. Though I don’t agree with all of them, she articulated a reasonable vision for spectrum policy that prioritizes consumer demand, incorporates market-oriented solutions, and establishes transparent goals and timelines. Commissioner Rosenworcel’s principled approach stands in stark contrast to the intellectually bankrupt incentive auction recommendation offered by the Department of Justice last month. Continue reading →

My colleague Eli Dourado brought this XKCD comic to my attention, and when tweeting it out yesterday he commented that “Half of tech policy is dealing with these people”:

The comic and Eli’s comment may be a bit snarky, but something about them rang true to me. While conducting research on the impact of new information technologies on society, I often come across books, columns, blog posts, editorials, and tweets that can basically be summed up with the line from that comic: “we should stop to consider the consequences of [this new technology] before we …” Equally common is the line: “we need to have a conversation about [this new technology] before we…”

But what does that really mean? Certainly “having a conversation” about the impact of a new technology on society is important. But what is the nature of that “conversation”? How is it conducted? How do we know when it is going on or when it is over? Continue reading →

Just FYI… The American Enterprise Institute (AEI) is looking for a full-time Program Manager for its new project focused on Internet, communications, and technology policy. The job description can be found online here and is pasted below:

# # #

The American Enterprise Institute seeks a full-time Program Manager for its new project focused on Internet, communications, and technology policy.

This project will advance policies to encourage innovation, competition, liberty, and growth, creating a positive agenda centered on the political economy of creative destruction. The Program Manager will work closely with the Program Director in the development and day-to-day management of the project; conducting research; developing a new blog website; commissioning monographs and reports; and coordinating events.

Additionally, the Research Program Manager is expected to: Continue reading →

It’s not the culmination–that will come soon–but a major step in work I direct at the Cato Institute to improve government transparency has been achieved. I’ll be announcing and extolling it Wednesday at the House Administration Committee’s Legislative Data and Transparency conference. Here’s a quick survey of what we’ve been doing and the results we see on the near horizon.

After President Obama’s election in 2008, we recognized transparency as a bipartisan and pan-ideological goal at an event entitled “Just Give Us the Data.” Widespread agreement and cooperation on transparency have held. But by the mid-point of the president’s first term, the deep-running change most people expected was not materializing, and it still has not. So I began working more assiduously on what transparency is and what delivers it.

In “Publication Practices for Transparent Government” (Sept. 2011), I articulated ways the government should deliver information so that it can be absorbed by the public through the intermediary of web sites, apps, information services, and so on. We graded the quality of government data publication in the aptly named November 2012 paper: “Grading the Government’s Data Publication Practices.”

But there’s no sense in sitting around waiting for things to improve. Given the incentives, transparency is something that we will have to force on government. We won’t receive it like a gift.

So with software we acquired and modified for the purpose, we’ve been adding data to the bills in Congress, making it possible to automatically learn more about what they do. The bills published by the Government Printing Office already have data about who introduced them and the committees to which they were referred. We are adding data that reflects:

– What agencies and bureaus the bills in Congress affect;

– What laws the bills in Congress affect: by popular name, U.S. Code section, Statutes at Large citation, and more;

– What budget authorities the bills include, the amount of this proposed spending, its purpose, and the fiscal year(s).

We are capturing proposed new bureaus and programs, proposed new sections of existing law, and other subtleties in legislation. Our “Deepbills” project is documented at cato.org/resources/data.
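To give a concrete sense of what this markup enables, here is a minimal sketch of how a developer might pull the “which laws does this bill affect” data out of Deepbills-style annotated bill XML. The namespace URI, element name, and attribute names below are illustrative assumptions rather than the project’s actual schema; the authoritative documentation and data live at cato.org/resources/data.

```python
# A rough illustration, not official Deepbills tooling. The namespace URI and
# the "entity"/"entity-type" names are assumptions made for this sketch; check
# the real schema at cato.org/resources/data before relying on them.
import glob
import xml.etree.ElementTree as ET

CATO_NS = "http://namespaces.cato.org/catoxml"  # hypothetical namespace URI

def entities_of_type(bill_path, entity_type):
    """Return the text of every annotated entity of a given type in one bill."""
    tree = ET.parse(bill_path)
    found = []
    for el in tree.iter("{%s}entity" % CATO_NS):
        if el.get("entity-type") == entity_type:
            found.append("".join(el.itertext()).strip())
    return found

# Example use: list each bill alongside the U.S. Code sections it touches.
for path in sorted(glob.glob("deepbills/*.xml")):
    sections = entities_of_type(path, "uscode")
    if sections:
        print(path, "->", sorted(set(sections)))
```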

This data can tell a more complete story of what is happening in Congress. Given the right Web site, app, or information service, you will be able to tell who proposed to spend your taxpayer dollars and in what amounts. You’ll be able to tell how your member of Congress and senators voted on each one. You might even find out about votes you care about before they happen!

Having introduced ourselves to the community in March, we’re beginning to help disseminate legislative information and data on Wikipedia.

The uses of the data are limited only by the imagination of the people building things with it. The data will make it easier to draw links between campaign contributions and legislative activity, for example. People will be able to automatically monitor ALL the bills that affect laws or agencies they are interested in. The behavior of legislators will be more clear to more people. Knowing what happens in Washington will be less the province of an exclusive club of lobbyists and congressional staff.

In no sense will this work make the government entirely transparent, but by adding data sets to what’s available about government deliberations, management and results, we’re multiplying the stories that the data can tell and beginning to lift the fog that allows Washington, D.C. to work the way it does–or, more accurately, to fail the way it does.

At this point, data curator Molly Bohmer and Cato interns Michelle Newby and Ryan Mosely have marked up 75% of the bills introduced in Congress so far. As we fine-tune our processes, we expect to stay essentially current with Congress, making timely public oversight of government easier.

This is not the culmination of the work. What we need now is for people to build things with the data–the Web sites, apps, and information services that can deliver transparency to your door. I’ll be promoting our work at Wednesday’s conference and in various forums over the coming weeks and months. Watch for government transparency to improve when coders get hold of the data and build the tools and toys that deliver this information to the public in accessible ways.

Gina Keating, author of Netflixed: The Epic Battle for America’s Eyeballs, discusses the founding of Netflix and its competition with Blockbuster.

Keating begins with the history of the company and its innovative improvements to the movie rental experience. She discusses Netflix’s use of new technology and marketing strategies in DVD rental, which inspired Blockbuster to adapt to the changing market.

Keating goes on to describe Netflix’s transition to internet streaming and Blockbuster’s attempts to retain its market share.

Download


Tim Lee is right. The Electronic Frontier Foundation post announcing its decision to accept Bitcoin is strange.

“While we are accepting Bitcoin donations,” the post says, “EFF is not endorsing Bitcoin.” (emphasis in original)

They’ve been using dollars over there without anyone inferring that they endorse dollars. They’ve been using various payment systems with no hint of endorsement. And they use all kinds of protocols without disclaiming endorsement—because they don’t need to.

Someone at EFF really doesn’t like Bitcoin. But, oh, how wealthy EFF would be as an institution if it had held on to the Bitcoin it was originally given. I argued at the time EFF refused Bitcoin that it was making a mistake, not because of the effect on its bottom line, but because it showed timidity in the face of threats to liberty.

Well, just in time for the Bitcoin 2013 conference in San Jose (CA) this weekend, EFF is getting on board. That’s good news, but it’s not as good as the news would have been if EFF had been a stalwart on Bitcoin the entire time. I have high expectations of EFF because it’s one of the great organizations working in the area of digital liberties.

The International Association of Privacy Professionals (IAPP) has been running some terrific guest essays on its Privacy Perspectives blog lately. (I was honored to be asked to submit an essay to the site a few weeks ago about the ongoing Do Not Track debate.) Today, the IAPP has published one of the most interesting essays on the so-called “right to be forgotten” that I have ever read. (Disclosure: We’ve written a lot about this issue here in the past and have been highly skeptical regarding both the sensibility and practicality of the notion. See my Forbes column, “Erasing Our Past on the Internet,” for a concise critique.)

In her fascinating and important IAPP guest essay, archivist Cherri-Ann Beckles asks, “Will the Right To Be Forgotten Lead to a Society That Was Forgotten?” Beckles, who is Assistant Archivist at the University of the West Indies, powerfully explains the importance of archiving history and warns about the pitfalls of trying to censor history through a “right to be forgotten” regulatory scheme. She notes that archives “protect individuals and society as a whole by ensuring there is evidence of accountability in individual and/or collective actions on a long-term basis. The erasure of such data may have a crippling effect on the advancement of a society as it relates to the knowledge required to move forward.”

She concludes by arguing that:

From the preservation of writings on the great pharaohs to the world’s greatest thinkers and inventors as well as the ordinary man and woman, archivists recognise that without the actions and ideas of people, both individually and collectively, life would be meaningless. Society only benefits from the actions and ideas of people when they are recorded, preserved for posterity and made available. Consequently, the “right to be forgotten” if not properly executed, may lead to “the society that was forgotten.”

Importantly, Beckles also stresses individual responsibility and the need for people to be cautious about the digital footprints they leave online. “More attention should instead be paid to educating individuals to ensure that the record they create on themselves is one they wish to be left behind,” she notes. “Control of data at the point of creation is far more manageable than trying to control data after records capture.”

Anyway, read the whole essay. It is very much worth your time.

Frontline relied on the DOJ foreclosure theory to predict that the lack of eligibility restrictions in the 700 MHz auction would “inevitably” increase prices, stifle innovation, and reduce the diversity of service offerings as Verizon and AT&T warehoused the spectrum. In reality, the exact opposite occurred.

The DOJ recently recommended that the FCC rig the upcoming incentive auction to ensure Sprint Nextel and T-Mobile are winners and Verizon and AT&T are losers. I previously noted that the DOJ spectrum plan is (1) inconsistent with its own findings in recent merger proceedings and the intent of Congress, (2) inherently discriminatory, and (3) irrational as applied. Additional analysis indicates that it isn’t supported by economic theory or FCC factual findings either. Continue reading →