Eli Dourado – Technology Liberation Front
https://techliberation.com
Keeping politicians' hands off the Net & everything else related to technology

My State of the Net panel on Bitcoin
https://techliberation.com/2015/02/10/my-state-of-the-net-panel-on-bitcoin/
Tue, 10 Feb 2015

A couple weeks ago at State of the Net, I was on a panel on Bitcoin moderated by Coin Center’s Jerry Brito. The premise of the panel was that the state of Bitcoin is like the early Internet. Somehow we got policy right in the mid-1990s to allow the Internet to become the global force it is today. How can we reprise this success with Bitcoin today?

As I recall, I made two basic points in my remarks.

First, in my opening remarks, I argued that on a technical level, the comparison between Bitcoin and the Internet is apt.

What makes the Internet different from the telecommunications media that came before is the separation of an application layer from a transport layer. The transport layer (and the layers below it) does the work of getting bits to where they need to go. This frees anybody up to develop new applications on a permissionless basis, taking this transport capability basically for granted.

Earlier telecom systems did not function this way. The applications were jointly defined with the transport mechanism. Phone calls are defined in the guts of the network, not at the edges.

Like the Internet, Bitcoin separates a lower layer from the application layer; this time it is not a transport layer but a fiduciary layer. The blockchain gives applications access to a fiduciary mechanism that they can take basically for granted.

No longer will fiduciary applications (payments, contracts, asset exchange, notary services, voting, etc.) and fiduciary mechanisms need to be developed jointly. Unwieldy fiduciary mechanisms (banks, legal systems, oversight) can be replaced with computer code.
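To make the layering concrete, here is a minimal, hypothetical sketch of one such application: a notary service that timestamps a document by committing its hash. The `broadcast_transaction` callable is an assumption for illustration, not a real Bitcoin API; the point is that the application never touches the consensus machinery underneath.

```python
import hashlib

def notarize(document: bytes, broadcast_transaction) -> str:
    """Timestamp a document by committing its hash to a blockchain.

    The application only computes a hash and hands it to the fiduciary
    layer; how that layer reaches consensus is taken for granted, just
    as a web application takes packet delivery for granted.
    """
    digest = hashlib.sha256(document).hexdigest()
    broadcast_transaction(digest)  # hypothetical fiduciary-layer call
    return digest

# Any stand-in for the fiduciary layer works; the application code is
# unchanged whether the backend is Bitcoin, a sidechain, or a test stub.
receipt = notarize(b"contract text", broadcast_transaction=lambda payload: None)
print(len(receipt))  # → 64 hex characters (a 256-bit hash)
```

Because the application and the fiduciary mechanism meet only at this narrow interface, either side can evolve without permission from the other, which is exactly the property that made the Internet's transport layer so generative.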

Second, in the panel’s back and forth, particularly with Chip Poncy, I argued that technological change may necessitate a rebalancing of our laws and regulations on financial crimes.

We have payment systems because they improve human welfare. We have laws against certain financial activities because those activities harm human welfare. Ideally, we would balance the gains against the losses to come up with the optimal, human-welfare-maximizing level of regulation.

However, when a new technology like the blockchain comes along, the gains from payment freedom increase. People in a permissionless environment will be able to accomplish more than before. This means that we have to redo our balancing calculus. Because the benefits of unimpeded payments are higher, we need to tolerate more harms from unsavory financial activities if our goal remains to maximize human welfare.

Thanks to my co-panelists for a great discussion.

Wanted: talented, gritty libertarians who are passionate about technology
https://techliberation.com/2015/02/09/wanted-talented-gritty-libertarians-who-are-passionate-about-technology/
Mon, 09 Feb 2015

Ten or fifteen years ago, when I sat around and thought about what I would do with my life, I never considered directing the technology policy program at Mercatus. It’s not exactly a career track you can get on — not like being a lawyer, a doctor, a professor.

One of the things I loved about Peter Thiel’s book Zero to One is that it is self-consciously anti-track. The book is a distillation of Thiel’s 2012 Stanford course on startups. In the preface, he writes,

“My primary goal in teaching the class was to help my students see beyond the tracks laid down by academic specialties to the broader future that is theirs to create.”

I think he is right. The modern economy provides unprecedented opportunity for people with talent and grit and passion to do unique and interesting things with their lives, not just follow an expected path.

This is great news if you are someone with talent and grit and passion. Average is Over. What you have is valuable. You can do amazing things. We want to work with you, invest in you—maybe even hire you—and unleash you upon the world.

The biggest problem we have is finding you.

There is no technology policy career track, nor would we want there to be one. Frankly, we don’t want someone who needs the comfort and safety of a future that someone else designed for him.

Unfortunately, this also means that there is no defined pool of talented, gritty libertarians who are passionate about technology for Mercatus or our tech policy allies to hire from.

So how are we supposed to find you? We need your help. You need to do two things.

First, get started now.

Just start doing technology policy.

Write about it every day. Say unexpected things; don’t just take a familiar side in a drawn-out debate. Do something new. What is going to be the big tech policy issue two years from now? Write about that. Let your passion show.

The tech policy world is small enough — and new ideas rare enough — that doing this will get you a following in our community.

It also sends a very strong signal come interview time. Anybody can say that they are talented, or gritty, or passionate. You’ll be able to show it.

I literally got hired because of a blog post. There were other helpful inputs, of course — credentials, references, some contract work that turned out well. But what initially got me on Mercatus’s radar screen was a single post.

https://twitter.com/jerrybrito/status/77063232086491136

Second, get in touch.

Everyone on the Mercatus tech policy team is highly Googleable (on Twitter, here’s me, Adam, Brent, and Andrea). We want to know who you are, what you are doing, and what your plans are.

There is almost no downside to this.

Best case scenario: we create a position for you. No one on our team was hired to fill a vacancy. Instead, we hire people because the opportunity is too good for us to pass up.

Alternatively, maybe we’ll pay you to write a paper or a book.

If for some reason you’re not a great fit for Mercatus, we can connect you with allied groups in tech policy. My discussions with people running other tech policy programs confirm that finding talent is an ever-present problem for them, too.

And at a minimum, we’ll know who you are when we see your work online.

We are serious about winning the battle of ideas over technology, but we can’t do it alone. As technology policy eats the world, the opportunities in our field are going to grow. Let us know if you want to get in on this.

New FCC rules will kick at least 4.7 million households offline
https://techliberation.com/2015/02/03/new-fcc-rules-will-kick-at-least-6-million-households-offline/
Tue, 03 Feb 2015

This month, the FCC is set to issue an order that will reclassify broadband under Title II of the Communications Act. As a result of this reclassification, broadband will suddenly become subject to numerous federal and local taxes and fees.

How much will these new taxes reduce broadband subscribership? Nobody knows for sure, but using the existing economic literature we can come up with a back-of-the-envelope calculation.

According to a policy brief by Brookings’s Bob Litan and the Progressive Policy Institute’s Hal Singer, reclassification under Title II will increase fixed broadband costs on average by $67 per year due to both federal and local taxes. With pre-Title II costs of broadband at $537 per year, this represents a 12.4 percent increase.

[I have updated these estimates at the end of this post.]

How much will this 12.4 percent increase in broadband costs reduce the number of broadband subscriptions demanded? For that, we must turn to the literature on the elasticity of demand for broadband.

As is often the case, the literature on this subject does not give one clear answer. For example, Austan Goolsbee, who was chairman of President Obama’s Council of Economic Advisers in 2010 and 2011, estimated in 2006 that broadband elasticity ranged from -2.15 to -3.76, with an average of around -2.75.

A 2014 study by two FCC economists and their coauthors estimates the elasticity of demand for marginal non-subscribers. That is, they use survey data of people who are not currently broadband subscribers, exclude the 2/3 of respondents who say they would not buy broadband at any price, and estimate their demand elasticity at -0.62.

Since the literature doesn’t settle the matter, let’s pick the more conservative number and use it as a lower bound.

With 84 million fixed broadband subscribers facing a 12.4 percent price increase and an elasticity of -0.62, we should expect a 7.7 percent reduction in broadband subscriptions, a decline of about 6.45 million households.

Obviously, this is a terrible result.

A question for my friends in the tech policy world who support reclassification: How many households do you think will lose broadband access due to new taxes and fees? Please show your work.

UPDATE: Looks like I missed this updated post from Singer and Litan, which notes that due to the extension of the Internet Tax Freedom Act, the total amount of new taxes from reclassification will be only about $49/year, not $67/year as stated above.

This represents a 9.1 percent increase in costs, so the number of households with broadband will decline by only 5.6 percent, or 4.7 million.
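For anyone who wants to check or vary the assumptions, the whole back-of-the-envelope calculation fits in a few lines. This is a sketch of the constant-elasticity approximation used above, plugging in the cited figures ($537/year baseline cost, -0.62 elasticity, 84 million fixed broadband subscribers):

```python
def households_lost(new_taxes, baseline_cost=537.0,
                    elasticity=-0.62, subscribers=84e6):
    """Approximate subscriber loss: %change in quantity = elasticity * %change in price."""
    price_increase = new_taxes / baseline_cost
    return -elasticity * price_increase * subscribers

# Original $67/year estimate; exact arithmetic gives ~6.5 million
# (the 6.45 million figure uses the price increase rounded to 12.4%).
print(round(households_lost(67.0) / 1e6, 2))  # → 6.5
# Updated $49/year estimate after the Internet Tax Freedom Act extension.
print(round(households_lost(49.0) / 1e6, 2))  # → 4.75
```

Anyone who prefers a different elasticity from the literature can swap it in and show their work the same way.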

While I regret the oversight, this is still a very high number that deserves attention.

Money for graduate students who love liberty
https://techliberation.com/2015/02/02/money-for-graduate-students-who-love-liberty/
Mon, 02 Feb 2015

My employer, the Mercatus Center, provides ridiculously generous funding (up to $40,000/year) for graduate students. There are several opportunities depending on your goals, but I encourage people interested in technology policy to particularly consider the MA Fellowship, as that can come with an opportunity to work with the tech policy team here at Mercatus. Mind the deadlines!

The PhD Fellowship is a three-year, competitive, full-time fellowship program for students who are pursuing a doctoral degree in economics at George Mason University. Our PhD Fellows take courses in market process economics, public choice, and institutional analysis and work on projects that use these lenses to understand global prosperity and the dynamics of social change. Successful PhD Fellows have secured tenure track positions at colleges and universities throughout the US and Europe.
It includes full tuition support, a stipend, and experience as a research assistant working closely with Mercatus-affiliated Mason faculty. It is a total award of up to $120,000 over three years. Acceptance into the fellowship program is dependent on acceptance into the PhD program in economics at George Mason University. The deadline for applications is February 1, 2015.

The Adam Smith Fellowship is a one-year, competitive fellowship for graduate students attending PhD programs at any university, in a variety of fields, including economics, philosophy, political science, and sociology. The aim of this fellowship is to introduce students to key thinkers in political economy that they might not otherwise encounter in their graduate studies. Smith Fellows receive a stipend and spend three weekends during the academic year and one week during the summer participating in workshops and seminars on the Austrian, Virginia, and Bloomington schools of political economy.
It includes a quarterly stipend and travel and lodging to attend colloquia hosted by the Mercatus Center. It is a total award of up to $10,000 for the year. Acceptance into the fellowship program is dependent on acceptance into a PhD program at an accredited university. The deadline for applications is March 15, 2015.

The MA Fellowship is a two-year, competitive, full-time fellowship program for students pursuing a master’s degree in economics at George Mason University who are interested in gaining advanced training in applied economics in preparation for a career in public policy. Successful fellows have secured public policy positions as Presidential Management Fellows, economists and analysts with federal and state governments, and policy analysts at prominent research institutions.
It includes full tuition support, a stipend, and practical experience as a research assistant working with Mercatus scholars. It is a total award of up to $80,000 over two years. Acceptance into the fellowship program is dependent on acceptance into the MA program in economics at George Mason University. The deadline for applications is March 1, 2015.

The Frédéric Bastiat Fellowship is a one-year competitive fellowship program for graduate students interested in pursuing a career in public policy. The aim of this fellowship is to introduce students to the Austrian, Virginia, and Bloomington schools of political economy as academic foundations for pursuing contemporary policy analysis. They will explore how this framework is utilized to analyze policy implications of a variety of topics, including the study of American capitalism, state and local policy, regulatory studies, technology policy, financial markets, and spending and budget.
It includes a quarterly stipend and travel and lodging to attend colloquia hosted by the Mercatus Center. It is a total award of up to $5,000 for the year. Acceptance into the fellowship program is dependent on acceptance into a graduate program at an accredited university. The deadline for applications is April 1, 2015.
The LAPD versus the First Amendment
https://techliberation.com/2015/01/30/the-lapd-versus-the-first-amendment/
Fri, 30 Jan 2015

Last month, my Mercatus Center colleague Brent Skorup published a major scoop: police departments around the country are scanning social media to assign people individualized “threat ratings” — green, yellow, or red. This week, police are complaining that the public is using social media to track them back.

LAPD Chief Charlie Beck has expressed concerns that Waze, the social traffic app owned by Google, could be used to target police officers. The National Sheriffs’ Association has also complained about the app.

To be clear, Waze does not allow anybody to track individual officers. Users of the app can drop a pin on a map letting drivers know that there is police activity (or traffic jams, accidents, or traffic enforcement cameras) in the area.

That’s it.

And police departments around the country frequently publicize their locations. They are essentially required to do so for sobriety checkpoints by Supreme Court order and NHTSA guidelines.

But in a letter to Google CEO Larry Page, Beck writes breathlessly that Waze “poses a danger to the lives of police officers in the United States.” The letter also (falsely) states that the app was used by Ismaaiyl Brinsley to kill two NYPD officers. The Associated Press notes that “Investigators do not believe he used Waze to ambush the officers, in part because police say Brinsley tossed his cellphone more than two miles from where he shot the officers.”

It’s somewhat rich of the LAPD to cite fear for its officers’ lives while the department is in possession of some 3,408 assault rifles, 7 armored vehicles, and 3 grenade launchers.

In fact, what Waze poses a danger to is police department revenue. Drivers are using the app as a crowdsourced radar detector, as a means of avoiding traffic tickets. But unlike radar detectors, which have been outlawed in my home state of Virginia, Waze benefits from First Amendment protection.

The fundamental activity that Waze users are engaging in is speech. “Hey, there is a cop over there,” is protected speech under the First Amendment. As all LAPD officers must swear an oath affirming that they “will support and defend the Constitution of the United States,” it seems reasonable to expect the police chief not to stifle, by lobbying private corporations, the First Amendment rights of those citizens who choose to engage in this protected activity.

The Waze kerfuffle is a symptom of a longer-term breakdown in trust between police departments around the country and the publics they are sworn to protect and serve. This is a widely recognized problem, and some in the law enforcement community are working on strategies to remedy it.

But as long as departments continue to view the public as the enemy or even as a passive revenue source, not as the rightful recipients of their service and protection, we will continue to see the public respond by introducing technologies that protect users from the police’s arbitrary powers.

Fortunately, police complaints about Waze have backfired. Many smartphone users had no idea there was an app for avoiding speeding tickets until Beck and the Sheriffs’ Association made it national news. As a result of the publicity, downloads of Waze have skyrocketed.


This is how the modern world works, and it gives me great hope for the future.

The MPAA still doesn’t get it
https://techliberation.com/2014/12/15/the-mpaa-still-doesnt-get-it/
Mon, 15 Dec 2014

Last week, two very interesting events happened in the world of copyright and content piracy. First, the Pirate Bay, the infamous torrent hosting site, was raided by police and removed from the Internet. Pirate Bay co-founder Peter Sunde (who was no longer involved with the project) expressed his indifference to the raid; there was no soul left in the site, he said, and in any case, he is “pretty sure the next thing will pan out.”

Second, a leaked trove of emails from the Sony hack showed that the MPAA continues to pursue their dream of blocking websites that contribute to copyright infringement. With the failure of SOPA in 2012, the lobbying organization has pivoted to trying to accomplish the same ends through other means, including paying for state attorneys-general to attack Google for including some of these sites in their index. Over at TechDirt, Mike Masnick argues that some of this activity may have been illegal.

I’ll leave the illegality of the MPAA’s lobbying strategy for federal prosecutors to sort out, but like some others, I am astonished by how out of touch with reality the MPAA is. They seem to believe that opposition to SOPA was a fluke, whipped up by Google, which they will be able to neutralize through their “Project Goliath.” And according to a meeting agenda reported on by TorrentFreak, they want to bring “on board ‘respected’ people in the technology sector to agree on technical facts and establish policy support for site blocking.”

The reality is that opposition to SOPA-style controls continues to remain strong in the tech policy community. The only people in Washington who support censoring the Internet to protect copyright are paid by Hollywood. If, through their generous war chest, the MPAA were able to pay a “respected” tech-sector advocate to build policy support for site blocking, that very fact would cause that person to lose respect.

Moreover, on a technical level, the MPAA is fighting a battle it is sure to lose. As Rick Falkvinge notes, the content industry had a unique opportunity in 1999 to embrace and extend Napster. Instead, it got Napster shut down, which eventually led to decentralized piracy over bittorrent. Now, it wants to shut down sites that index torrents, but torrent indexes are tiny amounts of data. The whole Pirate Bay index was only 90MB in 2012, and a magnet link for an individual torrent is only a few dozen bytes. Between Bitmessage and projects like Bitmarkets, it seems extremely unlikely that the content industry will ever be able to shut down distribution of torrent data.
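The size point is easy to verify. A magnet link carries little more than the torrent's 20-byte SHA-1 infohash encoded as 40 hex characters; the infohash below is a placeholder for illustration, not a real torrent:

```python
# Minimal magnet URI: the scheme plus the infohash ("btih") parameter.
# Trackers and display names are optional extras; peers and metadata
# can be discovered from the infohash alone via the DHT.
infohash = "f" * 40  # placeholder: 40 hex chars encode a 20-byte hash
magnet = "magnet:?xt=urn:btih:" + infohash
print(len(magnet))  # → 60 characters
```

Sixty characters is small enough to post in a forum comment, a tweet, or a QR code, which is why blocking the distribution of torrent identifiers is so hopeless.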

Instead of fighting this inevitable trend, the MPAA and RIAA should be trying to position themselves well in a world in which content piracy will always be possible. They should make it convenient for customers to access their paid content through bundling deals with companies like Netflix and Spotify. They should accept some background level of content piracy and embrace at least its buzz-generating benefits. They should focus on soft enforcement through systems like six strikes, which more gently nudge consumers to pay for content. And they should explicitly disavow any effort to censor the web—without such a disavowal, they are making enemies not just of tech companies, but of the entire community of tech enthusiasts and policy wonks.

3 takeaways from the Plenipot
https://techliberation.com/2014/11/13/3-takeaways-from-the-plenipot/
Thu, 13 Nov 2014

Last week marked the conclusion of the ITU’s Plenipotentiary Conference, the quadrennial gathering during which ITU member states get together to revise the treaty that establishes the Union and conduct other high-level business. I had the privilege of serving as a member of the US delegation, as I did for the WCIT, and to see the negotiations first hand. This year’s Plenipot was far less contentious than the WCIT was two years ago. For other summaries of the conference, let me recommend to you Samantha Dickinson, Danielle Kehl, and Amb. Danny Sepulveda. Rather than recap their posts or the entire conference, I just wanted to add a couple of additional observations.

We mostly won on transparent access to documents

Through my involvement with WCITLeaks, I have closely followed the issue of access to ITU documents, both before and during the Plenipot. My assessment is that we mostly won.

Going forward, most inputs and outputs to ITU conferences and assemblies will be available to the public from the ITU website. This excludes a) working documents, b) documents related to other meetings such as Council Working Groups and Study Groups, and c) non-meeting documents that should be available to the public.

However, in February, an ITU Council Working Group will be meeting to develop what is likely to be a more extensive document access policy. In May, the whole Council will meet to provisionally approve an access policy. And in 2018, the next Plenipot will permanently decide what to do about this provisional access policy.

There are no guarantees, and we will need to closely monitor the outcomes in February and May to see what policy is adopted—but if it is a good one, I would be prepared to shut down WCITLeaks as it would become redundant. If the policy is inadequate, however, WCITLeaks will continue to operate until the policy improves.

I was gratified that WCITLeaks continued to play a constructive role in the discussion. For example, in the Arab States’ proposal on ITU document access, they cited us, considering “that there are some websites on the Internet which are publishing illegally to the public ITU documents that are restricted only to Member States.” In addition, I am told that at the CEPT coordination meeting, WCITLeaks was thanked for giving the issue of transparency at the ITU a shot in the arm.

A number of governments were strong proponents of transparency at the ITU, but I think special thanks are due to Sweden, who championed the issue on behalf of Europe. I was very grateful for their leadership.

The collapse of the WCIT was an input into a harmonious Plenipot

We got through the Plenipot without a single vote (other than officer elections)! That’s great news—it’s always better when the ITU can come to agreement without forcing some member states to go along.

I think it’s important to recognize the considerable extent to which this consensus agreement was driven by events at the WCIT in 2012. At the WCIT, when the US (and others) objected and said that we could not agree to certain provisions, other countries thought we were bluffing. They decided to call our bluff by engineering a vote, and we wisely decided not to sign the treaty, along with 54 other countries.

In Busan this month, when we said that we could not agree to certain outcomes, nobody thought we were bluffing. Our willingness to walk away at the WCIT gave us added credibility in negotiations at the Plenipot. While I also believe that good diplomacy helped secure a good outcome at the Plenipot, the occasional willingness to walk the ITU off a cliff comes in handy. We should keep this in mind for future negotiations—making credible promises and sticking to them pays dividends down the road.

The big question of the conference is in what form the India proposal will re-emerge

At the Plenipot, India offered a sweeping proposal to fundamentally change the routing architecture of the Internet so that a) IP addresses would be allocated by country, like telephone numbers, with a country prefix and b) domestic Internet traffic would never be routed out of the country.

This proposal was obviously very impractical. It is unlikely, in any case, that the ITU has the expertise or the budget to undertake such a vast reengineering of the Internet. But the idea would also be very damaging from the perspective of individual liberty—it would make nation-states, even more than they are now, mediators of human communication.

I was very proud that the United States not only made the practical case against the Indian proposal, it made a principled one. Amb. Sepulveda made a very strong statement indicating that the United States does not share India’s goals as expressed in this proposal, and that we would not be a part of it. This statement, along with those of other countries and subsequent negotiations, effectively killed the Indian proposal at the Plenipot.

The big question is in what form this proposal will re-emerge. The idea of remaking the Internet along national lines is unlikely to go away, and we will need to continue monitoring ITU study groups to ensure that this extremely damaging proposal does not raise its head.

ITU agrees to open access for Plenipot contributions
https://techliberation.com/2014/10/20/itu-agrees-to-open-access-for-plenipot-contributions/
Mon, 20 Oct 2014

Good news! As the ITU’s Plenipotentiary Conference gets underway in Busan, Korea, the heads of delegation have met and decided to open up access to some of the documents associated with the meeting. At this time, it is only the documents that are classified as “contributions”—other documents such as meeting agendas, background information, and terms of reference remain password protected. It’s not clear yet whether that is an oversight or an intentional distinction. While I would prefer all documents to be publicly available, this is a very welcome development. It is gratifying to see the ITU membership taking transparency seriously.

Special thanks are due to ITU Secretary-General Hamadoun Touré. When Jerry Brito and I launched WCITLeaks in 2012, at first, the ITU took a very defensive posture. But after the WCIT, the Secretary-General demonstrated tremendous leadership by becoming a real advocate for transparency and reform. I am told that he was instrumental in convincing the heads of delegation to open up access to Plenipot documents. For that, Dr. Touré has my sincere thanks—I would be happy to buy him a congratulatory drink when I arrive in Busan, although I doubt his schedule would permit it.

It’s worth noting that this decision only applies to the Plenipotentiary conference. The US has a proposal that will be considered at the conference to make something like this arrangement permanent, to instruct the incoming SG to develop a policy of open access to all ITU meeting documents. That is a development that I will continue to watch closely.

More evidence that ‘SOPA for Search Engines’ is a bad idea
https://techliberation.com/2014/10/17/more-evidence-that-sopa-for-search-engines-is-a-bad-idea/
Fri, 17 Oct 2014

Although SOPA was ignominiously defeated in 2012, the content industry never really gave up on the basic idea of breaking the Internet in order to combat content piracy. The industry now claims that a major cause of piracy is search engines returning results that direct users to pirated content. To combat this, they would like to regulate search engine results to prevent them from linking to sites that contain pirated music and movies.

This idea is problematic on many levels. First, there is very little evidence that content piracy is a serious concern in objective economic terms. Most content pirates would not, but for the availability of pirated content, empty their wallets to incentivize the creation of more movies and music. As Ian Robinson and I explain in our recent paper, industry estimates of the jobs created by intellectual property are absurd. Second, there are serious free speech implications associated with regulating search engine results. Search engines perform an information distribution role similar to that of newspapers, and they have an editorial voice. They deserve protection from censorship as long as they are not hosting the pirated material themselves. Third, as anyone who knows anything about the Internet knows, nobody uses the major search engines to look for pirated content. The serious pirates go straight to sites that specialize in piracy. Fourth, this is all part of a desperate attempt by the content industry to avoid modernizing and offering more of their content online through convenient packages such as Netflix.

As if these were not sufficient reason to reject the idea of “SOPA for Search Engines,” Google has now announced that they will be directing users to legitimate digital content if it is available on Netflix, Amazon, Google Play, Spotify, or other online services. The content industry now has no excuse—if they make their music and movies available in convenient form, users will see links to legitimate content even if they search for pirated versions.

[Image: star-trek-search-results]

Google also says they will be using DMCA takedown notices as an input into search rankings and autocomplete suggestions, demoting sites and terms that are associated with piracy. This is above and beyond what Google needs to do, and in fact raises some concerns about fraudulent DMCA takedown notices that could chill free expression—such as when CBS issued a takedown of John McCain’s campaign ad on YouTube even though it was likely legal under fair use. Google will have to carefully monitor the DMCA takedown process for abuse. But in any case, these moves by Google should once and for all put the nail in the coffin of the idea that we should compromise the integrity of search results through government regulation for the sake of fighting a piracy problem that is not that serious in the first place.

WCITLeaks is Ready for Plenipot https://techliberation.com/2014/09/26/wcitleaks-is-ready-for-plenipot/ https://techliberation.com/2014/09/26/wcitleaks-is-ready-for-plenipot/#respond Fri, 26 Sep 2014 19:23:16 +0000 http://techliberation.com/?p=74817

The ITU is holding its quadrennial Plenipotentiary Conference in Busan, South Korea from October 20 to November 7, 2014. The Plenipot, as it is called, is the ITU’s “supreme organ” (a funny term that I did not make up). It represents the highest level of decision making at the ITU. As it has for the last several ITU conferences, WCITLeaks will host leaked documents related to the Plenipot.

For those interested in transparency at the ITU, two developments are worth reporting. On the first day of the conference, the heads of delegation will meet to decide whether documents related to the conference should be available to the public directly through the TIES system without a password. All of the documents associated with the Plenipot are already available in English on WCITLeaks, but direct public access would have the virtue of including those in the world who do not speak English but do speak one of the other official UN languages. Given this additional benefit of inclusion, I hope that the heads of delegation will seriously weigh the advantages of adopting a more open model for document access during this Plenipot. If you would like to contact the head of delegation for your country, you can find their names in this document. A polite email asking them to support open access to ITU documents might not hurt.

In addition, at the meeting, the ITU membership will consider a proposal from the United States to, as a rule, provide open access to all meeting documents.

[Image: excerpt of the US proposal for open access to ITU meeting documents]

This is what WCITLeaks has always supported—putting ourselves out of business. As the US proposal notes, the ITU Secretariat has conducted a study finding that other UN agencies are much more forthcoming in terms of public access to their documents. A more transparent ITU is in everyone’s interest—including the ITU’s. This Plenipot has the potential to remedy a serious deficiency with the institution; I’m cheering for them and hoping they get it right.

You know how IP creates millions of jobs? That’s pseudoscientific baloney https://techliberation.com/2014/08/06/you-know-how-ip-creates-millions-of-jobs-thats-pseudoscientific-baloney/ https://techliberation.com/2014/08/06/you-know-how-ip-creates-millions-of-jobs-thats-pseudoscientific-baloney/#respond Wed, 06 Aug 2014 14:26:56 +0000 http://techliberation.com/?p=74678

In 2012, the US Chamber of Commerce put out a report claiming that intellectual property is responsible for 55 million US jobs—46 percent of private sector employment. This is a ridiculous statistic if you merely stop and think about it for a minute. But the fact that the statistic is ridiculous doesn’t mean that it won’t continue to circulate around Washington. For example, last year Rep. Marsha Blackburn cited it uncritically in an oped in The Hill.

In a new paper from Mercatus (here’s the PDF), Ian Robinson and I expose this statistic, and others like it, as pseudoscience: they are based on incredibly shoddy and misleading reasoning. Here’s the abstract of the paper:

In the past two years, a spate of misleading reports on intellectual property has sought to convince policymakers and the public that implausibly high proportions of US output and employment depend on expansive intellectual property (IP) rights. These reports provide no theoretical or empirical evidence to support such a claim, but instead simply assume that the existence of intellectual property in an industry creates the jobs in that industry. We dispute the assumption that jobs in IP-intensive industries are necessarily IP-created jobs. We first explore issues regarding job creation and the economic efficiency of IP that cut across all kinds of intellectual property. We then take a closer look at these issues across three major forms of intellectual property: trademarks, patents, and copyrights.

As they say, read the whole thing, and please share with your favorite IP maximalist.

Why Reclassification Would Make the Internet Less Open https://techliberation.com/2014/05/15/why-reclassification-would-make-the-internet-less-open/ https://techliberation.com/2014/05/15/why-reclassification-would-make-the-internet-less-open/#comments Thu, 15 May 2014 14:58:19 +0000 http://techliberation.com/?p=74555

There seems to be increasing chatter among net neutrality activists lately on the subject of reclassifying ISPs as Title II services, subject to common carriage regulation. Although the intent in pushing reclassification is to make the Internet more open and free, in reality such a move could backfire badly. Activists don’t seem to have considered the effect of reclassification on international Internet politics, where it would likely give enemies of Internet openness everything they have always wanted.

At the WCIT in 2012, one of the major issues up for debate was whether the revised International Telecommunication Regulations (ITRs) would apply to Operating Agencies (OAs) or to Recognized Operating Agencies (ROAs). OA is a very broad term that covers private network operators, leased line networks, and even ham radio operators. Since “OA” would have included IP service providers, the US and other more liberal countries were very much opposed to the application of the ITRs to OAs. ROAs, on the other hand, are OAs that operate “public correspondence or broadcasting service.” That first term, “public correspondence,” is a term of art that means basically common carriage. The US government was OK with the use of ROA in the treaty because it would have essentially cabined the regulations to international telephone service, leaving the Internet free from UN interference. In the end, the conference produced a failed compromise: ITU Member States created a new term, Authorized Operating Agency, that was arguably somewhere in the middle—the definition included the word “public” but not “public correspondence”—and the US and other countries refused to sign the treaty out of concern that it was still too broad.

If the US reclassified ISPs as Title II services, that would arguably make them ROAs for purposes at the ITU (arguably because it depends on how you read the definition of ROA and Article 6 of the ITU Constitution). This potentially opens ISPs up to regulation under the ITRs. This might not be so bad if the US were the only country in the world—after all, the US did not sign the 2012 ITRs, and it does not use the ITU’s accounting rate provisions to govern international telecom payments.

But what happens when other countries start copying the US, imposing common carriage requirements, and classifying their ISPs as ROAs? Then the story gets much worse. Countries that are signatories to the 2012 ITRs would have ITU mandates on security and spam imposed on their networks, which is to say that the UN would start essentially regulating content on the Internet. This is what Russia, Saudi Arabia, and China have always wanted. Furthermore (and perhaps more frighteningly), classification as ROAs would allow foreign ISPs to forgo commercial peering arrangements in favor of the ITU’s accounting rate system. This is what a number of African governments have always wanted. Ethiopia, for example, considered a bill (I’m not 100 percent sure it ever passed) that would send its own citizens to jail for 15 years for using VoIP, because this decreases Ethiopian international telecom revenues. Having the option of using the ITU accounting rate system would make it easier to extract revenues from international Internet use.

Whatever you think of, e.g., Comcast and Cogent’s peering dispute, applying ITU regulation to ISPs would be significantly worse in terms of keeping the Internet open. By reclassifying US ISPs as common carriers, we would open the door to exactly that. The US government has never objected to ITU regulation of ROAs, so if we ever create a norm under which ISPs are arguably ROAs, we would be essentially undoing all of the progress that we made at the WCIT in standing up for a distinction between old-school telecom and the Internet. I imagine that some net neutrality advocates will find this unfair—after all, their goal is openness, not ITU control over IP service. But this is the reality of international politics: the US would have a very hard time at the ITU arguing that regulating for neutrality and common carriage is OK, but regulating for security, content, and payment is not.

If the goal is to keep the Internet open, we must look somewhere besides Title II.

What Vox Doesn’t Get About the “Battle for the Future of the Internet” https://techliberation.com/2014/05/02/what-vox-doesnt-get-about-the-battle-for-the-future-of-the-internet/ https://techliberation.com/2014/05/02/what-vox-doesnt-get-about-the-battle-for-the-future-of-the-internet/#comments Fri, 02 May 2014 18:56:31 +0000 http://techliberation.com/?p=74487

My friend Tim Lee has an article at Vox that argues that interconnection is the new frontier on which the battle for the future of the Internet is being waged. I think the article doesn’t really consider how interconnection has worked in the last few years, and consequently, it makes a big deal out of something that is pretty harmless.

How the Internet used to work

The Internet is a network of networks. Your ISP is a network; it connects to other ISPs and exchanges traffic with them. Because these connections are roughly equally valuable to both parties, they often take the form of “settlement-free peering,” in which networks exchange traffic on an unpriced basis.

Not every ISP connects directly to every other ISP. For example, a local ISP in California probably doesn’t connect directly to a local ISP in New York. If you’re an ISP that wants to be sure your customers can reach every other network on the Internet, you have to purchase “transit” services from a bigger or more specialized ISP. Transit carries data along what used to be called “the backbone” of the Internet. Transit providers that exchange roughly equally valued traffic with other networks themselves have settlement-free peering arrangements with those networks.
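The peering/transit distinction can be sketched in a few lines of Python. This is a toy model, not how BGP actually works, and the network names and topology are entirely hypothetical:

```python
# Toy model of interconnection: "peering" links exchange the two networks'
# customer routes on an unpriced basis; "transit" gives the paying customer
# access to everything its provider can reach. All names are hypothetical.

peers = {("BackboneA", "BackboneB")}      # settlement-free peering links
transit = {"LocalCA": "BackboneA",        # customer -> transit provider
           "LocalNY": "BackboneB"}

def cone(net):
    """A network plus all of its transit customers, recursively."""
    out = {net}
    for customer, provider in transit.items():
        if provider == net:
            out |= cone(customer)
    return out

def reachable(net):
    """What `net` can reach: its own customer cone, its peers' cones,
    and everything its transit provider (if any) can reach."""
    out = cone(net)
    for a, b in peers:
        if a == net:
            out |= cone(b)
        elif b == net:
            out |= cone(a)
    if net in transit:
        out |= reachable(transit[net])
    return out

# The two local ISPs never interconnect directly, yet each reaches the
# other through its transit provider and the backbones' peering link.
print(sorted(reachable("LocalCA")))  # ['BackboneA', 'BackboneB', 'LocalCA', 'LocalNY']
```

Note the asymmetry the model captures: a peer shares only its own customers’ routes, while a transit provider passes along everything it knows, which is why the local ISPs must buy transit to get full reachability.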

How the Internet works now

A few things have changed in the last several years. One major change is that most major ISPs have very large, geographically dispersed networks. For example, Comcast serves customers in 40 states, and other networks can peer with them in 18 different locations across the US. These 18 locations are connected to each other through very fast cables that Comcast owns. In other words, Comcast is not just a residential ISP anymore. They are part of what used to be called “the backbone,” although it no longer makes sense to call it that since there are so many big pipes that cross the country and so much traffic is transmitted directly through ISP interconnection.

Another thing that has changed is that content providers are increasingly delivering a lot of a) traffic-intensive and b) time-sensitive content across the Internet. This has created the incentive to use what are known as content-delivery networks (CDNs). CDNs are specialized ISPs that locate servers right on the edge of all terminating ISPs’ networks. There are a lot of CDNs—here is one list.

By locating on the edge of each consumer ISP, CDNs are able to deliver content to end users with very low latency and at very fast speeds. For this service, they charge money to their customers. However, they also have to pay consumer ISPs for access to their networks, because the traffic flow is all going in one direction and otherwise CDNs would be making money by using up resources on the consumer ISP’s network.

CDNs’ payments to consumer ISPs are also a matter of equity between the ISP’s customers. Let’s suppose that Vox hires Amazon CloudFront to serve traffic to Comcast customers (they do). If the 50 percent of Comcast customers who wanted to read Vox suddenly started using up so many network resources that Comcast and CloudFront needed to upgrade their connection, who should pay for the upgrade? The naïve answer is to say that Comcast should, because that is what customers are paying them for. But the efficient answer is that the 50 percent who want to access Vox should pay for it, and the 50 percent who don’t want to access it shouldn’t. By Comcast charging CloudFront to access the Comcast network, and CloudFront passing along those costs to Vox, and Vox passing along those costs to customers in the form of advertising, the resource costs of using the network are being paid by those who are using them and not by those who aren’t.
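The arithmetic behind this equity argument is easy to make concrete. All of the figures below are invented for illustration:

```python
# Hypothetical figures for the cost-allocation argument above.
subscribers = 1_000_000          # Comcast subscribers in the example
vox_readers = subscribers // 2   # the 50 percent who read Vox
upgrade_cost = 100_000.0         # cost of upgrading the CDN interconnect, in dollars

# Naive scheme: the ISP absorbs the upgrade and spreads it across everyone,
# readers and non-readers alike.
per_subscriber_naive = upgrade_cost / subscribers

# Efficient scheme: the ISP charges the CDN, the CDN bills Vox, and Vox
# recovers the cost from its readers (through ads), so only they bear it.
per_reader_efficient = upgrade_cost / vox_readers

print(per_subscriber_naive, per_reader_efficient)  # 0.1 0.2
```

Either way the upgrade gets paid for; the difference is that under the second scheme the half of subscribers who never read Vox are not subsidizing the half who do.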

What happened with the Netflix/Comcast dust-up?

Netflix used multiple CDNs to serve its content to subscribers. For example, it used a CDN provided by Cogent to serve content to Comcast customers. Cogent ran out of capacity and refused to upgrade its link to Comcast. As a result, some of Comcast’s customers experienced a decline in quality of Netflix streaming. However, Comcast customers who accessed Netflix with an Apple TV, which is served by CDNs from Level 3 and Limelight, never had any problems. Cogent has had peering disputes in the past with many other networks.

To solve the congestion problem, Netflix and Comcast negotiated a direct interconnection. Instead of Netflix paying Cogent and Cogent paying Comcast, Netflix is now paying Comcast directly. They signed a multi-year deal that is reported to reduce Netflix’s costs relative to what they would have paid through Cogent. Essentially, Netflix is vertically integrating into the CDN business. This makes sense. High-quality CDN service is essential to Netflix’s business; they can’t afford to experience the kind of incident that Cogent caused with Comcast. When a service is strategically important to your business, it’s often a good idea to vertically integrate.

It should be noted that what Comcast and Netflix negotiated was not a “fast lane”—Comcast is prohibited from offering prioritized traffic as a condition of its merger with NBC/Universal.

What about Comcast’s market power?

I think that one of Tim’s hangups is that Comcast has a lot of local market power. There are lots of barriers to creating a competing local ISP in Comcast’s territories. Doesn’t this mean that Comcast will abuse its market power and try to gouge CDNs?

Let’s suppose that Comcast is a pure monopolist in a two-sided market. It’s already extracting the maximum amount of rent that it can on the consumer side. Now it turns to the upstream market and tries to extract rent. The problem with this is that it can only extract rents from upstream content producers insofar as it lowers the value of the rent it can collect from consumers. If customers have to pay higher Netflix bills, then they will be less willing to pay Comcast. The fact that the market is two-sided does not significantly increase the amount of monopoly rent that Comcast can collect.
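A stylized calculation shows why the interconnection fee shifts revenue between the two sides rather than creating new rent. The dollar figures here are invented purely for illustration:

```python
# Invented dollar figures; only the structure of the argument matters.
# A subscriber's all-in willingness to pay for broadband-plus-Netflix is
# fixed, so a fee charged on the content side comes out of what the
# monopolist could otherwise have charged on the consumer side.

TOTAL_VALUE = 100.0   # subscriber's monthly value of broadband + Netflix
NETFLIX_COST = 10.0   # Netflix's own (non-interconnection) monthly cost

def isp_rent(interconnect_fee):
    netflix_bill = NETFLIX_COST + interconnect_fee  # fee passed through to subscribers
    consumer_price = TOTAL_VALUE - netflix_bill     # the most the ISP can now charge
    return consumer_price + interconnect_fee        # ISP revenue from both sides

print(isp_rent(0.0), isp_rent(5.0))  # 90.0 90.0 -- the fee moves revenue, it doesn't add rent
```

In this toy model the monopolist is indifferent to where the fee is levied: every dollar collected upstream is a dollar it can no longer collect from subscribers.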

Interconnection fees that are being paid to Comcast (and virtually all other major ISPs) have virtually nothing to do with Comcast’s market power and everything to do with the fact that the Internet has changed, both in structure and content. This is simply how the Internet works. I use CloudFront, the same CDN that Vox uses, to serve even a small site like my Bitcoin Volatility Index. CloudFront negotiates payments to Comcast and other ISPs on my and Vox’s behalf. There is nothing unseemly about Netflix making similar payments to Comcast, whether indirectly through Cogent or directly, nor is there anything about this arrangement that harms “the little guy” (like me!).

For more reading material on the Netflix/Comcast arrangement, I recommend Dan Rayburn’s posts here, here, and here. Interconnection is a very technical subject, and someone with very specialized expertise like Dan is invaluable in understanding this issue.

NETmundial wrap-up https://techliberation.com/2014/04/25/netmundial-wrap-up/ https://techliberation.com/2014/04/25/netmundial-wrap-up/#respond Fri, 25 Apr 2014 12:58:24 +0000 http://techliberation.com/?p=74444

NETmundial is over; here’s how it went down. Previous installments (1, 2, 3).

  • The final output of the meeting is available here. It is being referred to as the Multistakeholder Statement of São Paulo. I think the name is designed to put the document in contention with the Tunis Agenda. Insofar as it displaces the Tunis Agenda, that is fine with me.
  • Most of the civil society participants are not happy. Contrary to my prediction, in a terrible PR move, the US government (among others) weakened the language on surveillance. A statement on net neutrality also did not make it into the final draft. These were the top two issues for most civil society participants.
  • I of course oppose US surveillance, but I am not too upset about the watered down language since I don’t see this as an Internet governance issue. Also, unlike virtually all of the civil society people, I oppose net neutrality laws, so I’m pleased with that aspect of the document.
  • What bothers me most in the final output are two statements that seem to have been snuck in at the last moment by the drafters without approval from others. These are real shenanigans. The first is on multistakeholderism. The Tunis language said that stakeholders should participate according to their “respective roles and responsibilities.” The original draft of the NETmundial document used the same language, but participants agreed to remove it, indicating that all stakeholders should participate equally and that no stakeholders were more special than others. Somehow the final document contained the sentence, “The respective roles and responsibilities of stakeholders should be interpreted in a flexible manner with reference to the issue under discussion.” I have no idea how it got in there. I was in the room when the final draft was approved, and that text was not announced.
  • Similarly, language in the “roadmap” portion of the document now refers to non-state actors in the context of surveillance. “Collection and processing of personal data by state and non-state actors should be conducted in accordance with international human rights law.” The addition of non-state actors was also done without consulting anyone in the final drafting room.
  • Aside from the surveillance issue, the other big mistake by the US government was their demand to weaken the provision on intermediary liability. As I understand it, their argument was that they didn’t want to consider safe harbor for intermediaries without a concomitant recognition of the role of intermediaries in self-policing, as is done through the notice-and-takedown process in the US. I would have preferred a strong, free-standing statement on intermediary liability, but instead, the text was replaced with OECD language that the US had previously agreed to.
  • Overall, the meeting was highly imperfect—it was non-transparent, disorganized, inefficient in its use of time, and so on. I don’t think it was a rousing success, but it was nevertheless successful enough that the organizers were able to claim success, which I think was their original goal. Other than the two last-minute additions that I saw (I wonder if there are others), nothing in the document gives me major heartburn, so maybe that is actually a success. It will be interesting to see if the São Paulo Statement is cited in other fora, and if they decide to repeat this process again next year.
NETmundial day 2 notes https://techliberation.com/2014/04/24/netmundial-day-2-notes/ https://techliberation.com/2014/04/24/netmundial-day-2-notes/#respond Thu, 24 Apr 2014 12:07:41 +0000 http://techliberation.com/?p=74434

Today is the second and final day of NETmundial and the third in my series (parts 1 and 2) of quick notes on the meeting.

  • Yesterday, Dilma Rousseff did indeed sign the Marco Civil into law as expected. Her appearance here began with the Brazilian national anthem, which is a very strange way to kick off a multistakeholder meeting.
  • The big bombshell in Rousseff’s speech was her insistence that the multilateral model can peacefully coexist with the multistakeholder model. Brazil had been making a lot of pro-multistakeholder statements, so many of us viewed this as something of a setback.
  • One thing I noticed during the speech was that the Portuguese word for “multistakeholder” actually literally translates as “multisectoral.” This goes a long way toward explaining some of the disconnect between Brazil and the liberals. Multisectoral means that representatives from all “sectors” are welcome, while multistakeholder implies that every stakeholder is welcome to participate, even if they sometimes organize into constituencies. This is a pretty major difference, and NETmundial has been organized on the former model.
  • The meeting yesterday got horribly behind schedule. There were so many welcome speeches, and they went so much over time, that we did not even begin the substantive work of the conference until 5:30pm. I know that sounds like a joke, but it’s not.
  • After three hours of substantive work, during which participants made 2-minute interventions suggesting changes to the text, a drafting group retreated to a separate room to work on the text of the document. The room was open to all participants, but only the drafting group was allowed to work on the drafting; everyone else could only watch (and drink).
  • As of this morning, we still don’t have the text that was negotiated last night. Hopefully it will appear online some time soon.
  • One thing to watch for is the status of the document. Will it be a “declaration” or a “chairman’s report” (or something else)? What I’m hearing is that most of the anti-multistakeholder governments like Russia and China want it to be a chairman’s report because that implies a lesser claim to legitimacy. Brazil, the hosts of the conference, presumably want to make a maximal claim to legitimacy. I tend to think that there’s enough wrong with the document that I’d prefer the outcome to be a chairman’s report, but I don’t feel too strongly.
NETmundial is about to begin https://techliberation.com/2014/04/23/netmundial-is-about-to-begin/ https://techliberation.com/2014/04/23/netmundial-is-about-to-begin/#respond Wed, 23 Apr 2014 12:55:09 +0000 http://techliberation.com/?p=74431

As I blogged last week, I am in São Paulo to attend NETmundial, the meeting on the future of Internet governance hosted by the Brazilian government. The opening ceremony is about to begin. A few more observations:

  • The Brazilian Senate passed the landmark Marco Civil bill last night, and Dilma Rousseff, the Brazilian president, may use her appearance here today to sign it into law. The bill subjects data on Brazilians, wherever in the world it is stored, to Brazilian jurisdiction and imposes net neutrality domestically. It also provides a safe harbor for ISPs and creates a notice-and-takedown system for offensive content.
  • Some participants are framing aspects of the meeting, particularly the condemnation of mass surveillance in the draft outcome document, as civil society v. the US government. There is a lot of concern that the US will somehow water down the surveillance language so that it doesn’t apply to the NSA’s surveillance. WikiLeaks has stoked some of this concern with breathless tweets. I don’t see events playing out this way. I am as opposed to mass US surveillance as anyone, but I haven’t seen much resistance from the US government participants in this regard. Most of the comments by the US on the draft have been benign. For example, WikiLeaks claimed that the US “stripped” language referring to the UN Human Rights Council; in fact, the US hasn’t stripped anything because it is not in charge (it can only make suggestions), and eliminating the reference to the HRC is actually a good idea because the HRC is a multilateral, not a multistakeholder, body. I expect a strong anti-surveillance statement to be included in the final outcome document. If it is not, it will probably be other governments, not the US, that block it.
  • In my view, the privacy section of the draft still needs work, however. In particular, it is important to cabin the paragraph to address governmental surveillance, not to interfere with voluntary, private arrangements in which users disclose information to receive free services.
  • I expect discussions over net neutrality to be somewhat contentious. Civil society participants are generally for it, with some governments, businesses, parts of the technical community, and yours truly opposed.
  • Although surveillance and net neutrality have received a lot of attention, they are not the important issues at NETmundial. Instead, look for the language that will affect “the future of Internet governance,” which is after all what the meeting is about. For example, will the language on stakeholders’ “respective roles and responsibilities” be stricken? This is language held over from the Tunis Agenda and it has a lot of meaning. Do stakeholders participate as equals or do they, especially governments, have separate roles? There is also a paragraph on “enhanced cooperation,” which is a codeword for governments running the show. Look to see in the final draft if it is still there.
  • Speaking of the final draft, here is how it will be produced: During the meeting, participants will have opportunities to make 2-minute interventions on specific topics. The drafting group will make note of the comments and then retreat to a drafting room to make final edits to the draft. This process, in which select, unaccountable participants have the final say, is of course not really the open governance that many of us want for the Internet. Yet two days is not a long enough time to really have an open, free-wheeling drafting conference. I think the structure of the conference, driven by the perceived need to produce an outcome document with certainty, is unfortunate and somewhat detracts from the legitimacy of whatever will be produced, even though I expect the final document to be OK on substance.
Pre-NETmundial Notes https://techliberation.com/2014/04/18/pre-netmundial-notes/ https://techliberation.com/2014/04/18/pre-netmundial-notes/#comments Fri, 18 Apr 2014 14:29:46 +0000 http://techliberation.com/?p=74411

Next week I’ll be in São Paulo for the NETmundial meeting, which will discuss “the future of Internet governance.” I’ll blog more while I’m there, but for now I just wanted to make a few quick notes.

  • This is the first meeting of its kind, so it’s difficult to know what to expect, in part because it’s not clear what others’ expectations are. There is a draft outcome document, but no one knows how significant it will be or what weight it will carry in other fora.
  • The draft outcome document is available here. The web-based tool for commenting on individual paragraphs is quite nice. Anyone in the world can submit comments on a paragraph-by-paragraph basis. I think this is a good way to lower the barriers to participation and get a lot of feedback.
  • I worry that we won’t have enough time to give due consideration to the feedback being gathered. The meeting is only two days long. If you’ve ever participated in a drafting conference, you know that this is not a lot of time. What this means, unfortunately, is that the draft document may be something of a fait accompli. Undoubtedly it will change a little, but the amount of changes that can be contemplated will be limited due to sheer time constraints.
  • Time will be even more constrained by the absurd amount of time allocated to opening ceremonies and welcome remarks. The opening ceremony begins at 9:30 am and the welcome remarks are not scheduled to conclude until 1 pm on the first day. This is followed by a lunch break, and then a short panel on setting goals for NETmundial, so that the first drafting session doesn’t begin until 2:30 pm. This seems like a mistake.
  • Speaking of the agenda, it was not released until yesterday. While NETmundial has indeed been open to participation by all, it has not been very transparent. An earlier draft outcome document had to be leaked by WikiLeaks on April 8. Not releasing an agenda until a few days before the event is also not very transparent. In addition, the processes by which decisions have been made have not been transparent to outsiders.

See you all next week.

New Paper on the Cybersecurity Framework https://techliberation.com/2014/04/17/new-paper-on-the-cybersecurity-framework/ https://techliberation.com/2014/04/17/new-paper-on-the-cybersecurity-framework/#respond Thu, 17 Apr 2014 14:46:24 +0000 http://techliberation.com/?p=74409

Andrea Castillo and I have a new paper out from the Mercatus Center entitled “Why the Cybersecurity Framework Will Make Us Less Secure.” We contrast emergent, decentralized, dynamic provision of security with centralized, technocratic cybersecurity plans. Money quote:

The Cybersecurity Framework attempts to promote the outcomes of dynamic cybersecurity provision without the critical incentives, experimentation, and processes that undergird dynamism. The framework would replace this creative process with one rigid incentive toward compliance with recommended federal standards. The Cybersecurity Framework primarily seeks to establish defined roles through the Framework Profiles and assign them to specific groups. This is the wrong approach. Security threats are constantly changing and can never be holistically accounted for through even the most sophisticated flowcharts. What’s more, an assessment of DHS critical infrastructure categorizations by the Government Accountability Office (GAO) finds that the DHS itself has failed to adequately communicate its internal categories with other government bodies. Adding to the confusion is the proliferating amalgam of committees, agencies, and councils that are necessarily invited to the table as the number of “critical” infrastructures increases. By blindly beating the drums of cyber war and allowing unfocused anxieties to clumsily force a rigid structure onto a complex system, policymakers lose sight of the “far broader range of potentially dangerous occurrences involving cyber-means and targets, including failure due to human error, technical problems, and market failure apart from malicious attacks.” When most infrastructures are considered “critical,” then none of them really are.

We argue that instead of adopting a technocratic approach, the government should take steps to improve the existing emergent security apparatus. This means declassifying information about potential vulnerabilities and kickstarting the cybersecurity insurance market by buying insurance for federal agencies, which experienced 22,000 breaches in 2012. Read the whole thing, as they say.

How to Privatize the Internet https://techliberation.com/2014/04/02/how-to-privatize-the-internet/ https://techliberation.com/2014/04/02/how-to-privatize-the-internet/#comments Wed, 02 Apr 2014 15:52:08 +0000 http://techliberation.com/?p=74378

Today on Capitol Hill, the House Energy and Commerce Committee is holding a hearing on the NTIA’s recent announcement that it will relinquish its small but important administrative role in the Internet’s domain name system. The announcement has alarmed some policymakers with a well-placed concern for the future of Internet freedom; hence the hearing. Tomorrow, I will be on a panel at ITIF discussing the IANA oversight transition, which promises to be a great discussion.

My general view is that if well executed, the transition of the DNS from government oversight to purely private control could actually help secure a measure of Internet freedom for another generation—but the transition is not without its potential pitfalls.

The NTIA’s technical administration of the DNS’ “root zone” is an artifact of the Internet’s origins as a U.S. military experiment. In 1989, the government began the process of privatizing the Internet by opening it up to general and commercial use. In 1998, the Commerce Department created ICANN to oversee the DNS on a day-to-day basis. The NTIA’s announcement is arguably the culmination of this single decades-long process of privatization.

The announcement also undercuts the primary justification used by authoritarian regimes to agitate for control of the Internet. Other governments have long cited the United States’ unilateral control of the root zone, arguing that they, too, should have roles in governing the Internet. By relinquishing its oversight of the DNS, the United States significantly undermines that argument and bolsters the case for private administration of the Internet.

The United States’ stewardship of the root zone is largely apolitical. This apolitical approach to DNS administration is precisely what is at stake during the transition, hence the three pitfalls the Obama administration must avoid to preserve it.

The first pitfall is the most serious but also the least likely to materialize. Despite the NTIA’s excellent track record, authoritarian regimes like Russia, China, and Iran have long lobbied for the ITU, a clumsy and heavily politicized U.N. technical agency, to take over the NTIA’s duties. In its announcement, the NTIA said it would not accept a proposal from an intergovernmental organization, a clear rebuke to the ITU.

Nevertheless, liberal governments would be wise to send the organization a clear message in the form of much-needed reform. The ITU should adopt the transparency we expect of communications standards bodies, and it should focus on its core competency—international coordination of radio spectrum—instead of on Internet governance. If the ITU resists these reforms at its Plenipotentiary Conference this fall, the United States and other countries should slash funding or quit the Union.

ICANN’s Governmental Advisory Committee (GAC) presents a second pitfall. Indeed, the GAC is already the source of much mischief. For example, France and Luxembourg objected to the creation of the .vin top-level domain on the grounds that “vin” (wine) is a regulated term in those countries. Brazil and Peru have held up Amazon.com’s application for .amazon despite the fact that they previously agreed to the list of reserved place names, and rivers and states were not on it. Last July, the U.S. government, reeling from the Edward Snowden revelations, threw Amazon and the rule of law under the bus at the GAC as a conciliatory measure.

ICANN created the GAC to appease other governments in light of the United States’ outsized role. Since the United States is giving up its special role, the case for the GAC is much diminished. In practice, the limits on the GAC’s power are gradually eroding. ICANN’s board seems increasingly hesitant to overrule it out of fear that governments will go back to the ITU and complain that the GAC “isn’t working.” As part of the transition of the root zone to ICANN, therefore, new limits need to be placed on the GAC’s power. Ideally, ICANN would dissolve the GAC altogether.

The third pitfall comes from ICANN itself. The organization is awash in cash from domain registration fees and new top-level domain name applications—which cost $185,000 each—and when the root zone transition is completed, it will face no external accountability. Long-time ICANN insiders speak of “mission creep,” noting that the supposedly purely technical organization increasingly deals with trademark policy and has aided police investigations in the past, a dangerous precedent.

How can we prevent an unaccountable, cash-rich technical organization from imposing its own internal politics on what is supposed to be an apolitical administrative role? In the long run, we may never be able to stop ICANN from becoming a government-like entity, which is why it is important to support research and experimentation in peer-to-peer, decentralized domain name systems. This matter is under discussion, among other places, at the Internet Engineering Task Force, which may ultimately serve as something of a counterweight to an independent ICANN.

Despite these potential pitfalls, it is time for an Internet that is fully in private hands. The Obama administration deserves credit for proposing to complete the privatization of the Internet, but we must also carefully monitor the process to intercept any blunders that might result in politicization of the root zone.

Toward a Post-Government Internet https://techliberation.com/2014/03/17/toward-a-post-government-internet/ https://techliberation.com/2014/03/17/toward-a-post-government-internet/#comments Mon, 17 Mar 2014 13:41:53 +0000 http://techliberation.com/?p=74294

The Internet began as a U.S. military project. For two decades, the government restricted access to the network to government, academic, and other authorized non-commercial use. In 1989, the U.S. gave up control—it allowed private, commercial use of the Internet, a decision that allowed it to flourish and grow as few could imagine at the time.

Late Friday, the NTIA announced its intent to give up the last vestiges of its control over the Internet, the last real evidence that it began as a government experiment. Control of the Domain Name System’s (DNS’s) Root Zone File has remained with the agency despite the creation of ICANN in 1998 to perform the other high-level domain name functions, called the IANA functions.

The NTIA announcement is not a huge surprise. The U.S. government has always said it eventually planned to devolve IANA oversight, albeit with lapsed deadlines and changes of course along the way.

The U.S. giving up control over the Root Zone File is a step toward a world in which governments no longer assert oversight over the technology of communication. Just as freedom of the printing press was important to the founding generation in America, an unfettered Internet is essential to our right to unimpeded communication. I am heartened to see that the U.S. will not consider any proposal that involves IANA oversight by an intergovernmental body.

Relatedly, next month’s global multistakeholder meeting in Brazil will consider principles and roadmaps for the future of Internet governance. I have made two contributions to the meeting, a set of proposed high-level principles that would limit the involvement of governments in Internet governance to facilitating participation by their nationals, and a proposal to support experimentation in peer-to-peer domain name systems. I view these proposals as related: the first keeps governments away from Internet governance and the second provides a check against ICANN simply becoming another government in control of the Internet.

TacoCopters are Legal (for Now) https://techliberation.com/2014/03/07/tacocopters-are-legal-for-now/ https://techliberation.com/2014/03/07/tacocopters-are-legal-for-now/#comments Fri, 07 Mar 2014 16:08:17 +0000 http://techliberation.com/?p=74283

Yesterday, an administrative judge ruled in Huerta v. Pirker that the FAA’s “rules” banning commercial drones don’t have the force of law because the agency never followed the procedures required to enact them as an official regulation. The ruling means that any aircraft that qualifies as a “model aircraft” plausibly operates under laissez-faire. Entrepreneurs are free for now to develop real-life TacoCopters, and Amazon can launch its Prime Air same-day delivery service.

Laissez-faire might not last. The FAA could appeal the ruling, try to issue an emergency regulation, or simply wait 18 months or so until its current regulatory proceedings culminate in regulations for commercial drones. If the agency opts for the last of these, then the drone community has an interesting opportunity to show that regulations for small commercial drones do not pass a cost-benefit test. So start new drone businesses, but as Matt Waite says, “Don’t do anything stupid. Bad actors make bad policy.”

Kudos to Brendan Schulman, the attorney for Pirker, who has been a tireless advocate for the freedom to innovate using drone technology. He is on Twitter at @dronelaws, and if you’re at all interested in this issue, he is a great person to follow.

What’s Wrong with Two-Sided Markets? https://techliberation.com/2014/02/24/whats-wrong-with-two-sided-markets/ https://techliberation.com/2014/02/24/whats-wrong-with-two-sided-markets/#respond Mon, 24 Feb 2014 14:53:47 +0000 http://techliberation.com/?p=74267

It seems to me that a lot of the angst about the Comcast-Netflix paid transit deal results from a general discomfort with two-sided markets rather than any specific harm caused by the deal. But is there any reason to be suspicious of two-sided markets per se?

Consider a (straight) singles bar. Men and women come to the singles bar to meet each other. On some nights, it’s ladies’ night, and women get in free and get a free drink. On other nights, it’s not ladies’ night, and both men and women have to pay to get in and buy drinks.

There is no a priori reason to believe that ladies’ night is more just or efficient than other nights. The owner of the bar will benefit if the bar is a good place for social congress, and she will price accordingly. If men in the area are particularly shy, she may have to institute a “men’s night” to get them to come out. If women start demanding too many free drinks, she may have to put an end to ladies’ night (even if some men benefit from the presence of tipsy women, they may not be as willing as the women to pay the full cost of all of the drinks). Whether a market should be two-sided or one-sided is an empirical question, and the answer can change over time depending on circumstances.

Some commentators seem to be arguing that two-sided markets are fine as long as the market is competitive. Well, OK. But suppose the singles bar is the only one within a 100-mile radius. How does that change the analysis above? Not at all, I say.

Analysis of two-sided markets can get very complex, but we shouldn’t let that complexity turn into reflexive opposition.

Announcing btcvol.info, Your One-Stop Shop for Bitcoin Volatility Data https://techliberation.com/2014/02/19/announcing-btcvol-info-your-one-stop-shop-for-bitcoin-volatility-data/ https://techliberation.com/2014/02/19/announcing-btcvol-info-your-one-stop-shop-for-bitcoin-volatility-data/#respond Wed, 19 Feb 2014 15:26:11 +0000 http://techliberation.com/?p=74260

The volatility of Bitcoin prices is one of the strongest headwinds the currency faces. Unfortunately, until my quantitative analysis last month, most of the discussion surrounding Bitcoin volatility had been anecdotal. I want to make it easier for people to move beyond anecdotes, so I have created a Bitcoin volatility index at btcvol.info, which I’m hoping can become or inspire a standard metric that people can agree on.

The volatility index at btcvol.info is based on daily closing prices for Bitcoin as reported by CoinDesk. I calculate the difference in daily log prices for each day in the dataset, and then calculate the sample standard deviation of those daily returns for the preceding 30 days. The result is an estimate of how spread out daily price fluctuations are—volatility.
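
For concreteness, here is a minimal sketch of that calculation in Python. It is not the site's actual code; the prices you feed it would come from CoinDesk's daily closes, and the method is just daily log returns followed by a 30-day rolling sample standard deviation:

```python
import math
import statistics

def rolling_volatility(prices, window=30):
    """Rolling volatility of daily log returns.

    `prices` is a list of daily closing prices, oldest first. Returns one
    volatility estimate for each day that has a full `window` of returns.
    """
    # Daily log return: ln(P_t / P_{t-1})
    returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    # Sample standard deviation over each preceding `window` of returns
    return [statistics.stdev(returns[i - window:i])
            for i in range(window, len(returns) + 1)]
```

A flat price series produces zero volatility, and 31 prices yield exactly one 30-day estimate, since 30 returns are needed before the first window is full.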

The site also includes a basic API, so feel free to integrate this volatility measure into your site or use it for data analysis.

I of course hope that Bitcoin volatility becomes much lower over time. I expect both the maturing of the ecosystem as well as the introduction of a Bitcoin derivatives market will cause volatility to decrease. Having one or more volatility metrics will help us determine whether these or other factors make a difference.

You can support btcvol.info by spreading the word or of course by donating via Bitcoin to the address at the bottom of the site.

Tomorrow: Event on Patent Reform https://techliberation.com/2014/02/03/tomorrow-event-on-patent-reform/ https://techliberation.com/2014/02/03/tomorrow-event-on-patent-reform/#respond Mon, 03 Feb 2014 15:03:43 +0000 http://techliberation.com/?p=74233

I am speaking on a panel tomorrow at the Dirksen Senate Office Building at an R Street Institute event on patent reform. Here’s R Street’s description:

The patent reform debate has been painted as one of inventors vs. patent troll victims. Yet these two don’t have to be enemies. We can protect intellectual property, and stomp out patent trolls. If you’re just tuning in, patent trolls are entities that hoard overly broad patents, but do not use them to make goods or services, or advance a useful secondary market. While there’s a place for patent enforcement, these guys take it way too far. These entities maliciously threaten small businesses, inventors, and consumers, causing tens of billions in economic damage each year. Since litigation costs millions of dollars, businesses are forced to settle even when the claim against them is spurious. Fortunately, with growing awareness and support, the patent trolls’ lucrative racket is in jeopardy. With Obama’s patent troll task force, the passage of the Innovation Act in the House, state legislation tackling demand letters, and further action in the courts, we appear to be closer than ever to achieving real reform. Please join us for a lunch and panel discussion of the nature of the patent troll problem, the industries it affects, and the policy solutions being considered. Featuring:
- Zach Graves, Director of Digital Marketing & Policy Analyst, R Street Institute (Moderator)
- Eli Dourado, Research Fellow, Mercatus Center
- Whitaker L. Askew, Vice President, American Gaming Association
- Robin Cook, Assistant General Counsel for Special Projects, Credit Union National Association
- Julie Hopkins, Partner, Tydings & Rosenberg LLP

The festivities begin at noon. The event is open to the public, and you can register here.

Want Drones for the Little Guy? Don’t Overregulate https://techliberation.com/2013/12/17/drones-for-the-little-guy/ https://techliberation.com/2013/12/17/drones-for-the-little-guy/#respond Tue, 17 Dec 2013 20:34:12 +0000 http://techliberation.com/?p=74003

In an op-ed at CNN, Ryan Calo argues that the real drone revolution will arrive when ordinary people can own and operate app-enabled drones. Rather than being dominated by a few large tech companies, drones should develop along the lines of the PC model: they should be purchasable by consumers and they should run third-party software or apps.

The real explosion of innovation in computing occurred when devices got into the hands of regular people. Suddenly consumers did not have to wait for IBM or Apple to write every software program they might want to use. Other companies and individuals could also write a “killer app.” Much of the software that makes personal computers, tablets and smartphones such an essential part of daily life now have been written by third-party developers. […] Once companies such as Google, Amazon or Apple create a personal drone that is app-enabled, we will begin to see the true promise of this technology. This is still a ways off. There are certainly many technical, regulatory and social hurdles to overcome. But I would think that within 10 to 15 years, we will see robust, multipurpose robots in the hands of consumers.

I agree with Ryan that a world where only big companies can operate drones is undesirable. His vision of personal drones meshes well with my argument in Wired that we should see airspace as a platform for innovation.

This is why I am concerned about the overregulation of drones. Big companies like Amazon, Apple, and Google will always have legal departments that will enable them to comply with drone regulations. But will all of us? There are economies of scale in regulatory compliance. If we’re not careful, we could regulate the little guy out of drones entirely—and then only big companies will be able to own and operate them. This is something I’m looking at closely in advance of the FAA proceedings on drones in 2014.

How Much Carbon Does It Take to Keep Ben Bernanke Alive? https://techliberation.com/2013/12/17/bitcoin-carbon/ https://techliberation.com/2013/12/17/bitcoin-carbon/#respond Tue, 17 Dec 2013 16:14:11 +0000 http://techliberation.com/?p=74000

Everyone seems to be worried about Bitcoin’s carbon footprint lately. Last week, an article on Quartz claimed that Bitcoin miners are spending $17 million per day on electricity in order to reap $4.4 million worth of bitcoins. And yesterday, Pando Daily ran a piece that ominously warned about Bitcoin’s carbon footprint.

One problem with both of these pieces is that they seem to rely on electricity consumption estimates from blockchain.info. While this site is great for getting stats about the Bitcoin network, it’s not so great for estimating electricity consumption. Blockchain.info clearly states that it is using an estimate of 650 watts per gigahash [per second, I assume] in its electricity calculations. While this may have been a good estimate of the efficiency of the Bitcoin network when the page was first created, the network has become much more efficient since then. Archive.org shows that the 650W/GH/s figure was used on the earliest cached copy of the page, from December 2, 2011; yes, that is over two years ago.

Furthermore, we can use data from current-generation mining hardware to see how absurd the 650W/GH/s number is. In recent months, the Bitcoin network has mostly switched to application-specific integrated circuits, or ASICs. These devices are much more efficient at mining than previous generations of hardware. A look at this table of mining hardware shows that ASICs all seem to mine at less than 10W/GH/s. Some discontinued models seem to mine as efficiently as 2W/GH/s, and some models that are shipping next year will use less than 0.5W/GH/s. Not everyone in the Bitcoin network is using the latest-generation models of ASICs, and of course botnet mining is based on stealing electricity, so it’s not likely that the network averages 2W/GH/s or less. Nevertheless, it seems that the electricity estimates that these articles are based on may be off by a factor of close to 100.
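
To see how much the efficiency assumption drives the headline numbers, consider a back-of-the-envelope sketch. The hashrate and electricity price below are made-up placeholders, not measured figures; the point is that the ratio between the two estimates depends only on the assumed W/GH/s:

```python
# Illustrative placeholders, not measured network figures.
HASHRATE_GHS = 10_000_000   # assumed network hashrate in GH/s
USD_PER_KWH = 0.10          # assumed electricity price

def daily_cost(watts_per_ghs):
    """Network-wide daily electricity cost in USD at a given efficiency."""
    kwh_per_day = watts_per_ghs * HASHRATE_GHS * 24 / 1000
    return kwh_per_day * USD_PER_KWH

stale_estimate = daily_cost(650)  # blockchain.info's old assumption
asic_estimate = daily_cost(6.5)   # closer to ASIC-era hardware
# The stale assumption inflates the estimate by a factor of 100,
# no matter what hashrate or price we plug in.
factor = stale_estimate / asic_estimate
```

Swap in whatever network average you find plausible; anywhere in the 2–10 W/GH/s range implies the published figures overstate costs by roughly two orders of magnitude.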

Furthermore, we should always ask “compared to what?” Yes, the Bitcoin network uses a lot of electricity, but the computations that use this electricity are used to clear transactions, move money around the blockchain, increment the money supply, etc. In order to make a fair comparison to non-cryptocurrency payment systems, we need to ask how many resources (and how much carbon) are used to keep those systems going. And I think the answer is quite a lot. Banks, too, use computers, sometimes ancient ones, to process transactions. Furthermore, humans use a lot of carbon. Since our financial system uses a lot more human intervention than Bitcoin, much of those humans’ carbon use is due to the financial system. (Another way to put this is that if we all switched to cryptocurrency, those humans would get other jobs and produce other social benefits in exchange for the carbon used to keep them alive.) And there are of course costs of physically moving cash around, for example on armored trucks.

The relevant calculations are admittedly difficult, but it seems quite possible to me, when all is accounted for, that Bitcoin is the green alternative to Federal Reserve Notes. Cryptoanarchy and the environment don’t have to be enemies.

Crovitz Nails It on Software Patents and the Federal Circuit https://techliberation.com/2013/12/16/crovitz-nails-it-on-software-patents-and-the-federal-circuit/ https://techliberation.com/2013/12/16/crovitz-nails-it-on-software-patents-and-the-federal-circuit/#respond Mon, 16 Dec 2013 16:38:42 +0000 http://techliberation.com/?p=73994

Gordon Crovitz has an excellent column in today’s Wall Street Journal in which he accurately diagnoses the root cause of our patent litigation problem: the Federal Circuit’s support for extensive patenting in software.

Today’s patent mess can be traced to a miscalculation by Jimmy Carter, who thought granting more patents would help overcome economic stagnation. In 1979, his Domestic Policy Review on Industrial Innovation proposed a new Federal Circuit Court of Appeals, which Congress created in 1982. Its first judge explained: “The court was formed for one need, to recover the value of the patent system as an incentive to industry.” The country got more patents—at what has turned out to be a huge cost. The number of patents has quadrupled, to more than 275,000 a year. But the Federal Circuit approved patents for software, which now account for most of the patents granted in the U.S.—and for most of the litigation. Patent trolls buy up vague software patents and demand legal settlements from technology companies. Instead of encouraging innovation, patent law has become a burden on entrepreneurs, especially startups without teams of patent lawyers.

I was pleased that Crovitz cites my new paper with Alex Tabarrok:

A system of property rights is flawed if no one can know what’s protected. That’s what happens when the government grants 20-year patents for vague software ideas in exchange for making the innovation public. In a recent academic paper, George Mason researchers Eli Dourado and Alex Tabarrok argued that the system of “broad and fuzzy” software patents “reduces the potency of search and defeats one of the key arguments for patents, the dissemination of information about innovation.”

Current legislation in Congress makes changes to patent trial procedure in an effort to reduce the harm caused by patent trolling. But if we really want to solve the trolling problem once and for all, and to generally have a healthy and innovative patent system, we need to get at the problem of low-quality patents, especially in software. The best way to do that is to abolish the Federal Circuit, which has consistently undermined limits on patentable subject matter.

Using Bayes’s Rule to Think About a Bitcoin Bubble https://techliberation.com/2013/12/13/bitcoin-bubble-bayes/ https://techliberation.com/2013/12/13/bitcoin-bubble-bayes/#respond Fri, 13 Dec 2013 18:21:06 +0000 http://techliberation.com/?p=73981

Is there a Bitcoin bubble? Jason Kuznicki thinks so and believes that he has conclusive proof. He posts three graphs that show, more or less, that there is a lot of speculation in Bitcoin. But does speculation prove that there’s a bubble? Let’s use Bayes’s rule to think about this carefully.

Bayes’s rule is a mathematical tool for thinking about the incorporation of new evidence into subjective probabilities. Let’s suppose that there is some proposition A for which you have a prior belief. Somebody offers evidence B for or against A. How much should you change your belief in A based on evidence B?

Bayes’s rule boils the answer down to a simple mathematical form:

P(A|B) = P(B|A) × P(A) / P(B)

In English, the probability of A given B equals the probability of B given A, times the probability of A, divided by the probability of B.

So to evaluate Jason’s argument and see how much we should change our estimate of a Bitcoin bubble based on the evidence that there’s speculation, we can simply assign the proposition and the evidence to A and B. In this case, A is the proposition that there’s a bubble, and B is the evidence that there’s speculation in Bitcoin. If we figure out our subjective probabilities for B|A and B, we can use those to determine how different P(A|B) should be from P(A).

So what is B|A? Since B is the evidence that there is speculation in Bitcoin and A is the proposition that there is a bubble, B|A simply states the proposition that, given that there is a bubble, there is speculation. It seems pretty much impossible to have a bubble without speculation, so I’ll go with a subjective probability of 1. Picking a different value here will only work against Jason’s argument.

So what is the probability of B, the fact that there is speculation in Bitcoin? The Bitcoin ecosystem isn’t built out yet. Most of the protocol’s most exciting uses haven’t even seen the light of day. As I blogged last week, multisignature transactions are barely in use, but they form the foundation for a decentralized architecture of arbitration. Ed Felten at Princeton is working on decentralized prediction markets. Jerry Brito points to microtransactions, or even nearly-continuous transactions, as another exciting future use scenario.

Given that we don’t know whether this ecosystem will ever materialize, holders of bitcoin are necessarily speculating. If the ecosystem matures and is useful, bitcoins will be worth something. If none of these innovations come about, or if we decide they’re not that useful after all, then bitcoins will probably be worth nothing. There’s no way out of speculating, because we simply don’t know for sure if the ecosystem will come along. Almost the entire “fundamental value” of Bitcoin rests on future events.

So the probability of B, I think, is 1. When P(B|A) is 1, and P(B) is 1, what does Bayes’s rule reduce to?

P(A|B) = P(A)

B simply offers no information as to whether A is true.
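
The arithmetic is easy to check. In this sketch the prior is an arbitrary illustration; the point is only that a likelihood of 1 divided by an evidence probability of 1 leaves any prior unchanged:

```python
def posterior(p_b_given_a, p_a, p_b):
    """Bayes's rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

prior = 0.3  # arbitrary prior belief in a bubble
# Speculation is certain whether or not there is a bubble,
# so observing speculation moves the prior not at all.
assert posterior(1.0, prior, 1.0) == prior
```

Evidence only shifts a belief when it is more likely under the hypothesis than it is overall, i.e., when P(B|A)/P(B) differs from 1.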

A similar argument can be made when Bitcoin’s volatility is offered as evidence of a bubble. Bitcoin is a thinly-traded asset where supply does not adjust to accommodate demand. It is going to be volatile. So the fact that Bitcoin is volatile adds no new information to the question of whether it’s a bubble.

What does provide information? I think the most reliable evidence is on the maturation (or not) of the Bitcoin ecosystem. If Bitcoin seemed static right now, I would interpret that as evidence of a bubble. But it doesn’t. Every day, people are working to build businesses that leverage some of the unique features of Bitcoin’s protocol. As long as that continues, I think it’s most reasonable to be highly agnostic about the correct price of Bitcoin.

Stop Saying Bitcoin Transactions Aren’t Reversible https://techliberation.com/2013/12/04/bitcoin-arbitration/ https://techliberation.com/2013/12/04/bitcoin-arbitration/#respond Wed, 04 Dec 2013 19:31:35 +0000 http://techliberation.com/?p=73917

One of the criticisms leveled at Bitcoin by those people determined to hate it is that Bitcoin transactions are irreversible. If I buy goods from an anonymous counterparty online, what’s to stop them from taking my bitcoins and simply not sending me the goods? When I buy goods online using Visa or American Express, if the goods never arrive, or if they aren’t what was advertised, I can complain to the credit card company. The company will do a cursory investigation, and if they find that I was indeed likely ripped off, they will refund me my money. Credit card transactions are reversible; Bitcoin transactions are not. For this service (among others), credit card companies charge merchants a few percentage points on the transaction.

The problem with this account is that it’s not true: Baked into the Bitcoin protocol, there is support for what are known as “m-of-n” or “multisignature” transactions, transactions that require some number m out of some higher number n of parties to sign off.

The simplest variant is a 2-of-3 transaction. Let’s say that I want to buy goods online from an anonymous counterparty. I transfer money to an address jointly controlled by me, the counterparty, and a third-party arbitrator (maybe even Amex). If I get the goods, they are acceptable, and I am honest, I sign the money away to the seller. The seller also signs, and since 2 out of 3 of us have signed, he receives his money. If there is a problem with the goods or if I am dishonest, I sign the bitcoins back to myself and appeal to the arbitrator. The arbitrator, like a credit card company, will do an investigation, make a ruling, and either agree to transfer the funds back to me or to the merchant; again, 2 of 3 parties must agree to transfer the funds.
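
As a toy model of that spending rule (plain Python for illustration; real Bitcoin multisig is enforced by an OP_CHECKMULTISIG script, not application code, and the party names here are mine):

```python
# Toy 2-of-3 multisignature spending rule.
REQUIRED = 2
PARTIES = {"buyer", "seller", "arbitrator"}

def can_spend(signatures):
    """Funds move only if at least 2 of the 3 named parties sign."""
    valid = set(signatures) & PARTIES
    return len(valid) >= REQUIRED

# Happy path: buyer and seller both sign, so the seller is paid.
assert can_spend({"buyer", "seller"})
# Dispute: the arbitrator sides with the buyer for a refund.
assert can_spend({"buyer", "arbitrator"})
# No single party, including the arbitrator, can move funds alone.
assert not can_spend({"arbitrator"})
```

The key property is visible in the last line: the arbitrator's signature is only ever one of the two required, so he can break ties but never take the money himself.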

This is not an escrow service; at no point can the arbitrator abscond with the funds. The arbitrator is paid a market rate in advance for his services, which are offered according to terms agreed upon by all three parties. This is better than the equivalent service using credit cards, because credit cards rely on huge network effects and consequently there are only a handful of suppliers of such transaction arbitration. Using Bitcoin, anyone can be an arbitrator, including the traditional credit card companies (although they might have to lower their fees). Competition in both terms and fees is likely to result in better discovery of efficient rules for dispute resolution.

While multisignature transactions are not well understood, they are right there in the Bitcoin protocol, as much a valid Bitcoin transaction as any other. So some Bitcoin transactions are irreversible; others are reversible, exactly as reversible as credit card transactions are.

Bitrated.com is a new site (announced yesterday on Hacker News) that facilitates setting up multisignature transactions. Bitcoin client support for multisignature transactions is limited, so the site helps create addresses that conform to the m-of-n specifications. At no point does the site have access to the funds in the multisignature address.

In addition, Bitrated provides a marketplace where people can advertise their arbitration services. Users are able to set up transactions using arbitrators both from the site or from anywhere else. The entire project is open source, so if you want to set up a competing directory, go for it.

What excites me most about the decentralized arbitration afforded by multisignature transactions is that it could be the beginnings of a Common Law for the Internet. The plain, ordinary Common Law developed as the result of competing courts that issued opinions basically as advertisements of how fair and impartial they were. We could see something similar with Bitcoin arbitration. If arbitrators sign their transactions with links to and a cryptographic hash of a PDF that explains why they ruled as they did, we could see real competition in the articulation of rules. Over time, some of these articulations could come to be widely accepted and form a body of Bitcoin precedent. I look forward to reading the subsequent Restatements.

Multisignature transactions are just one of the many innovations buried deep in the Bitcoin protocol that have yet to be widely utilized. As the community matures and makes full use of the protocol, it will become more clear that Bitcoin is not just a currency but a platform for financial innovation.

Originally posted at elidourado.com.

The Great Disintermediation https://techliberation.com/2013/12/02/the-great-disintermediation/ https://techliberation.com/2013/12/02/the-great-disintermediation/#comments Mon, 02 Dec 2013 20:30:29 +0000 http://techliberation.com/?p=73902

Yesterday at Forbes, William Pentland had an interesting piece on possible disintermediation in the electricity market.

In New York and New England, the price of electricity is a function of the cost of natural gas plus the cost of the poles and wires that carry electrons from remotely-sited power plants to end users. It is not unusual for customers to spend two dollars on poles and wires for every dollar they spend on electrons. The poles and wires that once reduced the price of electricity for end users are now doing the opposite. To make matters worse, electricity supplied through the power grid is frequently less reliable than electricity generated onsite. In other words, rather than adding value in the form of enhanced reliability, the poles and wires diminish the reliability of electricity.

If two-thirds of the cost of electricity is the distribution mechanism, then, as Pentland notes, there is a palpable opportunity to switch to at-home electricity generation. Some combination of solar power, batteries, and natural gas-fired backup generators could displace the grid entirely for some customers. And if I understand my electricity economics correctly, when a significant fraction of customers goes off-grid, the fixed cost of maintaining the grid will be split over fewer remaining customers, making centrally generated electricity even more expensive. The market for such electricity could quickly unravel.
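The unraveling dynamic is easy to see in a toy model. All numbers below are hypothetical; the point is only that a fixed grid cost divided among fewer customers raises the per-customer price, which pushes still more customers off-grid:

```python
# Hypothetical annual figures: the fixed cost of the poles and wires
# is shared by whoever remains on the grid.
FIXED_COST = 200_000_000   # poles and wires ($/year)
ENERGY_COST = 500          # per-customer cost of the electrons ($/year)

for remaining in (1_000_000, 750_000, 500_000, 250_000):
    price = ENERGY_COST + FIXED_COST / remaining
    print(f"{remaining:>9,} customers -> ${price:,.0f} per customer-year")

# 1,000,000 customers -> $700 per customer-year
#   750,000 customers -> $767 per customer-year
#   500,000 customers -> $900 per customer-year
#   250,000 customers -> $1,300 per customer-year
```

Each round of defections makes staying on the grid less attractive for everyone who remains, which is the "death spiral" worry for centrally generated electricity.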

While it remains to be seen whether electricity generation will indeed become decentralized, such disintermediation would be the continuation of a decades-long social trend. It all began (plausibly) in 1984. The Macintosh was released, and desktop computing became a thing. Desktop printers disintermediated printing departments, Kinko's, and the steno pool. The Internet has disintermediated telephone companies, music labels, television networks, newspapers, and much more. Online education is unbundling university courses.

What’s even more exciting is the next generation of disintermediating technologies. Bitcoin could displace some financial institutions—to varying degrees, banks, the Federal Reserve, Western Union, and credit card companies. Mesh networks could solve the last-mile problem of Internet service delivery, which tends to be monopolized or at least concentrated. 3D printers could disintermediate supply chains. 3D chemical printers could disintermediate drug companies and the FDA.

Delivery drones like Amazon Prime Air's arguably disrupt package delivery services, though not entirely, because FedEx and UPS will still run drone-utilizing distribution networks. More importantly, delivery drones disintermediate the real estate market for small businesses. It will no longer be important, if you run a local business, to have a storefront in a prime location. Your customers can order online, and items can be delivered to them in half an hour, straight from the factory or artisanal workshop. It could be the Etsyfication of the economy.

If information, electricity, money, and production all get disintermediated, what is left? If these trends continue, the future will be one in which human interaction is unmediated, and to a surprising degree, unregulable. It will be difficult to stop a willing buyer and seller from transacting. Information about the proposed transaction might not be censorable. Payment via Bitcoin or other cryptocurrencies can’t be stopped. Production and delivery of the item may be difficult or impossible to detect and intercept.

Intermediaries are often used by governments as points of control. As we shed intermediaries, it may become possible to live one’s entire life without any particular authority even knowing that one exists. I doubt that we’ll ever get that far in the process, because using non-abusive intermediaries often makes economic sense. But for the next few decades, at least, I expect the trend to continue and the world to get a lot more interesting.

Originally posted at elidourado.com.
