Articles by Ryan Radia

Ryan is associate director of technology studies at the Competitive Enterprise Institute, where his work focuses on adapting law and policy to the unique challenges of the information age. His research areas include privacy, intellectual property, telecommunications, competition policy, and media regulation.


By Ryan Radia & Berin Szoka

Today a broad array of civil liberties groups, think tanks, and technology companies launched the Digital Due Process coalition. The coalition’s mission is to educate lawmakers and the public about the need to update U.S. privacy laws to better safeguard individual information online and ensure that federal privacy statutes accurately reflect the realities of the digital age.

Over 20 organizations belong to the Digital Due Process coalition, including such odd bedfellows as AT&T, Google, Microsoft, the Center for Democracy & Technology, the American Civil Liberties Union, the Electronic Frontier Foundation, The Progress & Freedom Foundation (where Berin works), the Competitive Enterprise Institute (where Ryan works), the Information Technology & Innovation Foundation, Citizens Against Government Waste, and Americans for Tax Reform. The full member list is available at the coalition’s website.

Amidst the heated tech policy wars, it’s not every day that such a diverse group of organizations comes together to endorse a unified set of core principles for legislative reform. Over two years in the making, the Digital Due Process coalition, spearheaded by the Center for Democracy & Technology, is a testament to the broad consensus that’s emerged among business leaders, activists, and scholars regarding the inadequacies of the current legal regime intended to protect Americans’ privacy from government snooping and the need for Congress to revisit decades-old privacy statutes. It also represents a revival of a bipartisan consensus on the need for reform reached back in 2000, when the Republican-led House Judiciary Committee voted 20-1 to approve very similar reforms (HR 5018).

Today, in the digital age, robust privacy laws are more important than ever. That’s because U.S. courts have been unwilling to extend the Fourth Amendment’s protection against unreasonable searches and seizures to information stored with third parties such as cloud computing providers. Thus, while government authorities must get a search warrant based on probable cause before they can lawfully rifle through documents stored in your desk, basement, or safe deposit box, information you store in the cloud enjoys no constitutional protection. (Some legal scholars argue that this interpretation of the Fourth Amendment, known as the third-party doctrine, is outdated and deficient. See, for example, Jim Harper’s excellent 2008 article in the American University Law Review.)

Continue reading →

Should ISPs be barred under net neutrality from discriminating against illegal content? Not according to the FCC’s draft net neutrality rule, which defines efforts by ISPs to curb the “transfer of unlawful content” as reasonable network management. This exemption is meant to ensure providers have the freedom to filter or block unlawful content like malicious traffic, obscene files, and copyright-infringing data.

EFF and Public Knowledge (PK), both strong advocates of net neutrality, are not happy about the copyright infringement exemption. The groups have urged the FCC to reconsider what they describe as the “copyright loophole,” arguing that copyright filters amount to “poorly designed fishing nets.”

EFF’s and PK’s concerns about copyright filtering aren’t unreasonable. While filtering technology has come a long way over the last few years, it remains a fairly crude instrument for curbing piracy and suffers from false positives, because it’s remarkably difficult to distinguish unauthorized copies of copyrighted works from similar non-infringing files. And because filters generally flag files automatically, without human review, they can disrupt lawful uses of copyrighted material, such as fair use, even when they correctly identify the underlying work.
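
To make the problem concrete, here’s a deliberately naive Python sketch of a fingerprint-based filter. Everything in it is hypothetical (real filters use far more sophisticated acoustic or perceptual fingerprinting), but it shares their fundamental blind spot: identical bytes can be an infringing upload, a licensed copy, or raw material for a fair use, and the filter can’t tell the difference.

```python
import hashlib

# Hypothetical database of fingerprints of known copyrighted works.
BLOCKLIST = {hashlib.sha256(b"full copyrighted song").hexdigest()}

def fingerprint(data: bytes) -> str:
    """Reduce a file to a fixed-size fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_infringing(data: bytes) -> bool:
    """Flag a transfer whose fingerprint matches a known work.

    Note what this check cannot see: whether the sender owns a
    license, or whether the use is a lawful fair use (criticism,
    parody, teaching). The bytes are identical either way.
    """
    return fingerprint(data) in BLOCKLIST

# A licensed copy gets flagged (a false positive for infringement)...
print(is_infringing(b"full copyrighted song"))         # True
# ...while a trivially altered pirate copy sails through (a false negative).
print(is_infringing(b"full copyrighted song" + b"!"))  # False
```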

Despite copyright filtering technology’s imperfections, however, outlawing it is the wrong approach. At its core, ISP copyright filtering represents a purely private, voluntary method of dealing with the great intellectual property challenge. This is exactly the sort of approach advocates of limited government should embrace. As Adam and Wayne argued back in 2001:

To lessen the reliance on traditional copyright protections, policymakers should ensure that government regulations don’t stand in the way of private efforts to protect intellectual property.

Continue reading →

A couple weeks ago the Google Books Settlement fairness hearing took place in New York City, where Judge Denny Chin heard dozens of oral arguments discussing the settlement’s implications for competition, copyright law, and privacy. The settlement raises a number of very challenging legal questions, and Judge Chin’s decision, expected to come down later this spring, is sure to be a page-turner no matter how he rules.

My work on the Google Books Settlement has focused on reader privacy concerns, which have been a major point of contention between Google and civil liberties groups like EFF, ACLU, and CDT. While I agree with these groups that existing legal protections for sensitive user information stored by cloud computing providers are inadequate, I do not believe that reader privacy should factor into the court’s decision on whether to approve or reject the settlement.

I elaborated on reader privacy in an amicus curiae brief I submitted to the court last September. I argued that because Google Books will likely earn a sizable portion of its revenues from advertising, placing strict limits on data collection (as EFF and others have advocated) would undercut Google’s incentive to scan books, ultimately hurting the very authors whom the settlement is supposed to benefit. While the settlement is not free from privacy risks, such concerns aren’t unique to Google Books, nor are they any more serious than the risks surrounding popular Web services like Google search and Gmail. Comparing Google Book Search to brick-and-mortar libraries is inapt, and like all cloud computing providers, Google has a strong incentive to safeguard user data and use it only in ways that benefit users and advertisers.

Continue reading →

It’s been a busy week in the Googlesphere. Google made headlines earlier this week when it aired a televised ad for the first time in the company’s history, and again yesterday when it unveiled Buzz, its new social networking platform. Today, Google announced bold plans to build an experimental fiber-to-the-home broadband network that’s slated to eventually deliver a whopping one gigabit per second of Internet connectivity to as many as 500,000 Americans.

Google’s ambitious broadband announcement comes as welcome news for anybody who pines for greater broadband competition and, more broadly, infrastructure wealth creation in America. To date, Google has dabbled in broadband in the form of metro Wi-Fi, but hasn’t embarked on anything of this scale. Laying fiber to residences is not cheap or easy, as Verizon has learned the hard way, and Google will undoubtedly have to devote some serious resources to this experiment if it is to realize its lofty goals.

It’s important to remember, however, that Google is first and foremost a content company, not an infrastructure company. Google’s generally awesome products, from search to video to email, attract masses of loyal users. In turn, advertisers flock to Google, spending billions in hopes of reaching its gigantic, precisely targetable audience. This business model enables Google to invest in developing a steady stream of free services, like Google Voice, Google Apps, and Google Maps Navigation.

So it won’t be too surprising if Google’s broadband experiment doesn’t initially generate enough revenue to cover its costs. In fact, I’m skeptical that Google even anticipates its network will ever become a profit center. Rather, chances are Google won’t be at all concerned if its broadband service doesn’t break even as long as it bolsters the Google brand and spurs larger telecom companies to get more aggressive in upgrading their broadband speeds (which, indirectly, benefits Google).

Google’s broadband agenda is great news for consumers, of course. Who can complain if Google is willing to invest in building a fiber-to-the-home broadband network and is willing to charge below-cost prices? Not me!

Continue reading →

The Ticketmaster-Live Nation antitrust saga has come to a bittersweet end. Earlier this week the Justice Department finally approved the merger between the two firms, just shy of one year after it was announced.

While a number of antitrust experts had speculated that the Justice Department might seek an injunction to block the deal outright, the DoJ ultimately opted to approve the deal while subjecting Ticketmaster-Live Nation to several conditions that are supposed to promote competition in the events marketplace. Under the terms of the consent decree, the combined firm will be required to license its ticketing software to competitor Anschutz Entertainment Group and divest Paciolan, a ticketing subsidiary of Ticketmaster. Ticketmaster-Live Nation also faces ten years of monitoring by antitrust officials to “prevent anticompetitive bundling of services.”

Ticketmaster has long been a controversial firm among concertgoers, frequently drawing consumers’ ire for charging hefty “convenience” fees and offering customer service that’s not exactly stellar. But it’s important to remember that today’s entertainment market is more fragmented than ever, and consumers have a huge array of choices for listening to music and viewing live events. Even YouTube is getting into the business of airing live events. The video site has broadcast several live events already, including U2’s Rose Bowl performance in October 2009, and is eyeing the pay-per-view live streaming market as well.

So it’s not hard to see why consolidation is taking place in the event ticketing and promotion markets. Economists have demonstrated that vertical integration, done properly, often results in sizable efficiencies, translating into overall welfare gains for consumers. Together, Ticketmaster and Live Nation are in a stronger position than before to offer value to event venues and promote concerts and shows. And as much as we all hate service fees, in industries characterized by high fixed costs and declining marginal unit costs – like ticketing – big per-unit “markups” are often necessary to induce businesses to compete and innovate. While Ticketmaster may not be the most innovative company in the world, the firm faces an uncertain future as its contracts with venues come up for renewal. If Ticketmaster really is harming concertgoers – and by the way, there’s no clear evidence that it is – it will be disciplined not only by concert lovers, but by venues and artists as well. Derailing a potentially efficient business arrangement simply because it might not work out, whether in the event ticketing market or the cable television market, results in harm to consumers.
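
A back-of-envelope calculation, with purely hypothetical numbers, shows why pricing at marginal cost doesn’t work in such industries:

```python
# Hypothetical figures for a high-fixed-cost ticketing business.
fixed_costs = 50_000_000   # annual platform, staffing, and contract costs (assumed)
marginal_cost = 0.25       # incremental cost of issuing one more ticket (assumed)
tickets_sold = 40_000_000  # annual ticket volume (assumed)

# Average cost per ticket: spread fixed costs over volume, add marginal cost.
average_cost = fixed_costs / tickets_sold + marginal_cost
print(f"average cost per ticket: ${average_cost:.2f}")  # $1.50

# Pricing at marginal cost ($0.25) would lose the entire $50M every year;
# the per-ticket "markup" above marginal cost is what recovers the fixed investment.
markup = average_cost - marginal_cost
print(f"break-even markup per ticket: ${markup:.2f}")   # $1.25
```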

Continue reading →

Last Wednesday, Holman Jenkins penned a column in The Wall Street Journal about net neutrality (Adam discussed it here). In response, I have a letter to the editor in today’s Journal:

To the Editor:

Mr. Jenkins suggests that Google would likely “shriek” if a startup were to mount its servers inside the network of a telecom provider. Google already does just that. It is called “edge caching,” and it is employed by many content companies to keep costs down.

It is puzzling, then, why Google continues to support net neutrality. As long as Google produces content that consumers value, they will demand an unfettered Internet pipe. Political battles aside, content and infrastructure companies have an inherently symbiotic relationship.

Fears that Internet providers will, absent new rules, stifle user access to content are overblown. If a provider were to, say, block or degrade YouTube videos, its customers would likely revolt and go elsewhere. Or they would adopt encrypted network tunnels, which route around Internet roadblocks.

Not every market dispute warrants a government response. Battling giants like Google and AT&T can resolve network tensions by themselves.

Ryan Radia

Competitive Enterprise Institute

Washington
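
(For the technically curious: edge caching is conceptually simple. Here’s a toy Python sketch, with made-up cost figures, of why serving popular content from a cache racked inside an ISP’s network keeps costs down.)

```python
# Assumed per-gigabyte costs; the real numbers vary, but the gap is the point.
TRANSIT_COST_PER_GB = 0.05   # fetching over the long-haul backbone
LOCAL_COST_PER_GB = 0.005    # serving from a cache inside the ISP's network

cache: dict[str, bytes] = {}  # url -> content, held on the in-network server

def serve(url: str, origin_fetch) -> float:
    """Serve a request, returning its delivery cost.

    The first request for a URL crosses the backbone to the origin;
    every later request is answered locally from the cache.
    """
    if url not in cache:
        cache[url] = origin_fetch(url)
        return TRANSIT_COST_PER_GB
    return LOCAL_COST_PER_GB

# 1,000 viewers of the same 1 GB video: one backbone fetch, 999 local hits.
costs = [serve("/video.mp4", lambda u: b"...") for _ in range(1000)]
print(f"cached: ${sum(costs):.2f} vs uncached: ${1000 * TRANSIT_COST_PER_GB:.2f}")
# roughly $5 cached vs $50 uncached
```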

To be sure, the market for residential Internet service is not all that competitive in some parts of the country — Rochester, New York, for instance — so a provider might in some cases be able to get away with unsavory practices for a sustained period without suffering the consequences. Yet ISP competition is on the rise, and a growing number of Americans have access to three or more providers. This is especially true in big cities like Chicago, Baltimore, and Washington, D.C.

Instead of trying to put a band-aid on problems that stem from insufficient ISP competition, the FCC should focus on reforming obsolete government rules that prevent ISP competition from emerging. Massive swaths of valuable spectrum remain unavailable to would-be ISP entrants, and municipal franchising rules make it incredibly difficult to lay new wire in public rights-of-way for the purpose of delivering bundled data and video services.

FOXNews.com has just published an editorial that I penned about Monday’s net neutrality announcement from the FCC.

Does Obama Want to Control the Internet?

by Ryan Radia

The federal government may gain broad new powers to regulate Internet providers next month if Federal Communications Commission Chairman Julius Genachowski gets his way. In a milestone speech on Monday, Genachowski proposed sweeping new regulations that would give the FCC the formal authority to dictate application and network management practices to companies that offer Internet access, including wireless carriers like AT&T and Verizon Wireless.

Genachowski’s proposed rules would make good on a pledge that President Obama made in his campaign to enshrine net neutrality as law. The announcement was met with cheers by a small but vocal crowd of activists and academics who have been pushing hard for net neutrality for years. But if bureaucrats and politicians truly care about neutrality, they would be wise to resist calls to expand the government’s power over private networks. Instead, policymakers should recognize that it is far more important for government to remain neutral to competing business models — open, closed, or any combination thereof.

Continue reading →

September 8 — this Tuesday — is the deadline for filing objections against the Google Book Settlement. A number of trade associations, corporations, authors, and advocacy groups have weighed in, including the Electronic Frontier Foundation and the American Civil Liberties Union. They argue that approving the Google Book Settlement in its current form, without explicitly spelling out data collection practices, would endanger user privacy. EFF and ACLU have threatened to file an objection to the Settlement unless Google commits to a stringent privacy policy for Google Book Search.

I think the privacy risks posed by Google Book Search are being blown out of proportion, as I explained in the Examiner Opinion Zone last month. While EFF and others have raised some legitimate fears about the possibility of government getting its hands on Google Book Search user data, these privacy concerns are not unique to Google Book Search, nor are they legitimate grounds for the court to reject the Google Book Settlement.

In a letter I submitted yesterday as an amicus curiae brief to U.S. District Judge Denny Chin, who is presiding over the Google Books case, I argue that privacy concerns should not determine the court’s evaluation of the Settlement:

Competitive Enterprise Institute Letter

A number of conservative blogs have picked up on reports that the Obama administration is looking to data mine users on social networking sites. Reports CNS News:

Anyone who posts comments on the White House’s Facebook, MySpace, YouTube and Twitter pages will have their statements captured and permanently archived by the federal government, according to a plan that the White House is now seeking a contractor to carry out.

Whenever government is collecting information about private citizens, we should be concerned. But this controversy smells a lot like privacy fear-mongering, even though it involves government. If you post a comment to an “official” Obama administration page on a social networking site, it seems only natural that it’s fair game for data mining. The same goes if you post a video response on a publicly accessible site.

If you’re posting controversial statements online under your real name for the public to see, what do you expect will happen? Anybody in the world who has an Internet connection can log your postings, so why shouldn’t government officials be able to do the same? Until government starts pressuring Facebook or MySpace to hand over data that’s being collected on an involuntary basis, I don’t see a whole lot here to worry about.

This controversy, and the flap over flag@whitehouse.gov from a few weeks back, raise another interesting question: should Congress reexamine the Presidential Records Act (PRA) of 1978? This is the law that governs presidential record-keeping. According to some commentators, if the administration solicits data on its critics, it is obligated under the PRA to retain that data indefinitely. I haven’t read the law, but at first glance it appears that it may have some serious deficiencies. This is hardly surprising, of course, given that the Internet — let alone social networks — didn’t even exist when the PRA was enacted.

You’d think that in 2009, when global networks are handling exabytes of data in a single day and OC-192 fiber optic connections crisscross the planet, the FCC — the most important communications agency in the United States — would at least be able to use modern technology to stream its own public meetings.

Nope. The FCC is still streaming its webcasts with RealPlayer, a horrendous and arguably obsolete application that fell out of favor with techies years ago and has since been overtaken by superior streaming platforms like Adobe’s Flash Media Server.

Today’s big tech news item is the FCC’s “three-pronged probe” of the wireless industry, which was announced at this morning’s Open Commission Meeting.

Want to watch the FCC’s meeting and see what our “public servants” in Washington are up to? Good luck. The FCC’s streaming video server only supports 200 simultaneous connections.

In a nation of 270 million wireless users, why not offer, say, 1,000 or even 10,000 connections? Given the agency’s $339 million budget, that’s not too much to ask, is it?
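
Some rough arithmetic, using assumed figures for a modest-quality stream, suggests it isn’t:

```python
viewers = 10_000        # the concurrency the FCC could be offering (assumed)
stream_kbps = 500       # bitrate of a modest 2009-era video stream (assumed)
meeting_hours = 2       # length of a typical open meeting (assumed)

# Peak bandwidth: 10,000 viewers * 500 kbps = 5 Gbps.
bandwidth_gbps = viewers * stream_kbps / 1_000_000
print(f"peak bandwidth: {bandwidth_gbps:.1f} Gbps")

# Total data moved: gigabits per second * seconds, converted to terabytes.
data_tb = bandwidth_gbps * meeting_hours * 3600 / 8 / 1000
print(f"data transferred: {data_tb:.1f} TB per meeting")  # 4.5 TB

# At an assumed ~$0.10/GB for commodity content delivery circa 2009:
cdn_cost = data_tb * 1000 * 0.10
print(f"rough delivery bill: ${cdn_cost:.0f} per meeting")  # about $450
```

Even if those assumptions are off by an order of magnitude, the delivery bill rounds to zero against a $339 million budget.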

It’s especially ironic that the FCC still struggles with streaming webcasts given that the FCC is launching an investigation of alleged “anti-competitive” practices in the wireless industry. Why isn’t the FCC investigating its own inability to accomplish relatively simple tasks, like stream live video or run a halfway decent website?

The FCC doesn’t just use RealPlayer for Open Commission Meetings. Even the FCC’s “Broadband Workshops” — which are supposedly going to guide the future of broadband deployment in America — are using the same tired streaming platform.

Of course, in the grand scheme of things, the platform the FCC uses for streaming video isn’t all that important. But it is a much-needed reminder that bureaucrats in Washington aren’t very good at keeping pace with modern technology. Unfortunately, many seem to have forgotten this fact.

ADDENDUM: Turns out the FCC does use a modern platform, Cisco WebEx, for streaming open commission meetings (accessible via www.broadband.gov), but it offers only RealPlayer streams on the official FCC.gov website. Also, once meetings are finished, the recordings are available online exclusively in RealVideo format.