December 2008

This is just a listing of the installments of my ongoing “Media Deconsolidation Series.” I needed to create a single repository of all the essays so I could point back to them in future articles and papers. For those not familiar with it, this series represents an effort to set the record straight regarding the many myths surrounding the media marketplace. These myths are usually propagated by a group of radical anti-media regulatory activists who I call the “media reformistas.” Sadly, however, many policymakers, journalists, and members of the public are buying into some of these myths, too.

In particular, I have spent much time here debunking the notion that rampant consolidation is taking place and that media operators are only growing larger and devouring more and more companies. In fact, nothing could be further from the truth. Over the past several years, traditional media operators and sectors have been coming apart at the seams in the face of unprecedented innovation and competition. The volume of divestiture activity has been quite intense, and most traditional media operators have been getting smaller, not bigger. As a result, America’s media marketplace is growing more fragmented and atomistic with each passing day.

Anyway, here’s the series so far…


You can tell I like my writing when I take a sentence from a post and make it the title.

Annnyway, my brief comment on the whistleblower who outed “Stellar Wind” is on the Cato@Liberty blog.

Very happy to see the discussion over The Wall Street Journal’s Google/net neutrality story. Always good to see holes poked and the truth set free.

But let’s not allow the eruptions, backlashes, recriminations, and “debunkings” — This topic has been debunked. End of story. Over. Sit down! — obscure the still-fundamental issues. This is a terrific starting point for debate, not an end.

Content delivery networks (CDNs) and caching have always been a part of my analysis of the net neutrality debate. Here was testimony that George Gilder and I prepared for a Senate Commerce Committee hearing almost five years ago, in April 2004, where we predicted that a somewhat obscure new MCI “network layers” proposal, as it was then called, would be the next big communications policy issue. (At about the same time, my now-colleague Adam Thierer was also identifying this as an emerging issue/threat.)

Gilder and I tried to make the point that this “layers” — or network neutrality — proposal would, even if attractive in theory, be very difficult to define or implement. Networks are a dynamic realm of ever-shifting bottlenecks, where bandwidth, storage, caching, and peering, in the core, edge, and access, in the data center, on end-user devices, from the heavens and under the seas, constantly require new architectures, upgrades, and investments, thus triggering further cascades of hardware, software, and protocol changes elsewhere in this growing global web. It seemed to us at the time, ill-defined as it was, that this new policy proposal was probably a weapon for one group of Internet companies, with one type of business model, to bludgeon another set of Internet companies with a different business model. 

We wrote extensively about storage, caching, and content delivery networks in the pages of the Gilder Technology Report, first laying out the big conceptual issues in a 1999 article, “The Antediluvian Paradigm.” [Correction: “The Post-Diluvian Paradigm”] Gilder coined a word for this nexus of storage and bandwidth: Storewidth. Gilder and I even hosted a conference, also dubbed “Storewidth,” dedicated to these storage, memory, and content delivery network technologies. See, for instance, this press release for the 2001 conference with all the big players in the field, including Akamai, EMC, Network Appliance, Mirror Image, and one Eric Schmidt, chief executive officer of . . . Novell. In 2002, Google’s Larry Page spoke, as did Jay Adelson, founder of the big data-center and network-peering company Equinix, along with Yahoo! and many of the big network and content companies.

Claims that Google has abandoned its stance on network neutrality have been thoroughly debunked, as Cord and Adam note below. Over at Broadband Reports, Karl Bode explains that Google is seeking edge-caching agreements, not preferential treatment. Edge-caching involves Google housing its content on servers located inside consumer ISP networks, cutting bandwidth costs by allowing users to access Google content located just a few hops away.

Even though edge-caching doesn’t violate network neutrality as defined by Google, it’s still one of the many advantages that big players have over new entrants. Edge-caching isn’t a “fast track,” as the WSJ imprecisely terms it, but rather a short track—functionally, there’s a lot of similarity between the two. As Richard Bennett has explained time and time again, being close to end users is quite advantageous even without preferential treatment, as it eliminates the need to push vast amounts of data across the congestion-prone core of the public Internet.

We’ve heard about how edge-caching enables content providers and ISPs to cut their bandwidth bills and make more efficient use of finite network resources. Both of these are true, but there’s more—edge caching makes it much less likely that users will experience long load times or buffering hiccups while watching streaming video online. That high-def YouTube clip might take a few extra seconds to buffer if it has to make its way through congested central network exchanges—not so, however, if that video is housed just a few hops away, within your ISP’s network.
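The mechanics above can be sketched in a few lines of code. This is a toy model of edge caching, not anything drawn from Google’s actual OpenEdge design: the class name, hop counts, and URLs are all illustrative assumptions. The point it captures is the one made in the text — the first request for a piece of content crosses the congested public core once, and every later request from that ISP’s subscribers is served from a copy sitting just a few hops away.

```python
# Toy model of edge caching. ORIGIN_HOPS and EDGE_HOPS are made-up numbers
# chosen only to illustrate the "short track" idea; real hop counts vary.

ORIGIN_HOPS = 15   # assumed hops across the public Internet core to the origin
EDGE_HOPS = 3      # assumed hops to a cache server inside the ISP's network

class EdgeCache:
    """Hypothetical cache sitting inside a consumer ISP's network."""

    def __init__(self):
        self.store = {}

    def fetch(self, url, origin):
        """Return (content, hops traversed) for a subscriber's request."""
        if url in self.store:                  # cache hit: the short track
            return self.store[url], EDGE_HOPS
        content = origin[url]                  # cache miss: cross the core once...
        self.store[url] = content              # ...and keep a local copy
        return content, ORIGIN_HOPS + EDGE_HOPS

origin = {"/video/clip": b"high-def bits"}     # stand-in for the content provider
cache = EdgeCache()

_, first = cache.fetch("/video/clip", origin)   # first viewer pays the full cost
_, repeat = cache.fetch("/video/clip", origin)  # later viewers stay in-network
print(first, repeat)  # → 18 3
```

Every viewer after the first gets the content at the in-network hop count, which is why caching cuts both the provider’s bandwidth bill and the viewer’s buffering time.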


Over just the past 24 hours, there’s been quite a hullabaloo surrounding the Wall Street Journal’s controversial front-page story on Google’s edge caching plan and whether it violates Net neutrality. (See Cord’s post and Bret’s). Lessig calls it a “made-up drama,” David Isenberg says it’s “bogus” and “bullshit,” and Google’s Rick Whitt has said it’s much ado about nothing.

Regardless, here’s the important thing not to overlook about this episode: It is a prime example of what Tim Lee has referred to as “the fundamental problem of backlash” that ensues whenever there is even a hint of a potential violation of network neutrality (however one defines it). As Tim argued in his excellent Cato paper on Net neutrality, “No widespread manipulation would go unnoticed for very long,” and a “firestorm of controversy would… be unleashed if a major network owner embarked on a systematic campaign of censorship on its network.” (p. 23). Indeed, this (non-)story about Google’s edge-caching plans has spawned an intense “firestorm of controversy” over the past 24 hours, and it doesn’t even involve serious network meddling or censorship! I’ve been trying to keep up with all the traffic about this on TechMeme and Google News during that time, but I have given up trying to digest it all. (Take a look at the snapshots I pasted below to get a feel for the volume we are talking about here.)

In that regard, I love this quote from the always-bloodthirsty Tim Karr of the (inappropriately-named) regulatory activist group Free Press:

If Google or any other tech company were secretly violating Net Neutrality, there would be an absolute and cataclysmic backlash from the grassroots and netroots who have made Net Neutrality a signature issue in 21st Century politics. The Internet community would come crashing down on their heads like Minutemen on Benedict Arnold.

Indeed, that’s exactly what we saw today. But it wasn’t just pro-regulatory fanatics like Free Press. The entire tech and business blogosphere and even some of the mainstream media were on top of this. That’s the “fundamental problem of backlash” at work, and with a vengeance.

[Screenshot: TechMeme Google headlines]

[Screenshot: Google headlines 2]

The Wall Street Journal reports today that Google wants a fast lane on the Internet, claiming that the Mountain View-based giant may be moving away from its stance on network neutrality:

Google’s proposed arrangement with network providers, internally called OpenEdge, would place Google servers directly within the network of the service providers, according to documents reviewed by the Journal.

The problem with the Journal piece is that OpenEdge isn’t exactly a neutrality violation, or maybe it is.  As Declan McCullagh at CNET has pointed out in his post “Google accused of turning its back on Net neutrality,” figuring out when a neutrality violation has occurred is a little tricky:

The problem with defining Net neutrality so the government can regulate it is a little like the problem of defining obscenity so the government can ban it: You know it when you see it.

Well, Google says that it knows a neutrality violation when it sees one, and not surprisingly it doesn’t see one in its own actions. Its defense essentially boils down to pointing out that OpenEdge is caching. It’s more of a warehouse than a fast lane. Besides, anyone else can do the same thing, so Google isn’t using any ISP’s “unilateral control over consumers’ broadband connections” to its advantage.

Interestingly, however, the same Google Policy Blog entry defends other companies that engage in the same sort of caching, including Limelight. But Limelight Networks isn’t just a data-warehousing company; it combines caching with real fast lanes.


Masnick on the Music Tax

December 15, 2008

I’m more sympathetic to EFF-style voluntary collective licensing than Mike Masnick is, but I have to say that the case he makes here is pretty compelling. I think this is really the key point:

What you’re doing is setting up a big, centrally planned and operated bureau of music, that officially determines the business model of the recording industry, figures out who gets paid, collects the money and pays some money out. The same record industry that has fought so hard against any innovation remains in charge and will have tremendous sway in setting the “rules.” The plan leaves no room for creativity. It leaves no room for innovation. It’s basically picking the only business model and encoding it in stone.

Oh, and did we mention it’s only for music? Next we’ll have to create another huge bureaucracy and “license” for movies. And for television. And, what about non-television, non-movie video content? Surely the Star Wars kid deserves his cut? And, newspapers? Can’t forget the newspapers. After all, they need the money, so we might as well add a license for news. And, if that’s going to happen, then certainly us bloggers should get our cut as well. Everyone, line right up!
This is a bad plan that will create a nightmare bureaucracy while making people pay a lot more, without doing much to actually reward musicians.

The key thing to remember here is that there’s nothing special about the music industry. The record labels have been hardest hit by peer-to-peer file sharing, but their fundamental problem actually has very little to do with BitTorrent. Rather, their problem is the same problem that’s befallen the newspaper industry: the marginal cost of distributing content has dropped to zero, and so the price of content is also going to be driven toward zero sooner or later. The only thing that’s different about the music industry is that BitTorrent has sped the process up: prices have dropped faster because in addition to competing with new entrants, labels are also “competing” with pirated copies of their own content.

But that’s just a transitory phenomenon. The long-run trend is that there’s going to be a much larger ecosystem of free music, just as the blogosphere is a large ecosystem of free punditry. And in that environment, business models that rely on content being expensive are doomed, just as Craigslist doomed newspapers built on the premise of expensive classified advertising. I think Mike is probably right that implementing a de facto music tax would have the effect of cementing in place an increasingly anachronistic industry structure.

With that said, a music tax would have some short-term benefits. An effective collective licensing scheme would create a much more fertile environment for entrepreneurs to build innovative technologies on top of peer-to-peer technologies, so maybe a music tax is a price worth paying for the benefits of a peer-to-peer friendly legal environment. But before I get behind the idea, I’d want to see a clear explanation of how such an agreement would apply to other types of media, and what the long-term evolution of the industry would be.

On Government Transparency

December 15, 2008

The video of last week’s Cato policy forum can be viewed here. (Check out TLFer Jerry Brito’s fine presentation.)

If your preference is for a briefer taste of the transparency issues, a podcast with Ed Felten recorded that day is here:

I’ve been reading some of Larry Lessig’s thoughts on corruption and I’ve drafted a short reaction at OpenMarket.org.

In short, I think Lessig is right that Washington is corrupt, and right that money has an incredible power to corrupt the system, but wrong to say that we ought to focus on money.

Why? Because there are other forms of influence that special interests can use to push lawmakers toward the policies they would prefer. Eliminating money from politics is likely impossible, and even if achieved it would do little to stop corruption. Taking away power from government and returning it to individuals seems to me to be the only way we can truly fight corruption. I articulate this all more fully in the post.

Big news in these parts.

The celebrated openness of the Internet — network providers are not supposed to give preferential treatment to any traffic — is quietly losing powerful defenders.

Google Inc. has approached major cable and phone companies that carry Internet traffic with a proposal to create a fast lane for its own content, according to documents reviewed by The Wall Street Journal. Google has traditionally been one of the loudest advocates of equal network access for all content providers.

TLFers and commenters: Go.