January 2007

Bret Swanson of the Discovery Institute has an important piece on Net Neutrality in today’s Wall Street Journal entitled “The Coming Exaflood.” He’s referring to the growing flood of exabyte-scale traffic (especially from high-def video) that could begin clogging the Net in coming years unless broadband networks are built out and upgraded to handle it. He states:

“[Net neutrality supporters] now want to repeat all the investment-killing mistakes of the late 1990s, in the form of new legislation and FCC regulation… This ignores the experience of the recent past–and worse, the needs of the future.

Think of this. Each year the original content on the world’s radio, cable and broadcast television channels adds up to about 75 petabytes of data–or, 10 to the 15th power. If current estimates are correct, the two-year-old YouTube streams that much data in about three months. But a shift to high-definition video clips by YouTube users would flood the Internet with enough data to more than double the traffic of the entire cybersphere. And YouTube is just one company with one application that is itself only in its infancy. Given the growth of video cameras around the world, we could soon produce five exabytes of amateur video annually. Upgrades to high-definition will in time increase that number by another order of magnitude to some 50 exabytes or more, or 10 times the Internet’s current yearly traffic.

We will increasingly share these videos with the world. And even if we do not share them, we will back them up at remote data storage facilities. I just began using a service called Mozy that each night at 3 a.m. automatically scans and backs up the gigabytes worth of documents and photos on my PCs. My home computers are now mirrored at a data center in Utah. One way or another, these videos will thus traverse the net at least once, and possibly, in the case of a YouTube hit, hundreds of thousands of times.
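Swanson’s arithmetic is easy to sanity-check. Here’s a quick back-of-the-envelope sketch in Python; the byte figures are his, the decimal unit conversions are standard, and everything else is just my illustration:

```python
# Back-of-the-envelope check of Swanson's figures, in SI decimal units:
# 1 petabyte (PB) = 10**15 bytes, 1 exabyte (EB) = 10**18 bytes.

PB = 10**15
EB = 10**18

annual_broadcast = 75 * PB            # original radio/cable/TV content per year
youtube_quarter = annual_broadcast    # YouTube streams "that much" in ~3 months
youtube_annual = 4 * youtube_quarter  # ~300 PB/yr at standard definition

amateur_sd = 5 * EB                   # projected annual amateur video, SD
amateur_hd = 10 * amateur_sd          # HD adds roughly an order of magnitude

print(f"YouTube, annualized:   {youtube_annual / EB:.1f} EB/yr")
print(f"Amateur video, SD:     {amateur_sd / EB:.0f} EB/yr")
print(f"Amateur video, HD:     {amateur_hd / EB:.0f} EB/yr")

# Swanson calls 50 EB "10 times the Internet's current yearly traffic,"
# which implies a baseline of roughly 5 EB of total traffic per year in 2007.
print(f"Implied 2007 baseline: {amateur_hd / 10 / EB:.0f} EB/yr")
```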

Continue reading →

Ordinarily, my software patent series focuses on patents that have been granted by the patent office and have become the subject of litigation. I’m going to break that pattern this week because reader Richard Bennett pointed to one of his own patent applications as an example of a worthwhile software patent. Since I frequently ask supporters of software patents to point out a good one (a request that’s almost always ignored), I thought I’d analyze Bennett’s application to see what we can learn. Below the cut are my thoughts on it.

Continue reading →

Today, the Progress & Freedom Foundation released a “Tech Agenda for 2007” containing ten policy recommendations for the 110th Congress and the FCC. It’s a set of market-oriented proposals covering a wide array of Digital Economy issues. What follows is just a brief summary of the ten priorities we came up with. Please review the complete study for our detailed recommendations:

1 – Renew fundamental reforms of communications regulations.
2 – Leave network neutrality concerns to the market and antitrust.
3 – Leave content business models and fair use to the market.
4 – When addressing patents, take a first-principles approach to property and innovation.
5 – Enact meaningful reform of archaic media ownership laws and regulations that hinder media marketplace experimentation.
6 – Pursue greater First Amendment parity among modern media providers by leveling the playing field in the direction of greater freedom for all operators and platforms.
7 – Subject data security and privacy proposals to careful benefit-cost analysis, including full examination of consumer benefits from services and technologies affected by these proposals.
8 – Promote pro-competitive, non-regulatory Internet governance.
9 – Avoid open-ended, intrusive data retention mandates.
10 – Promote more efficient taxation of telecom services and Internet sales.

I hope every TLF reader is also a Techdirt reader, but in case some of you missed it, I wanted to point out that Mike Masnick is doing a fantastic series on post-scarcity economics. Here’s a taste:

Throw an infinity into the supply of a good and the supply/demand curve is going to toss out a price of zero (sounds familiar, right?). Again, the first assumption is to assume the system is broken and to look for ways to artificially limit supply.

However, the mistake here is to look at the market in a manner that is way too simplified. Markets aren’t just dynamic things that constantly change, but they also impact other markets. Any good that is a component of another good may be a finished good for the seller, but for the buyer it’s a resource that has a cost. The more costly that resource is, the more expensive it is to make that other good. The impact flows throughout the economy. If the inputs get cheaper, that makes the finished goods cheaper, which opens up more opportunities for greater economic development. That means that even if you have an infinite good in one market, not all the markets it touches on are also infinite. However, the infinite good suddenly becomes a really useful and cheap resource in all those other markets.

So the trick to embracing infinite goods isn’t in limiting the infinite nature of them, but in rethinking how you view them. Instead of looking at them as goods to sell, look at them as inputs into something else. In other words, rather than thinking of them as a product the market is pressuring you to price at $0, recognize they’re an infinite resource that is available for you to use freely in other products and markets. When looked at that way, the infinite nature of the goods is no longer a problem, but a tremendous resource to be exploited. It almost becomes difficult to believe that people would actively try to limit an infinitely exploitable resource, but they do so because they don’t understand infinity and don’t look at the good as a resource.

Of course, the concern is that information resources become infinite only after the first copy is produced, and that the first copy might not be produced absent artificially constrained supply. But I expect Mike will argue that this condition occurs less often than the standard economic model suggests: people can show surprising ingenuity in finding ways to profit from their intellectual creations even without the benefit of a legal monopoly.
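For readers who want the textbook mechanics behind Mike’s argument, here’s a toy sketch; the linear demand curve and every number in it are invented purely for illustration:

```python
# Toy illustration: under an ordinary downward-sloping demand curve,
# price falls toward zero as supply grows without bound. The curve and
# all numbers here are invented for illustration only.

def market_price(quantity_supplied: float, intercept: float = 10.0,
                 slope: float = 0.01) -> float:
    """Inverse demand P = intercept - slope * Q, floored at zero."""
    return max(0.0, intercept - slope * quantity_supplied)

for q in (100, 1_000, 10_000, 1_000_000):
    print(f"supply = {q:>9,} units -> price = ${market_price(q):.2f}")

# As quantity heads toward infinity the market-clearing price pins at $0,
# which is the cue to stop selling the infinite good and start treating it
# as a free input to scarce, complementary goods.
```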

I want to second Jim’s recommendation that you read his Regulation article discussing PFF’s new book on network neutrality regulation. He argues persuasively that each side in the network neutrality debate gives the other too little credit:

It is hard to pin down what exactly the Internet is. There are several versions, with convergence around the idea that the things making up the Internet can be described as a series of layers. At the bottom, there is the physical layer–the wires, cables, and fibers that Internet communications travel over. Next there is the logical layer, the routing rules that send packets of data from origin to destination. Next there is the application layer–the programs that people use to create content and send it from one place to another. (Think of e-mail programs, browsers, and the like.) Finally, there is the content layer. This is the actual material people send to each other in those e-mails, the websites that show up on their screens, and so on.
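To make the stack concrete, here is the four-layer picture written out as a simple table; the layer names follow Jim’s description, but the example entries are my own glosses, not his:

```python
# The four-layer model described above, as a simple table. Layer names
# follow the excerpt; the example entries are my own glosses.

LAYERS = [
    ("content",     "the material people actually exchange",
                    "web pages, e-mail messages, video clips"),
    ("application", "programs that create and move content",
                    "browsers, e-mail clients"),
    ("logical",     "routing rules that carry packets end to end",
                    "IP, TCP, BGP"),
    ("physical",    "the wires, cables, and fibers underneath",
                    "copper loops, coax, fiber optics"),
]

for name, role, examples in LAYERS:
    print(f"{name:>12} layer: {role} (e.g., {examples})")
```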

Continue reading →

Here’s the most interesting claim in the lawsuit filed by parents against MySpace alleging its negligent failure to protect their daughters:

14. Plaintiffs allege and are prepared to show proof that, at all times relevant to the claims alleged herein, said parents were variously too busy, preoccupied, or self-absorbed to attend to their ordinary parenting duties. Alternatively and additionally, the willfulness and independence of their victim children was intimidating and exhausting, for which reason responsibility for defending and guarding the interests of said victims shifted to defendant MySpace.

/satire

Here’s Harper on PFF on Net Neutrality in Regulation magazine.

My review of the Progress & Freedom Foundation book Net Neutrality or Net Neutering: Should Broadband Services be Regulated (which starts on page 5 of the PDF) takes a sort of “pox on both your houses” approach while concluding that the opponents of public utility regulation for broadband have the better argument.

Here’s Farber and Katz (with Faulhaber and Yoo) in the Washington Post.

Shame on You, September!

January 18, 2007

I would quote this Ars article, but really, a picture is worth a thousand words:

[Chart from the Ars story: FCC indecency complaints by month, with an enormous spike in September 2006.]

As the priceless tagline to the story puts it: “Obscenity complaints to the FCC jumped 40,000 percent between August and September 2006. Shame on you, September! Don’t you know that our children are watching?”

What’s happening here, obviously, isn’t that the major networks decided to start running porn during prime time in September. Rather, groups like the Parents Television Council put out an action alert describing the filthiest moments in television they can find, and their members (the vast majority of whom probably didn’t even watch the show) send form letters to the FCC.

A False Analogy

January 18, 2007

Over at IPCentral, Jim DeLong quotes a lengthy critique of the SFLC brief in the Microsoft v. AT&T case. The critique was written by one Greg Aharonian. A lot of it is the kind of legal inside baseball that I’m not really qualified to comment on, but there’s one theme that runs throughout the critique that’s just flatly wrong:

The first lie of Moglen’s brief is a big lie of omission. Nowhere in his brief does there appear the word “hardware”. It is unethical to talk about the patentability of software without simultaneously talking about the patentability of hardware, especially in light of hardware/software codesign tools. And even using the word “hardware” is pointless unless you provide rigorous definitions of “hardware” and “software”. Moglen doesn’t. So when Moglen bases his software patent hatred on Benson:

“The holding of Benson is properly applicable to all software, because a computer program, no matter what its function, is nothing more or less than the representation of an algorithm.”

as well, he is arguing hardware patent hatred:

“The holding of Benson is properly applicable to all hardware, because a digital circuit, no matter what its function, is nothing more or less than the representation of a [Boolean] algorithm.”

This is silly. I would be very interested to see the Boolean algorithm that is equivalent to, say, an LCD panel. Some characteristics of hardware can be described as equivalent to software algorithms, but other aspects (the ability to display information to the user, for instance) cannot.
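To see the distinction concretely: the logical behavior of a digital circuit really can be written down as Boolean algebra, as in this sketch of a one-bit full adder (my own example, not anything from the briefs), but no amount of Boolean algebra captures the physical facts, like a panel’s ability to emit light, that Aharonian’s equivalence glosses over:

```python
# A one-bit full adder written purely as Boolean algebra. This captures the
# circuit's *logical* behavior exactly (Aharonian's point), but says nothing
# about physical properties like light emission or power draw (the objection
# above). The example is mine, not from the briefs.

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """Return (sum_bit, carry_out) for one bit of binary addition."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a and b) or (carry_in and (a ^ b))
    return sum_bit, carry_out

# Exhaustively check the Boolean description against ordinary arithmetic.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            s, cout = full_adder(a, b, c)
            assert int(a) + int(b) + int(c) == int(s) + 2 * int(cout)
print("All 8 input combinations match integer addition.")
```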

Continue reading →