November 2007

Courts and commentators often claim that copyright policy strikes a delicate balance between public and private interests. I see copyright policy in a different pose, however. I see it wobbling precariously, tipping over, and falling into statutory failure. What has put copyright on such unsure footing? The brutish prodding of special interests. Rather than “delicately balanced,” then, I describe copyright policy as “indelicately imbalanced.”

Continue reading →

Presidential candidate Ron Paul (R-Texas) became the “Internet” candidate this month when 36,672 people contributed more than US$4 million online to his campaign in a single 24-hour period. This impressive feat demonstrates the power of an open source culture, a lesson that should not be lost when it comes to other important issues.

The campaign to raise money for Rep. Paul was open source in a number of ways. First, it was a decentralized effort, promoted by people all over the country simultaneously. Indeed, Paul’s official campaign was so hands-off that the candidate told The New York Times that he “had nothing to do with it.” It was two independent supporters who started the ball rolling.

James Sugra posted an online video proposing a big day of fund-raising for Paul, and Trevor Lyman separately created a site, www.thisnovember5th.com, that featured the video. Lyman’s site is now planning another big day on Dec. 16, the anniversary of the Boston Tea Party.

On that day, Paul’s open source campaigners are hoping to encourage 100,000 people to donate $100 each.

Choosing a historically significant day may not be a particularly new fundraising tactic, but the additional open source cultural spin is that the site automatically updates a running count of how many people have pledged so far. This transparency complements the home page of Ron Paul’s Web site, which continually pops up the names of his campaign donors. Those revelations stand in direct contrast to traditional campaigns, which tend to be silent and proprietary about who is donating.

Paul’s “donation feed” is reminiscent of the somewhat addictive “news feed” on the social networking site Facebook, and it appears to have the effect of increasing donations. In a society where privacy is shrinking, it seems many embrace the idea of sharing more information, not less. Paul’s supporters are not alone in recognizing the power of a voluntary open source culture.

Internet giant Google announced this week that it is offering $10 million in prizes for people who build the best software for Android, the company’s new open platform for mobile devices. This move shows that Google knows its tech history. Back in 1985, Apple made the huge mistake of saying no to a young Bill Gates, who wanted the company to open up its proprietary architecture to developers. We all know how that ended, and now a similar story is likely to play out if the big phone companies stay closed while Google opens things up.

This reality, unfortunately, has led many to the erroneous conclusion that since openness is good, the government should force it, no matter what the cost. However, government force rarely leads to the open societies people seek. Take Net neutrality advocates, for example.

[…]

Read more here.

Comcast’s decision to limit Internet traffic from the peer-to-peer software BitTorrent would be against the law if Democratic presidential hopeful Barack Obama had his way, an aide to the Democratic Senator said Thursday.

In a conference call organized by the campaign for Sen. Obama, D-Ill., high-profile technology experts Lawrence Lessig, Beth Noveck and Julius Genachowski endorsed the technology and innovation agenda that Obama released on Wednesday. Also on the lines were three Obama aides, who declined to speak for attribution.

“What I find compelling about the Senator’s [stance] is a strong commitment to Net neutrality,” said Lessig, a law professor at Stanford University, referring to the notion that broadband providers be barred from favoring business partners with speedier Internet delivery.

Obama “addresses the problem of Net neutrality in a way that could actually be enforced,” said Lessig. By contrast, Democratic hopeful Hillary Clinton “can’t stand up for Net neutrality.”

Continue reading →

When and how does ICT interoperability drive innovation? That is the subject of a new paper from the Harvard Berkman Center for Internet & Society (the webcast of yesterday’s launch event at the Reagan Building is now available).

Co-authors Urs Gasser and John Palfrey have published a thoughtful and well-balanced study. There’s a lot to agree with, especially their essential conclusion: that interoperability is important for innovation in the IT sector, and that the market, not government, is the preferred mechanism for achieving it.

But I also think this paper achieves something more, even if unintentionally. It helps debunk the rhetoric we’re hearing about “openness” (and there are many definitions) as the best and only way to achieve interoperability.

First of all, according to the paper, “interoperability is not an unqualified good and is not an end in itself.” Furthermore, just because interoperability is not present doesn’t mean there’s a “market failure” — the authors cite DRM-protected music distribution and the growing shift toward unprotected music as a response to interoperability concerns voiced by consumers.

Importantly, the paper identifies that interoperability can be achieved by multiple means: IP licensing, APIs, standards (including “open” standards), and industry consortia.

As for its effect on innovation, interoperability can help some kinds of innovation, especially incremental innovation. But higher levels of interoperability may diminish incentives for radical innovation if the network effects of interoperable systems increase switching costs for consumers.

Continue reading →

Earlier this week, FCC chairman Kevin Martin announced long-promised revisions to America’s media-ownership rules. As I point out in my latest essay for the City Journal, the results were extremely disappointing and could have grave consequences for the long-term viability of struggling media operators.
______________________________

Media Deregulation Is Dead
The FCC’s toothless reforms are a victory for the status quo.
November 15, 2007
by Adam Thierer

This week, Federal Communications Commission chairman Kevin Martin announced long-promised revisions to America’s archaic, convoluted media-ownership rules. The result: no serious deregulation, just tinkering at the margins. In fact, of the half-dozen rules currently on the books, Martin is proposing to revise only one—the newspaper/broadcast cross-ownership rule. “No changes to the other media-ownership rules [are] currently under review,” Martin’s press release notes tersely, leaving many TV and radio broadcasters wondering when they will ever get regulatory relief.

Continue reading →

FISA Bill in the House

November 15, 2007

Glenn Greenwald reports that the House will be bringing the RESTORE Act up for a vote today. I wasn’t thrilled with this legislation last time it was brought up for a vote, but there have apparently been enough improvements made to convince Rep. Holt to vote for it, which is a good sign. It’s certainly much better than the horrible legislation in the Senate version, and crucially it includes no immunity for telecom providers.

Here are some good comments by Rep. Lloyd Doggett of Texas:

Now might be a good time to call your Congresscritter and let him or her know how you feel about warrantless surveillance and telecom immunity.

Update: Just to be clear, the suggestion that you call your member of Congress is a personal recommendation, and shouldn’t be construed as the position of any organizations with which I might be affiliated.

When Google Privacy Counsel Peter Fleischer introduced the company’s call for global privacy standards, I thought he mangled some basic concepts. He’s not the first, and others have gotten it worse – and more threateningly so – since.

But I’m impressed with the general tenor of his recent comments encouraging a focus on preventing consumer harm. Many in the privacy community are deeply wedded to “Fair Information Practices” – a varying set of rules that, followed by rote, would allegedly take care of privacy. Well, they don’t. They produce a lot of churn, and they soak up a lot of energy with regulation, compliance efforts, and what-have-you. But they don’t address what matters: protection of actual privacy and prevention of consumer harm.

“FIPs” aren’t all bad. Some of them are good. Some conflict with others. They’re just not a helpful framework for addressing the problems presented to us by the information age.

Last year, the DHS Privacy Committee produced a document unpacking the human values that matter (generally referred to as “privacy”). The focus should be on how information practices affect privacy and related values – chiefly, whether modern information practices cause people harm.

In the recent Verizon Uprisin’ (successor to the Comcast Kerfuffle – how’m I doin’?), the blogospheric back-and-forth between TLFer Tim Lee (writing at TechDirt) and TLFriend Ed Felten illustrates nicely the difficulty with both parts of the case for ‘net neutrality regulation.

The first question is whether there is a problem that needs solving. The two disagree about whether Verizon’s operation of its DNS servers is a ‘net neutrality violation at all.

The second question is whether the problem is better solved by regulation or by market processes (expert agitation, consumer pressure, etc. that carry with them the threat or reality of lost customers and profits). As a technical matter, Tim points out that people are free to point their computers to another DNS server, such as OpenDNS. Ed says “it might turn out that the regulatory cure is worse than the disease.”
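
For readers who want to see what “pointing your computer at another DNS server” looks like in practice, here is a minimal Python sketch (an illustration of mine, not anything from the posts being discussed). It assumes the third-party dnspython library and uses OpenDNS’s public resolver addresses; the test hostname is a made-up placeholder, and the idea is simply that a resolver which rewrites failed lookups will answer it differently from one that returns a plain “no such domain” error.

```python
# Minimal sketch: compare how the system-default resolver and OpenDNS answer
# a deliberately bogus hostname. Requires the third-party dnspython package
# (pip install dnspython; the 2.x API is assumed here).
import dns.resolver

OPENDNS = ["208.67.222.222", "208.67.220.220"]  # OpenDNS public resolvers

def lookup(hostname, nameservers=None):
    """Return the A-record addresses for hostname, optionally via specific nameservers."""
    resolver = dns.resolver.Resolver()
    if nameservers:
        resolver.nameservers = nameservers  # override the ISP-supplied servers
    try:
        return sorted(rr.address for rr in resolver.resolve(hostname, "A"))
    except dns.resolver.NXDOMAIN:
        return ["NXDOMAIN (no such domain)"]

if __name__ == "__main__":
    bogus = "no-such-host.example"  # hypothetical mistyped hostname
    print("System default resolver:", lookup(bogus))
    print("OpenDNS resolver:       ", lookup(bogus, OPENDNS))
```

Of course, the simpler remedy Tim has in mind is just changing the DNS settings in your operating system or router; the script only makes the difference between resolvers visible.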

Even among those who disagree on whether there’s a substantive ‘net neutrality violation here, regulation doesn’t seem to be the cure. Even Harold Feld hasn’t written a triumphal post. (Though, in fairness, he seems to be distracted – and oh so giddy – about cable regulation.)

[James and Hance already commented on this issue, but here’s my take on the FCC opening another front in its ongoing “war on cable” … ]

Despite steadily increasing video competition and consumer programming choices, the Federal Communications Commission (FCC)–or at least current Chairman Kevin Martin–seems to be pursuing what many journalists and market analysts have described as a “war on cable.” As Craig Moffett, a senior analyst with Sanford Bernstein & Co, says, “Over the past year, the Chairman has adopted an almost uniformly anti-cable stance on issues ranging from set-top boxes (CableCards), digital must carry requirements, cable ownership caps, video franchising rules, and the abrogation of exclusive service contracts with [apartment owners].”

And Moffett is only summarizing the economic regulation that Martin’s FCC is currently pursuing against cable. Chairman Martin has also proposed the unprecedented step of imposing content controls on pay TV providers. He wants to extend broadcast industry “indecency” regulations to cable and satellite operators, even though the constitutionality of those rules is being questioned in court. And Chairman Martin has also suggested that “excessively violent” programming on pay TV should be regulated in some fashion. Finally, he has strong-armed cable operators into offering “family-friendly tiers” of programming even though there was no demand for them and consumers have shown little interest in them now that they have been offered.

And more cable regulation appears to be in the works. According to recent press reports, Chairman Martin is considering breathing new life into a little-known provision of the Cable Communications Act of 1984 known as the “70/70 rule.” Under the 70/70 rule, if the Commission finds that cable service is available to 70% of households and 70% of those homes subscribe, then the FCC can “promulgate any additional rule necessary to provide diversity of information sources.”

Chairman Martin apparently believes that cable has crossed both 70/70 thresholds and that comprehensive regulation of the cable industry is now warranted. What that means in practice remains to be seen, but it could include common carriage-like price controls on cable systems. The Wall Street Journal reports that a significant reduction (perhaps 75%) in the rates cable operators charge programmers for leased access might be the end result. In the long run, an FCC declaration that the 70/70 rule has been triggered could also lead to the imposition of some of the other regulatory proposals mentioned above.

Continue reading →

SMTP Blocking

November 15, 2007

In response to a post I did on Verizon’s obnoxious DNS policies, a Techdirt reader writes:

Verizon DOES block your ability to use 3rd-party mail servers. GMail is web-based, son. A server at a friend’s ISP, connecting over port 25, is BLOCKED by Verizon, period end of story.

Now, I use another port and so go my merry way, but Verizon, having blocked port 25, can block any ports they wish under the same guiding principle. Verizon sets limits.

And another reader responds:

Isn’t that standard practice? To (somewhat) prevent spoofing email, ISPs require outbound mail to go through in-house servers, but inbound on port 110 can be any source you have access to.

Does anyone know if this is true? I’ve occasionally encountered Wi-Fi connections in hotels or coffee shops that block outbound SMTP, but I’d always assumed that real residential ISPs don’t do that sort of thing. Such a policy does little or nothing to combat spam, but it sure is a pain in the butt for those of us who use real mail clients and don’t use our ISP’s SMTP servers.
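
For anyone who wants to check their own connection, here is a minimal Python sketch of the test: it simply tries to open TCP connections to a mail server on port 25 (the classic SMTP relay port) and on port 587 (the mail submission port that most providers leave open). The server name below is a hypothetical placeholder, and a failed connection can just as easily mean the server itself is down, so treat the result as a rough probe rather than proof of ISP filtering.

```python
# Rough probe: can we open outbound TCP connections to a mail server on
# port 25 (SMTP relay) and port 587 (mail submission)? Uses only the
# standard library; the hostname below is a placeholder, not a real server.
import socket

MAIL_HOST = "mail.example.net"   # hypothetical third-party mail server
PORTS = [25, 587]

def can_connect(host, port, timeout=5):
    """Return True if a plain TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in PORTS:
    status = "reachable" if can_connect(MAIL_HOST, port) else "blocked or unreachable"
    print(f"Port {port} on {MAIL_HOST}: {status}")
```

Running this once from a residential connection and once from an unfiltered network would show whether port 25 really is being dropped, which is exactly the factual question the comments above raise.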

Relatedly, would such a policy be a violation of network neutrality? It sure seems like it violates the letter of Snowe-Dorgan, which would imply that thousands of annoyingly configured hotspots would instantly become illegal if network neutrality regs passed.