January 2009

Slashdot Explains All

January 30, 2009

Slashdot user “flyingsquid” suggests why Blizzard has had such a winning streak in federal copyright cases:

You know, this could just be a coincidence, but a couple of weeks ago I was in Northrend and I ran into an orc named “JudgeCampbell”. He had some pretty sweet weapons and armor he was showing off, including a Judicial Robe of Invincibility and a Judge’s Battle Gavel of The Dragon, which did an unreal amount of damage. Also, he had all these really powerful spells I’d never even heard of before, such as “Contempt of Court” and “Summon Bailiff”. To top it all off, he had like 200,000 gold. I asked where he’d gotten all this stuff and he said he’d just “found it all in some dungeon”. It sounded kind of fishy to me, but I didn’t think anything much of it at the time.

I demand an investigation!

Great ESR EconTalk Podcast

January 30, 2009

Russ Roberts’s excellent EconTalk podcast had an especially good episode last week, featuring Eric Raymond of “The Cathedral and the Bazaar” fame. ESR does a great job of explaining the economics of free software. He also offers a take on the network neutrality debate that is more reflexively hostile to the telcos than I think is justified, but that nonetheless gets the big points right: network neutrality is important, but government regulation isn’t a good way to protect it. He discusses his views in more detail here.

One minor quibble I had with ESR’s presentation: he distinguished Wikipedia from free software projects by saying that software could be judged objectively (either it works or it doesn’t) while editing Wikipedia is an inherently subjective activity. He suggested that for this reason, Wikipedia doesn’t work as well as free software. I think this ignores the central role of verifiability in Wikipedia’s editing process. The truth may be a matter of opinion, but it’s usually not a matter of opinion whether reliable sources have or haven’t made some claim. And as long as most of the reliable sources agree, which they generally do, it’s possible for an impartial observer to compare a sentence in Wikipedia with the corresponding sources and see if the sentence is a fair summary of those sources.

Of course, this doesn’t work in every circumstance. Some topics are so intensely controversial that there is wide divergence among reliable sources, or sharp disagreement about which aspects of a topic to focus on. There’s just no getting around the fact that the Wikipedia articles on George W. Bush or abortion are going to be the subject of perpetual edit wars for years to come. But these articles are a relatively small fraction of what Wikipedia does. There are lots and lots of topics that are not especially controversial, and in those contexts Wikipedia’s decentralized editing process converges on the “right” answer (as judged by comparison to reliable sources) remarkably quickly.

On the flip side, it’s worth remembering that the free software movement has had a few bitter rivalries of its own over the years. Most of the time, free software converges on a reasonable answer and people walk away happy. Sometimes they don’t. Both free software and Wikipedia work astonishingly well most of the time.

Julian Sanchez at ArsTechnica delivers some unsettling news about the state of free speech in America’s education system:

A federal court has rejected a former student’s First Amendment suit against school officials who punished her for calling them “douchebags” in a LiveJournal post. Right now, the scope of student rights to online speech is anything but clear.

This case centers on Avery Doninger, a graduate of Lewis S. Mills High School, who called school administrators “douchebags” on her LiveJournal blog. Why? Because of the “possible cancellation of a repeatedly-postponed student concert,” according to Sanchez. Avery, a student council member, was barred from running for reelection because she dropped this D-bomb.

The Supreme Court has wrangled with the issue of campus speech codes in the past and has drawn some unclear lines—at least to this untrained, non-lawyerly mind—about where free speech begins and ends for students. Sanchez speaks to this point as well, explaining the federal court’s difficulty with this decision:

Citing the blurry line between “on-campus” and “off-campus” speech in the Internet era, the court acknowledged that current law gives no clear answers to the question of where students’ rights to free online speech end and the authority of schools to enforce discipline begins.

It seems to me that the line should be clear. If you’re at school, you follow the rules. If you’re at a school event, like a football game or a debate tournament (that would have been me in school), then you follow the rules. But if you’re on your blog at home, you get to say whatever the hell you want.


The Washington Post reports that the Obama administration is delaying the Bush Administration plan to require federal contractors to use the E-Verify worker background check system.

Criticizing the move, Lamar Smith (R-TX), ranking minority member on the House Judiciary Committee, says, “It is ironic that at the same time President Obama was pushing for passage of the stimulus package to help the unemployed, his Administration delayed implementation of a rule designed to protect jobs for U.S. citizens and legal workers.”

E-Verify may well have been designed or intended to protect jobs for citizens and legal workers, but that’s not at all what it would do. I wrote about it in a Cato Policy Analysis titled “Electronic Employment Eligibility Verification: Franz Kafka’s Solution to Illegal Immigration” (a ten-year follow-on to Stephen Moore’s “A National Id System: Big Brother’s Solution to Illegal Immigration”):

A mandatory national EEV system would have substantial costs yet still fail to prevent illegal immigration. It would deny a sizable percentage of law-abiding American citizens the ability to work legally. Deemed ineligible by a database, millions each year would go pleading to the Department of Homeland Security and the Social Security Administration for the right to work.

Even if E-Verify were workable, mission creep would lead to its use for direct federal control of many aspects of American citizens’ lives. It should be scrapped outright, but failing that, the longer E-Verify is delayed, the better.

In at least two recent stories, the mainstream press has highlighted the Obama administration’s slow-walking on transparency.

Bloomberg recently filed suit against the Fed under the Freedom of Information Act to force disclosure of securities the central bank is taking as collateral for $1.5 trillion of loans to banks.

“The American taxpayer is entitled to know the risks, costs and methodology associated with the unprecedented government bailout of the U.S. financial industry,” said Matthew Winkler, the editor-in-chief of Bloomberg News, a unit of New York-based Bloomberg LP . . . .

And here’s what President Obama said in his day-one memorandum on FOIA:
Continue reading →

Look Ma, Faster Broadband!

January 29, 2009

In the summer of 2000, while I was in college, I moved into a big house with 6 other guys. DSL was just coming on the market, and we were big nerds, so we decided to splurge on fast Internet access. Back then, “fast Internet access” meant a blazing-fast (Update: 512k) DSL connection. We had to pay the phone company about $65/month for the line. And we paid our Internet Service Provider $55/month for the connectivity and 8 static IP addresses (thanks to local loop unbundling, these were separate services). For $120/month we got to live in the future, enjoying connectivity 10 times faster than the 56k modems that almost everyone had at the time.

Adjusting for inflation, $120 of 2000 money is about $140 of 2009 money. So I was interested to see that St. Louis, MO, where I lived until recently, is about to get 60 Mbps Internet service courtesy of Charter, the local cable monopoly. Had I stayed in St. Louis for another year, I would have been able to get 120 times the bandwidth for the same inflation-adjusted cost as the broadband access I had less than a decade ago.
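
For the curious, here is the back-of-the-envelope arithmetic as a small Python sketch. The figures come straight from the two paragraphs above; the “same inflation-adjusted cost” comparison takes the post’s rough $140 figure at face value, and the speedups are the same loose rounding (roughly 10x and 120x) the post uses.

```python
# Back-of-the-envelope arithmetic from the post: ~512 kbps DSL at $120/month
# in 2000 versus 60 Mbps cable in 2009 at (per the post) roughly the same
# inflation-adjusted price.
cost_2000 = 120.0                 # dollars/month, summer 2000
inflation_factor = 140.0 / 120.0  # the post's rough 2000 -> 2009 adjustment (~17%)

speed_2000_kbps = 512             # the DSL line
speed_2009_kbps = 60 * 1000       # Charter's 60 Mbps offering
dialup_kbps = 56                  # the 56k modems "almost everyone had"

print(f"2000 cost in 2009 dollars: ${cost_2000 * inflation_factor:.0f}/month")
print(f"DSL vs. dial-up speedup:   ~{speed_2000_kbps / dialup_kbps:.0f}x")   # ~10x
print(f"2009 vs. 2000 speedup:     ~{speed_2009_kbps / speed_2000_kbps:.0f}x")  # ~120x
```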

It has almost become a cliché to lament the dismal state of America’s broadband market. There do seem to be countries that are doing better than we are, and we should certainly study what they’ve done and see if there are ideas we could adapt here in the States. But I also think a sense of perspective is important. I can’t get too upset about the possibility that in 2018 Americans might be limping along with 2 Gbps broadband connections while the average Japanese family has a 20 Gbps connection.

Reminder: Next Wednesday, February 4th, the Cato Institute will host a book forum on David Post’s new book, In Search of Jefferson’s Moose: Notes on the State of Cyberspace.

Comments will come from Clive Crook, Chief Washington Commentator of the Financial Times and Senior Editor of The Atlantic Monthly; and Jeffrey Rosen, Professor of Law at The George Washington University and Legal Affairs Editor of The New Republic.

It’s a very interesting book, and the commentators are second to none.

Register here.

And here’s Adam’s review of the book.

On the first full day of the new Obama administration, I wrote here, and later followed up, expressing regret that the Obama White House hadn’t ported the “Seat at the Table” program over from the transition. Change.gov published documents submitted to the transition on its Web site for public review and comment. Whitehouse.gov does not.

Now we learn that the White House will not honor an Obama campaign and Whitehouse.gov pledge – not more than nine days old – to post all non-emergency legislation on the White House Web site for five days before the President signs it.

One significant addition to WhiteHouse.gov reflects a campaign promise from the President: we will publish all non-emergency legislation to the website for five days, and allow the public to review and comment before the President signs it.

President Obama signed the “Lilly Ledbetter Fair Pay Act of 2009” into law today, one day after Congress delivered it to him. And there’s the law, posted on Whitehouse.gov for public review. But it sure hasn’t been up for five days. And it’s not emergency legislation: bills like it have been floating around in Congress since at least June 2007.

If I was a little demanding about transparency from day one, it was a bit of counterpoint to folks who were going dewy-eyed about Obama’s transparency promises. Those were simply words. Judging by the Whitehouse.gov screen cap below, transparency got thrown over the side for a photo op. Welcome to Washington.

[Screen capture: Whitehouse.gov]

Update: Just got an email that helps illustrate why the sound practices of letting legislation cool and taking public comment would go by the wayside. Getting credit from the ACLU is much more important than pleasing the relatively tiny coterie of transparency fans – and there is almost no expectation among the public that a White House should practice good lawmaking hygiene.

[Screen capture: ACLU email]

The next several days bring a variety of events, both on broadband stimulus legislation and on some of the broader issues associated with the Internet and its architecture.

On Friday, January 30, the Technology Policy Institute features a debate, “Broadband, Economic Growth, and the Financial Crisis: Informing the Stimulus Package,”  from 12 noon – 2 p.m., at the Rayburn House Office Building, Room B369.

Moderated by my friend Scott Wallsten, senior fellow and vice president for research at the Technology Policy Institute, the event features James Assey, Executive Vice President for the National Cable & Telecommunications Association; Robert Crandall, Senior Fellow in Economic Studies, The Brookings Institution; Chris King, Principal/Senior Telecom Services Analyst, Stifel Nicolaus Telecom Equity Research; and Shane Greenstein, Elinor and Wendell Hobbs Professor of Management and Strategy at the Kellogg School of Management, Northwestern University.

The language promoting the event notes, “How best to include broadband in an economic stimulus package depends, in part, on understanding two critical issues: how broadband affects economic growth, and how the credit crisis has affected broadband investment.  In particular, one might favor aggressive government intervention if broadband stimulates growth and investment is now lagging.  Alternatively, money might be better spent elsewhere if the effects on growth are smaller than commonly believed or private investment is continuing despite the crisis.”

And then, on Tuesday, MIT Professor David Clark, one of the pioneers of the Internet and a distinguished scientist whose work on “end-to-end” connectivity is widely cited as the architectural blueprint of the Internet, looks to the future. Focusing on the dynamics of advanced communications – the role of social networking, problems of security and broadband access, and the industrial implications of network virtualization and overlays – Clark here tackles new forces shifting regulation and market structure.

David Clark is Senior Research Scientist at the MIT Computer Science and Artificial Intelligence Laboratory. In the forefront of Internet development since the early 1970s, Dr. Clark was Chief Protocol Architect in 1981-1989, and then chaired the Internet Activities Board. A past chairman of the Computer Science and Telecommunications Board of the National Academies, Dr. Clark is co-director of the MIT Communications Futures Program.

I’m no longer affiliated with the Information Economy Project at George Mason University, but I urge all interested in the architecture of the Internet to register and attend. More information about the lecture, and about the Information Economy Project, is available at http://iep.gmu.edu/davidclark.

It will take place at the George Mason University School of Law, Room 120, 3301 Fairfax Drive, Arlington, VA 22201 (Orange Line: Virginia Square-GMU Metro), on Tuesday, February 3, from 4 – 5:30 p.m., with a reception to follow. The event is free and open to the public, but reservations are requested. To reserve a spot, please e-mail iep.gmu@gmail.com.

Google has—as I noted it would last June—finally released (PCWorld, Google’s policy blog)  its eagerly-awaited suite of tools available for free (of course) at MeasurementLab.net that allow users to monitor how their ISP might be tweaking (degrading, deprioritizing, etc.) their traffic—among other handy features.  Huzzah!
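
For those curious what tools like these actually measure, the core idea is straightforward end-to-end measurement: move bytes to or from a well-provisioned test server and compare what you observe across times, destinations, and protocols. Here is a minimal, hypothetical sketch of that kind of throughput probe; the test URL is a placeholder (an assumption, not an actual Measurement Lab endpoint), and the real tools hosted at MeasurementLab.net are considerably more sophisticated.

```python
# A minimal, hypothetical sketch of an end-to-end throughput probe of the
# kind network-measurement tools rely on. The test URL is a placeholder,
# not an actual MeasurementLab.net endpoint.
import time
import urllib.request

def measure_throughput(url, max_bytes=5_000_000):
    """Download up to max_bytes from url and return throughput in Mbit/s."""
    start = time.monotonic()
    received = 0
    with urllib.request.urlopen(url) as resp:
        while received < max_bytes:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            received += len(chunk)
    elapsed = time.monotonic() - start
    return (received * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    # Comparing results across different hosts, ports, and times of day can
    # hint at (but not by itself prove) destination- or protocol-specific
    # traffic shaping by an ISP.
    test_url = "http://example.com/testfile.bin"  # hypothetical test server
    print(f"Measured throughput: {measure_throughput(test_url):.2f} Mbit/s")
```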

So, now that we have visibility into traffic management practices on a large scale, remind me again why the FCC would need to mandate “net neutrality” requirements?  Why not just leave the matter up to the FTC to enforce each ISP’s terms of use under the agency’s existing authority to punish unfair and deceptive trade practices?  Won’t the threat of users switching to another broadband provider discipline ISPs’ traffic management?  (As long as ISPs have nationwide traffic management policies, even those users in areas lacking meaningful broadband competition will be protected from discriminatory network management practices by pressure in other markets.)

“If you believe that network neutrality government regulation is not needed, if you believe that the market will handle this … then you should also welcome Measurement Labs,” [Princeton Center for Information Technology Policy director Ed] Felten said. “What you are appealing to is a process of public discussion … in which consumers move to the ISP [Internet service provider] that gives them the best performance. It’s a market that’s facilitated by better information.”

Yes, it’s true (as the PCWorld article linked above points out) that a consumer might not be able to discern whether apparent degradation of their traffic was actually caused by the ISP or whether it might be the result of, say, spyware or simple Internet congestion.  But they don’t need to figure that out for themselves.  Although the relatively small percentage of users who install this tool are likely to be highly sophisticated (at least the early adopters), all they need to do is “sound the alarm” about what they think might be a serious violation of “net neutrality” principles, and a small cadre of technical experts can do the rest: examining these allegations to determine what ISPs are actually doing.

Sure, there will be false alarms and of course many advocates of “net neutrality” regulation will still insist that ISPs shouldn’t be allowed to practice certain kinds of network management, no matter how transparently the ISPs might disclose their practices.  But the truth will emerge, and in the ongoing tug-of-war between public pressure and ISPs’ practical needs to manage their networks smartly, between the desire of some to have practices disclosed very specifically and the ISPs’ desire to maintain operational flexibility, I suspect we’ll find a relatively stable (if constantly-evolving) equilibrium.  It won’t be perfect, but do we really think government bureaucrats will do a better job of finding that happy medium?