Witness the majesty of the internet: Less than two months ago I blogged on this site about an idea to build a website to crowdsource the task of rating the slew of “shovel-ready” projects proposed by localities. I asked for volunteers to help develop the site, and to my amazement, it worked. With the help of all-around heroes Peter Snyder and Kevin Dwyer, today we launch StimulusWatch.org.

Stimulus Watch looks at the 10,000+ projects listed in the U.S. Conference of Mayors’ “MainStreet Economic Recovery Report.” The mayors and local officials around the country have asked that these projects be funded with federal money. Here are some of the proposed projects of interest to readers of this blog:
Once the stimulus bill passes, however, not every project will be funded. The agencies that administer the federal grant-making programs, which Congress will fund through the American Recovery and Reinvestment Act, will have to decide which of these projects to fund.
As I am getting ready to watch the Super Bowl tonight on my amazing 100-inch screen via a Sanyo high-def projector that cost me only $1,600 on eBay, I started thinking back on how much things have evolved (technologically speaking) over just the past decade. I thought to myself, what sort of technology did I have at my disposal exactly 10 years ago today, on February 1st, 1999? Here’s the miserable snapshot I came up with:
- 10 years ago today, I did not own a high-definition television set, as they were too expensive (I bought my first one from Sears on an installment plan a few months later. It was a boxy 42-inch, 4×3 monstrosity that rolled around on the floor on casters and it took up half the room). Moreover, only a few HDTV signals could be picked up locally and none were yet available from my cable or satellite provider.
- 10 years ago today, the biggest television in my house was a 32-inch 4×3 ProScan analog set, which I thought was massive. (Of course, it was massive in terms of weight: over 125 lbs.)
- 10 years ago today, I was still using a dial-up, 56k narrowband Internet connection, even though I lived in downtown Washington, DC, just 6 blocks from our nation’s Capitol.
- 10 years ago today, my computer was a Compaq laptop that weighed more than my dog, had barely any storage or RAM, and had a screen that was only slightly brighter than an Etch-A-Sketch.
- 10 years ago today, I was still occasionally using an old CompuServe e-mail address that had nine digits in it. (But at least I wasn’t one of the 20 million or so people paying $20 per month to graze around inside AOL’s walled garden!)
- 10 years ago today, I was still backing up files on 3 1/2 inch floppy disks. I had boxes full of those things. (And, sadly, I still had 5 1/4 inch floppies in my possession that I was saving “just in case” I ever needed those old files. Pathetic!)
Slashdot user “flyingsquid” suggests why Blizzard has had such a winning streak in federal copyright cases:
You know, this could just be a coincidence, but a couple of weeks ago I was in Northrend and I ran into an orc named “JudgeCampbell”. He had some pretty sweet weapons and armor he was showing off, including a Judicial Robe of Invincibility and a Judge’s Battle Gavel of The Dragon, which did an unreal amount of damage. Also, he had all these really powerful spells I’d never even heard of before, such as “Contempt of Court” and “Summon Bailiff”. To top it all off, he had like 200,000 gold. I asked where he’d gotten all this stuff and he said he’d just “found it all in some dungeon”. It sounded kind of fishy to me, but I didn’t think much of it at the time.
I demand an investigation!
Russ Roberts’s excellent EconTalk podcast had an especially good episode last week, featuring Eric Raymond of “The Cathedral and the Bazaar” fame. ESR does a great job of explaining the economics of free software. He also offers a take on the network neutrality debate that is more reflexively hostile to the telcos than I think is justified, but that nonetheless gets the big points right: network neutrality is important, but government regulation isn’t a good way to protect it. He discusses his views in more detail here.
One minor quibble I had with ESR’s presentation: he distinguished Wikipedia from free software projects by saying that software could be judged objectively (either it works or it doesn’t) while editing Wikipedia is an inherently subjective activity. He suggested that for this reason, Wikipedia doesn’t work as well as free software. I think this ignores the central role of verifiability in Wikipedia’s editing process. The truth may be a matter of opinion, but it’s usually not a matter of opinion whether reliable sources have or haven’t made some claim. And as long as most of the reliable sources agree, which they generally do, it’s possible for an impartial observer to compare a sentence in Wikipedia with the corresponding sources and see if the sentence is a fair summary of the source.
Of course, this doesn’t work in every circumstance. Some topics are so intensely controversial that there is wide divergence among reliable sources, or sharp disagreement about which aspects of a topic to focus on. There’s just no getting around the fact that the Wikipedia articles on George W. Bush or abortion are going to be the subject of perpetual edit wars for years to come. But these articles are a relatively small fraction of what Wikipedia does. There are lots and lots of topics that are not especially controversial, and in those contexts Wikipedia’s decentralized editing process converges on the “right” answer (as judged by comparison to reliable sources) remarkably quickly.
On the flip side, it’s worth remembering that the free software movement has had a few bitter rivalries of its own over the years. Most of the time, free software converges on a reasonable answer and people walk away happy. Sometimes they don’t. Both free software and Wikipedia work astonishingly well most of the time.
Julian Sanchez at Ars Technica delivers some unsettling news about the state of free speech in America’s education system:
A federal court has rejected a former student’s First Amendment suit against school officials who punished her for calling them “douchebags” in a LiveJournal post. Right now, the scope of student rights to online speech is anything but clear.
This case centers on Avery Doninger, a graduate of Lewis S. Mills High School, who called school administrators “douchebags” on her LiveJournal blog. Why? Because of the “possible cancellation of a repeatedly-postponed student concert,” according to Sanchez. Avery, a student council member, was barred from running for reelection because she dropped this D-bomb.
The Supreme Court has wrangled with the issue of campus speech codes in the past and has drawn some unclear lines—at least to this untrained, non-lawyerly mind—about where free speech begins and ends for students. Sanchez speaks to this point as well, explaining the federal court’s difficulty with this decision:
Citing the blurry line between “on-campus” and “off-campus” speech in the Internet era, the court acknowledged that current law gives no clear answers to the question of where students’ rights to free online speech end and the authority of schools to enforce discipline begins.
It seems to me that the line should be clear. If you’re at school, you follow the rules. If you’re at a school event, like a football game or a debate tournament (that would have been me in school), then you follow the rules. But if you’re on your blog at home, you get to say whatever the hell you want.
The Washington Post reports that the Obama administration is delaying the Bush administration’s plan to require federal contractors to use the E-Verify worker background check system.
Criticizing the move, Lamar Smith (R-TX), ranking minority member on the House Judiciary Committee, says, “It is ironic that at the same time President Obama was pushing for passage of the stimulus package to help the unemployed, his Administration delayed implementation of a rule designed to protect jobs for U.S. citizens and legal workers.”
E-Verify may well have been designed or intended to protect jobs for citizens and legal workers, but that’s not at all what it would do. I wrote about it in a Cato Policy Analysis titled “Electronic Employment Eligibility Verification: Franz Kafka’s Solution to Illegal Immigration” (a ten-year follow-on to Stephen Moore’s “A National ID System: Big Brother’s Solution to Illegal Immigration”):
A mandatory national EEV system would have substantial costs yet still fail to prevent illegal immigration. It would deny a sizable percentage of law-abiding American citizens the ability to work legally. Deemed ineligible by a database, millions each year would go pleading to the Department of Homeland Security and the Social Security Administration for the right to work.
Even if E-Verify were workable, mission creep would lead to its use for direct federal control of many aspects of American citizens’ lives. It should be scrapped entirely, but short of that, the longer E-Verify is delayed the better.
In at least two recent stories, the mainstream press is highlighting the Obama administration’s slow-walking on transparency.
Bloomberg recently filed suit against the Fed under the Freedom of Information Act to force disclosure of securities the central bank is taking as collateral for $1.5 trillion of loans to banks.
“The American taxpayer is entitled to know the risks, costs and methodology associated with the unprecedented government bailout of the U.S. financial industry,” said Matthew Winkler, the editor-in-chief of Bloomberg News, a unit of New York-based Bloomberg LP . . . .
And here’s what President Obama said in his day-one memorandum on FOIA:
In the summer of 2000, while I was in college, I moved into a big house with 6 other guys. DSL was just coming on the market, and we were big nerds, so we decided to splurge on fast Internet access. Back then, “fast Internet access” meant a blazing-fast (Update: 512k) DSL connection. We had to pay the phone company about $65/month for the line. And we paid our Internet Service Provider $55/month for the connectivity and 8 static IP addresses (thanks to local loop unbundling, these were separate services). For $120/month we got to live in the future, enjoying connectivity 10 times faster than the 56k modems that almost everyone had at the time.
Adjusting for inflation, $120 of 2000 money is about $140 of 2009 money. So I was interested to see that St. Louis, MO, where I lived until recently, is about to get 60 Mbps Internet service courtesy of Charter, the local cable monopoly. Had I stayed in St. Louis for another year, I would have been able to get 120 times the bandwidth for the same inflation-adjusted cost as the broadband access I had less than a decade ago.
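If you want to check that multiple yourself, here is a minimal back-of-the-envelope sketch. It assumes the 512k DSL figure from 2000 and Charter’s advertised 60 Mbps tier; the inflation factor is just the $120-to-$140 adjustment implied in this post, not an official CPI number.

```python
# Rough check of the bandwidth and price comparison above.
# Assumptions: 512 kbps DSL (summer 2000) vs. Charter's 60 Mbps tier (2009);
# the inflation factor simply mirrors this post's $120 -> $140 adjustment.

old_speed_kbps = 512          # shared DSL line, circa 2000
new_speed_kbps = 60 * 1000    # 60 Mbps cable service

speedup = new_speed_kbps / old_speed_kbps
print(f"Bandwidth multiple: ~{speedup:.0f}x")  # ~117x, i.e. roughly 120x

price_2000 = 120                # $/month in 2000
inflation_factor = 140 / 120    # implied by the post's adjustment
print(f"2000 price in 2009 dollars: ~${price_2000 * inflation_factor:.0f}")
```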
It has almost become a cliché to lament the dismal state of America’s broadband market. There do seem to be countries that are doing better than we are, and we should certainly study what they’ve done and see if there are ideas we could adapt here in the States. But I also think a sense of perspective is important. I can’t get too upset about the possibility that in 2018 Americans might be limping along with 2 Gbps broadband connections while the average Japanese family has a 20 Gbps connection.
Reminder: Next Wednesday, February 4th, the Cato Institute will host a book forum on David Post’s new book, In Search of Jefferson’s Moose: Notes on the State of Cyberspace.
Comments will come from Clive Crook, Chief Washington Commentator of the Financial Times and Senior Editor of The Atlantic Monthly; and Jeffrey Rosen, Professor of Law at The George Washington University and Legal Affairs Editor of The New Republic.
It’s a very interesting book, and the commentators are second to none.
Register here.
And here’s Adam’s review of the book.
On the first full day of the new Obama administration, I wrote here, and later followed up, expressing regret that the Obama White House hadn’t ported the “Seat at the Table” program over from the transition. Change.gov published documents submitted to the transition on its Web site for public review and comment. Whitehouse.gov does not.
Now we learn that the White House will not honor an Obama campaign and Whitehouse.gov pledge – not more than nine days old – to post all non-emergency legislation on the White House Web site for five days before the President signs it.
One significant addition to WhiteHouse.gov reflects a campaign promise from the President: we will publish all non-emergency legislation to the website for five days, and allow the public to review and comment before the President signs it.
President Obama signed the “Lilly Ledbetter Fair Pay Act of 2009” into law today, one day after Congress delivered it to him. And there’s the law, posted on Whitehouse.gov for public review. But it sure hasn’t been up for five days. And it’s not emergency legislation: bills like it have been floating around in Congress since at least June 2007.
If I was a little demanding about transparency from day one, it was a bit of counterpoint to folks who were getting dewy-eyed about Obama’s transparency promises. Those were simply words. Judging by the Whitehouse.gov screen cap below, transparency got thrown over the side for a photo op. Welcome to Washington.
Update: Just got an email that helps illustrate why the sound practices of letting legislation cool and taking public comment would go by the wayside. Getting credit from the ACLU is much more important than pleasing the relatively tiny coterie of transparency fans – and there is almost no expectation among the public that a White House should practice good lawmaking hygiene.