I knew it couldn’t be that easy. The TGDC rejected NIST’s proposal (which I discussed on Friday) to decertify paperless e-voting machines, after the measure failed to get the 8 votes needed for approval:

Committee member Brit Williams, a computer scientist who has conducted certification evaluations of Georgia’s paperless electronic voting system, opposed the measure. “You are talking about basically a reinstallation of the entire voting system hardware,” he said.

Mike Masnick points out how ridiculous this is:

Why yes. Yes we are. That’s because the entire voting system hardware is totally screwed up. So, to be more specific, we’re talking about stopping an e-voting program that has serious problems and has raised plenty of legitimate questions about just how fair and accurate our elections are. That seems like a perfectly valid reason that shouldn’t be tossed aside just because it’ll be a lot of work. We also thought that democracy itself was supposed to be hard work, but apparently some of those on the Technical Guidelines Committee disagree.

HONG KONG, December 4, 2006–Global regulators had a mixed message for the telecommunications industry here on Monday: Governments should ease restrictions on companies in the presence of competition–but otherwise tighten them.

At the opening forum at Telecom World 2006, government officials from China, Hong Kong, the European Commission and International Telecommunications Union delivered a similar message, but with varying degrees of specificity.

Viviane Reding, the European commissioner responsible for information society and media, was the most direct: “Competition and open markets drive investment and innovation. Monopolies don’t.”

Reding and the other regulators spoke here at the triennial conference of the ITU, a hybrid international body based in Geneva, Switzerland, that is part of the United Nations. The union represents telecommunications companies as well as U.N. member states.


I’ve written at Cato@Liberty before about how Web 2.0 business models, particularly Google’s, are in conflict with current Supreme Court privacy cases denying people a Fourth Amendment interest in information they have entrusted to third parties.

Now comes a very interesting Information Week report on last month’s Web 2.0 Summit:

None other than Google–which has profited enormously from the data users submit to its services and from the data its users generate through use of its services–is thinking seriously about how to give users more control over their data. Though stopping short of a complete data emancipation proclamation at the Web 2.0 Summit, CEO Eric Schmidt said, “The more we can let people move their data around . . . the better off we’ll be.”

And the better off users’ privacy will be.

Nike+iPod = surveillance?

December 4, 2006

I’m a happy user of the Nike+iPod Sport Kit. It’s an add-on for iPods that tracks your running: how far, how long, pace, calories burned, etc. It also lets you track your progress toward a goal or challenge other Nike+iPod users to races. It works by pairing a radio receiver attached to your iPod with a radio transmitter placed in your shoe.

However, as those of us who follow such things know, nothing perks up the ears of privacy activists more than the words “radio transmitter” and “shoe” in the same sentence. Their ears must be at their perkiest now that researchers at the University of Washington have issued a report claiming that the Nike+iPod kit can be used to track its wearer. Wired News reports in its usual alarmed tone:

If you enhance your workout with the new Nike+iPod Sport Kit, you may be making yourself a surveillance target.

A report from four University of Washington researchers to be released Thursday reveals that security flaws in the new RFID-powered device from Nike and Apple make it easy for tech-savvy stalkers, thieves and corporations to track your movements. With just a few hundred dollars and a little know-how, someone could even plot your running routes on a Google map without your knowledge.
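For what it’s worth, the mechanism the researchers describe is conceptually simple: the shoe sensor broadcasts a fixed unique ID whenever it senses footsteps, so anyone running a compatible receiver can log sightings of that ID and string them together into a trail. A toy sketch of the idea (every ID, location, and timestamp here is hypothetical, not from the report):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sightings a network of receivers might log:
# (sensor_id, receiver_location, timestamp)
sightings = [
    ("0xA1B2C3", "park_entrance", datetime(2006, 12, 4, 7, 2)),
    ("0xA1B2C3", "coffee_shop", datetime(2006, 12, 4, 7, 41)),
    ("0xD4E5F6", "park_entrance", datetime(2006, 12, 4, 7, 5)),
    ("0xA1B2C3", "office_lobby", datetime(2006, 12, 4, 8, 15)),
]

# Group sightings by sensor ID; because the ID never changes,
# each sensor's sightings form a movement trail.
tracks = defaultdict(list)
for sensor_id, location, when in sightings:
    tracks[sensor_id].append((when, location))

for sensor_id, trail in tracks.items():
    trail.sort()  # chronological order
    print(sensor_id, "->", [loc for _, loc in trail])
```

The sketch shows only that a static identifier plus distributed receivers yields a movement log; whether that amounts to a practical “flaw” in this product is a separate question.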

Below the fold I’ll explain why there are no security “flaws” and you shouldn’t be worried if you own one of these devices.


The Seattle Times has a fascinating article that nicely illustrates the inefficiencies of central planning:

It’s worth noting just how complex Vista became. BusinessWeek estimates it took 10,000 employees about five years to ship Vista.

In an interview with Microsoft Chief Executive Steve Ballmer a few weeks ago, I asked if he had added up how much money it cost to develop Vista. He laughed, “I can’t say I have. It would be impossible to count up. … I’m sure it’s a lot.”

If we assume Microsoft’s costs per employee are about $200,000 a year, the estimated payroll costs alone for Vista hover around $10 billion. That has to be close to the costs of some of the biggest engineering projects ever undertaken, such as the Manhattan Project that created the atomic bomb during World War II. And while Microsoft toiled on Vista, its stock price stayed flat.
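The paper’s arithmetic checks out; here is a quick reproduction of the quoted estimate (the headcount, duration, and per-employee figures are the article’s back-of-the-envelope assumptions, not audited numbers):

```python
# Back-of-the-envelope check of the quoted Vista payroll estimate.
employees = 10_000              # BusinessWeek's headcount estimate
years = 5                       # approximate development time
cost_per_employee_year = 200_000  # assumed fully loaded annual cost

total_payroll = employees * years * cost_per_employee_year
print(f"Estimated payroll: ${total_payroll / 1e9:.0f} billion")  # $10 billion
```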


Eight months. That’s how long it has been since AT&T and BellSouth asked the FCC for permission to merge. Although the merger has since been OK’d by the Department of Justice and by 18 state regulatory commissions, the Commission has yet to act. It’s not that Chairman Kevin Martin hasn’t tried to get the issue decided: three times in the past two months a vote has been scheduled, only to be put off, the last delay coming just a few days before the mid-term election.

The problem is that the Commission is deadlocked–two members supporting the merger, and two opposing it, reportedly insisting that net neutrality and other conditions be imposed.

Putting two and two together, you get four. But wait–the FCC has five members. The fifth, as it turns out, is Robert McDowell, the newest member of the commission. McDowell, however, has been recused from the issue, since he previously worked for CompTel, a trade group that opposed the merger.

In an unusual move, Chairman Kevin Martin has asked the FCC’s general counsel to allow McDowell to vote anyway. Such a step would be unusual, but not unprecedented: Democratic chairman Bill Kennard, for instance, was allowed to vote on a media issue in 2000 despite having previously represented broadcasters. Martin argues that, given the stalemate on the merger, McDowell’s vote is necessary to break the logjam.

Conflict-of-interest rules, of course, shouldn’t be tossed aside lightly. Commissioners, after all, may be prejudiced in favor of the side they used to work for. But this case has an unusual twist: all indications are that, despite his previous employer’s position, McDowell would support the merger. Rather than voting with his old employers, he would likely vote against them. There’s little danger that a salary once drawn from the merger’s opponents would bias him in their favor.

Martin’s request is both bold and sensible. Hopefully, the GC will approve it, and the FCC will finally vote on this long-pending merger.

Now that national cable franchise reform is dead in Congress, it looks as if the FCC is moving forward with its own proceeding on the issue. According to USA Today, “Federal Communications Commission Chairman Kevin Martin has proposed rules to make it easier for phone companies and others to jump into the video business.” According to the newspaper’s sources, the new rule would require localities to rule within 90 days on competitive franchise applications by phone companies and others with existing access to public rights-of-way. In a new article in the Journal on Telecommunications & High Technology Law (and a public interest comment), Jerry Ellig and I tell the FCC not only that it should preempt unreasonable local franchise practices, but how it can do so. One of the points we make is that requiring localities to act expeditiously on franchise applications is just a start: the FCC also has the power to curb unreasonable denials of franchises.

In our paper we calculate the cost of franchising to consumers, and it looks like the FCC has such costs in mind. According to the USAT article, “Martin is using the FCC’s upcoming annual report on cable TV prices as ammunition. FCC officials say the report shows that satellite TV and cable TV operators have settled into a cozy duopoly, keeping prices in a steady, upward climb. It shows the average price of cable TV in 2005 was $43.33 a month. Where satellite TV also was available, the average was $43.34. But in markets with another “wired” video provider, the price was dramatically less: $35.94. The upshot: Absent credible land-based rivals, cable TV prices will keep going up.”
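Taking the FCC’s averages at face value, the implied benefit of a second wired competitor is easy to compute (a rough illustration only, using just the figures quoted above):

```python
# Average monthly cable prices in 2005, per the FCC report quoted above.
duopoly_price = 43.33      # cable facing only satellite competition
competitive_price = 35.94  # cable facing a second "wired" provider

savings = duopoly_price - competitive_price
pct = savings / duopoly_price * 100
print(f"${savings:.2f}/month lower, about {pct:.0f}% less")
```

That works out to roughly a 17 percent discount where wireline competition exists–exactly the kind of number franchise reformers will point to.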

I’m very excited to welcome Drew Clark to the Tech Liberation Front as a regular contributor. Drew is a friend to all of us here at the TLF and is well-known in technology policy circles as one of the finest writers about the Digital Economy issues we cover here. Many of you will be familiar with his past work as a senior writer at National Journal’s Technology Daily, which is required reading for tech policy wonks. And you’ve probably seen some of his “Wired in Washington” columns on occasion, too. Or his frequent freelance work for major papers like The Washington Post or Slate. Or even his excellent chapter on “How Copyright Became Controversial” for a book I edited a few years ago. (OK, so you probably didn’t see that book, but at least make sure to read Drew’s chapter!)

Today, Drew is a Senior Fellow and Project Manager at the Center for Public Integrity. He has headed the Center’s “Well Connected” Project on telecommunications and the media since August 2006 and is responsible for the re-launch of the Center’s “Media Tracker” service. The Media Tracker is a free Internet database that allows Americans to see who owns the media and communications networks in their city and ZIP code and also allows users to examine the political contributions and lobbying expenditures by almost 300 telecom, media and technology companies.

Drew also blogs on his own Web site, www.drewclark.com and at the Center for Public Integrity’s “Telecom Watch Blog.” We are very excited he’ll also be sharing his thoughts with us here on the TLF.

Welcome aboard, Drew!

Over at CNet News.com today, Daniel Terdiman reports that “IRS taxation of online game virtual assets [seems] inevitable”:

That’s because game publishers may well in the not-too-distant future have to send the forms–which individuals receive when earning nonemployee income from companies or institutions–to virtual world players engaging in transactions for valuable items like Ultima Online castles, EverQuest weapons or Second Life currency, even when those players don’t convert the assets into cash. Most governments are only beginning to become aware of the substantial economic activity in online games, but the games’ rapid growth and the substantial value of the many virtual assets changing hands in them is almost certain to bring them into the popular consciousness. “Given growth rates of 10 to 15 percent a month, the question is when, not if, Congress and IRS start paying attention to these issues,” said Dan Miller, a senior economist with the Congress’ Joint Economic Committee, who is also a fan of virtual worlds. “So it is incumbent on us to set the terms and the debate so we have a shaped tax policy toward virtual worlds and virtual economies in a favorable way.”
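Miller’s “10 to 15 percent a month” deserves a moment’s pause, because monthly growth rates compound dramatically over a year. A two-line check, using only the growth figures quoted above:

```python
# What "10 to 15 percent a month" compounds to over a full year.
for monthly in (0.10, 0.15):
    annual = (1 + monthly) ** 12 - 1
    print(f"{monthly:.0%} per month -> about {annual:.0%} per year")
```

At those rates a virtual economy roughly triples to quintuples annually, which goes a long way toward explaining why the IRS is paying attention.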

My problem with all this is not just that I am a rabid, anti-tax libertarian. It’s that we’re putting the cart before the horse: we haven’t even figured out what sort of governance structures will be imposed within most of these virtual worlds, yet we’re already discussing how “Meat Space” (tangible-world) taxes should be applied to cyberspace worlds. Sounds like old-fashioned “taxation without representation” to me.

We first need to figure out a lot of other basic things about virtual world governance before rushing to impose real world taxes. What sort of property rights will apply? What about copyrights? (See my previous essay on that issue here). How will contracts be enforced? Etc, etc. And, to the maximum extent possible, these things should be decided by the Net-izens living in those virtual worlds before any Congress critters or IRS bureaucrats try to impose taxes on virtual worlds they likely have never even visited.

I recently received a pair of reports on critical infrastructure protection in the mail, and have now had a chance to read them. Both are written by Kenneth Cukier, reporter for The Economist. They are well-written, thought-provoking, balanced, and blessedly brief. They summarize a roundtable and a working group convened by an organization I had not heard of before called The Rueschlikon Conference.

One is called Protecting Our Future: Shaping Public-Private Cooperation to Secure Critical Information Infrastructures. The other is Ensuring (and Insuring?) Critical Information Infrastructure Protection. They focus on an important question: How do we make sure that the facilities of our networked economy and society survive terrorist acts and natural disasters?

I want to come back to the ‘compliment’ I gave both papers: “balanced.” The first report finds, among other things, that we should “harness the power of the private sector” and “use market forces” to protect critical information infrastructures. It notes that Wal-Mart had 66% of its stores in the region of Hurricane Katrina back in operation 48 hours after the storm. It also notes how, with electrical lines downed by Katrina, BellSouth’s backup generators had kicked in. When fuel supplies ran low, government officials confiscated the fuel being trucked in to keep them running. Yet, for reasons I cannot discern, the report maintains that “public-private cooperation” is what’s needed rather than getting the public sector out of the way.

The second report finds that the marketplace lacks the incentives needed to protect critical infrastructure–yet it also finds that the insurance industry can create a market for security. It has to be one or the other. The “balance” of these reports looks more and more like simple contradiction.

A telling line can be found in the second report: “[O]ne person expressed skepticism that relying on the market to solve [critical information infrastructure] security would work, since it seemed to fall too neatly into the modern ideological mantra that markets solve all problems.” In other words, a conclusion in favor of market solutions was avoided because it might further validate markets as a problem solving tool. The uncomfortable seeking after balance in these otherwise good reports may reflect an ideological preference for government involvement–despite the harm that did in the case of Hurricane Katrina.

It is insufficient, of course, merely to identify ideological bias (or anti-ideological bias?) in the reports. I did find them useful and interesting, and they inspired a few thoughts that deserve more exploration:

1) Antitrust law thwarts communication among the companies responsible for infrastructure protection. Rather than convening so many government working groups, we should address the root of the problem in antitrust law.

2) Government secrecy is undoubtedly one of the things keeping the insurance industry from having the confidence to insure against terrorism risk. As a result, insurers do not promulgate better terror-security practices among their insureds, and a valuable tool in the struggle against terrorism lies unused. Rather than subsidies, the government should give the insurance industry information.

3) People interested in these issues should attend or watch Cato’s upcoming forum on John Mueller’s book Overblown: How Politicians and the Terrorism Industry Inflate National Security Threats, and Why We Believe Them.