August 2007

Freeing the Journal

by on August 2, 2007 · 0 comments

Relatedly, Ingram has an interesting post on whether Rupert Murdoch should make the Wall Street Journal's website available for free:

I know that many newspapers have looked to the Journal as a model for what a paper can do online, because it is one of the few that has charged for its content from the very beginning and built what appears to be a successful business doing so. But does it make sense now? This Wall Street Journal story notes that Murdoch commissioned a study that looked at what going free would mean for the paper, and from that he concluded that while readership would grow by a factor of 10, advertising would likely only grow by a factor of five, and the loss of subscription revenue would effectively make the whole thing a wash. In other words, maybe it's not worth it.

It’s not clear whether the study looked at the short run or the long run, but it seems to me that if the short-run financial outcome is anywhere close to a wash, then it’s stupid to keep the paywall. Because the biggest harm a paywall does is dramatically limit a site’s long-term growth potential. People who currently like the Journal enough to pay for it will likely keep doing so. But given the massive amount of information out there, most people just leaving college are likely to opt for one of the Journal’s free competitors.

Moreover, being free brings a wealth of ancillary benefits that don’t show up in the bottom line right away. As a blogger, I almost never link to stories behind paywalls because I can usually find a free version of the same story. My impression is that other bloggers tend to act the same way. So as the blogosphere becomes an increasingly important source of traffic, paywalls will become more of a liability. If Murdoch can eliminate the paywall and completely replace the lost revenue with ads, that seems like a no-brainer to me.

I first became aware of the massive statistical infrastructure of the U.S. government because many of its data collections have privacy consequences. The Census Bureau, for example, has turned a simple instruction to count people ("[An] Enumeration shall be made within three Years after the first Meeting of the Congress of the United States, and within every subsequent Term of ten Years, in such Manner as they shall by Law direct.") into a large organization with lots and lots of different information products.

Recognize that the process of collecting, compiling, and disseminating information is an economic function. This Federal Register notice on federal statistical practices does so, with some interesting spin:

To operate efficiently and effectively, our democracy relies on the flow of objective, credible statistics to support the decisions of governments, businesses, households, and other organizations. Any loss of trust in the integrity of the Federal statistical system and its products could lessen respondent cooperation with Federal statistical surveys, decrease the quality of statistical system products, and foster uncertainty about the validity of measures our Nation uses to monitor and assess its performance and progress.

Without us, you’d be lost! And if federal statistical agencies just disappeared, that would be true. Statistics are an important tool of government and business.

But . . . is it government's responsibility to develop and deliver statistics to business? Or is that another dimension of corporate welfare? Here in Washington, there are statistics "user" organizations whose mission is to preserve the flow of data from government to business – to collect another set of goodies for free – or, more accurately, at the taxpayers' expense.

In my former life as a lobbyist/consultant, I represented a very cool satellite remote sensing company called Digital Globe. You’ve probably seen their stuff on Google Maps, and TV networks often use their imagery to illustrate news stories. Given my libertarian predilections, I was painfully aware that this company was in competition with a rather substantial competitor, the U.S. government, and in this area, like so many, the competition was hydra-headed.

If the market for geospatial data weren’t already well occupied by government suppliers, Digital Globe and its many competitors would be producing better information products than are available in the government-dominated market, and the costs of doing so would fall where they should – directly on users.

I’m not asking you to be convinced yet, but just think about having the corporate sector pay its own way for the data and statistics it uses.

Via Matthew Ingram (I'm catching up on RSS feeds), Jack Shafer makes an important observation about today's prestige newspapers: their staffs are significantly larger than in the glory days of the 1970s. Although Shafer's original piece overstated the case, the numbers are still significant: The New York Times has apparently grown from 500 reporters and editors to 750, while the Washington Post has grown from 340 to 550. In other words, each is about 50 percent larger than it was in the 1970s.

Shafer then quotes Post and Times officials explaining why news would suffer from a reduction in headcount to 1970s levels. Apparently fluff stories are bigger revenue drivers than they were in the 1970s, so the hard news headcount would have to be cut below 1970s levels to keep the paper profitable.

But I think this dramatically underestimates how much easier a reporter’s job is today than in the 1970s. There’s a wealth of original materials available online that makes fact-checking easier. There’s a massive distributed reporting system called the blogosphere that helps reporters dig up leads and provide instant feedback. There are people all over the place with cell phones capable of capturing photos and even video. There are sites like YouTube and Flickr that help aggregate and organize this wealth of material.

Obviously, there are still some stories where there's no substitute for picking up the phone and calling sources, or for hopping on a plane to see a story first-hand. We still need some reporters to do that. But the job of a reporter these days is far more oriented toward synthesizing and summarizing material that's already out there – often simply translating technical source documents into plain English.

Moreover, one of the points Shafer makes is that the Times and the Post relied far more on wire stories in the 1970s than they do today. There's no reason to think there'd be much loss in story quality if papers did more of this today. The Times could cut its staff covering technology and instead feature content from CNet or Wired (obviously, they'd want to feature some of CNet's less geeky or esoteric content, but I'm sure CNet would be happy to produce some less-geeky stories to accommodate them). Many large web properties already do this, and there's every reason to think this process could continue without significantly harming the quality of news coverage.

The next decade may bring wrenching adjustments for reporters used to secure positions at large newspapers, but there's little reason to think that the quality of news will suffer as a result. Quite the contrary, thanks to the Internet, the average American has access to far more and better news than he did 20 years ago. Any diminution in the quality of newspaper reporting will be small compared with the benefits of being able to choose from dozens of high-quality news sites.

E-Voting in The Hill

by on August 2, 2007 · 0 comments

In The Hill today, Lawrence Nordin and I make the case that the Holt e-voting bill, while far from perfect, would be a step toward more secure elections.

DRM: Not Secure!

by on August 2, 2007 · 0 comments

Alexander Wolfe points out that every DRM system known to man has been cracked. Slashdot seems to think this is news.

Ars reports that KSR v. Teleflex is beginning to have a real impact on the outcome of software patent litigation:

Friskit filed a patent infringement lawsuit against RealNetworks in 2003 that sought over $70 million in damages. In a ruling issued last week, Judge William W. Schwarzer granted RealNetworks' motion for summary judgment, citing "Real's clear and convincing evidence of obviousness."

Judge Schwarzer cited the Supreme Court’s decision on KSR v. Teleflex in his opinion. “Two principles from the Supreme Court’s recent opinion in KSR Int’l Co. v. Teleflex Inc. guide the analysis of whether sufficient difference exists between the prior art and Friskit’s claims to render the patents nonobvious,” he wrote. The first of those is patents that rearrange old elements to create a new—but obvious—combination. The second comes from situations where a person of “ordinary skill” pursues known options, and the result is the product of “ordinary skill and common sense.”

"All of the individual features of Friskit's patents which allow a user to easily search for and listen to streaming media existed in the prior art," noted the judge, who went on to cite a number of media player . . .

Good for Judge Schwarzer. This bodes well for Vonage.

The New York Times has a story on voting reform that suggests an explanation for something that’s puzzled me for a while. One of the consistent patterns you’ll find in the e-voting debate is that state election officials tend to side with e-voting vendors rather than with security experts. This always struck me as a little bit puzzling, because the case against e-voting isn’t that hard to understand, and people who work with these technologies every day, of all people, should be able to understand them.

One explanation is that once a state has chosen a particular voting technology, its officials get egg on their faces if they subsequently have to admit that the technology in question is a disaster. But some voting officials' vehemence, especially as documented by Avi Rubin, seemed too strong to be explained purely as not wanting to admit your own mistakes.

Things make more sense if there’s a revolving door between state election officials and voting equipment vendors. You don’t even have to imagine explicit corruption. If many of your friends and former colleagues work for e-voting vendors, you’re more likely to believe them than some Ivory Tower security researcher you’ve never heard of.

I also think this is another reason that touch-screen voting machines are a bad idea—even with paper trails, audits, and the rest. Voting machine vendors have an incentive to make their products as complicated as possible so that they can charge the state more money for them. Making a touch-screen machine more secure means buying more hardware—fancier printers and diagnostic and auditing tools. On the other hand, making paper balloting more secure mostly means investing more in human inputs—hiring more election observers, giving election judges more training, conducting more hand recounts. Those aren’t things for which voting equipment vendors can charge a premium.

A voting machine with a paper trail is still a lot better than a voting machine without one. So I hope the Holt bill passes. But I would be much happier if Congress passed a law simply outlawing the use of touch-screen voting machines (perhaps with an exception for disabled voters). Such a bill would be a lot shorter and less intrusive, because it wouldn't include all these extra provisions aimed at papering over the weaknesses of DRE+printer combinations.

The FCC's 700 MHz plan adopted yesterday embraces, for the most part, Frontline Wireless's plan for a national public safety network. It's really an amazing thing considering that nine months ago Frontline Wireless didn't exist (at least not in public), while Cyren Call had been making noise for months. As I've said before, I'm not crazy about Frontline's plan, but I like it better than Cyren Call's ill-fated proposal. That said, here are the pros and cons of the new rules as I see them (and without the benefit of the actual rules in front of me, because the FCC apparently hasn't heard of this publishing technology called the World Wide Web).


Randy May of the Free State Foundation has a good piece out today, picking up on a prediction by the investment firm of Stifel Nicolaus that the exact meaning of "open access" under yesterday's 700 MHz decision likely won't be determined for years. Stifel Nicolaus says 2009 is the likely date – that strikes May (and me) as optimistic, given the eight years it took to settle the unbundling rules under the 1996 Telecom Act.

This definitional long tail has consequences, May points out. This is because the veritable economic theorem that "people don't want to provide a pig in a poke" holds true, even for the FCC. "Think about it," he says. "In how many auctions have you bid when the rules concerning what you can do with your winning bid won't be known until several years later?"

A good, but hardly reassuring, point. So you might as well get comfortable. This may go on for a while.

Anti-Point-by-Point

by on August 1, 2007 · 7 comments

This week's cavalcade of 700 MHz posts was an interesting opportunity to see and explore the divisions on the issues among TLFers and with our friends in the commentsphere.

Having just finished a rejoinder in one, I noticed that a couple of these posts broke out into point-by-point discussions. The response I just wrote to TLF friend Doug Lay stretched to about two pages of text – and I never want to do that again!

Is there some practice we should all engage in to minimize excessively long point-by-point discussions? They are very time consuming, and they may not benefit visitors to the site very much – other than those of us point-by-pointing each other.

True, these are multi-faceted issues, but it might make sense for commenters and TLFers both to focus carefully on the precise thrust of each post, as best we can discern them. Alternatively, when a point-by-point breaks out, we could make the tangents into new posts and let the discussions of each blossom in its own little hothouse.

Ideas?

(A point-by-point response in the comments to all I’ve said here would not – well, yes it would – be funny.)