September 2009

September 8 — this Tuesday — is the deadline for filing objections against the Google Book Settlement. A number of trade associations, corporations, authors, and advocacy groups have weighed in, including the Electronic Frontier Foundation and the American Civil Liberties Union. They argue that approving the Google Book Settlement in its current form, without explicitly spelling out data collection practices, would endanger user privacy. EFF and ACLU have threatened to file an objection to the Settlement unless Google commits to a stringent privacy policy for Google Book Search.

I think the privacy risks posed by Google Book Search are being blown out of proportion, as I explained in the Examiner Opinion Zone last month. While EFF and others have raised some legitimate fears about the possibility of government getting its hands on Google Book Search user data, these privacy concerns are not unique to Google Book Search, nor are they legitimate grounds for the court to reject the Google Book Settlement.

In a letter I submitted yesterday as an amicus curiae brief to U.S. District Judge Denny Chin, who is presiding over the Google Books case, I argue that privacy concerns should not determine the court’s evaluation of the Settlement:

Competitive Enterprise Institute Letter

I’ve noted that Google and Microsoft both face what Clayton Christensen famously called the “Innovator’s Dilemma” in trying to handle disruptive innovation in search technology. But noting Microsoft’s innovations in bringing social functionality to search with its “Ping” tools in Bing, I pointed out a few days ago that, “Microsoft, with less to lose and without a huge installed user base to worry about annoying by violating Google’s ‘Prime Directive’ of elegant simplicity, may have an easier time introducing ‘disruptive’ innovations to search than Google.”

The trick will be for Microsoft to find ways of promoting radical innovation from inside, despite the forces of inertia inherent in any large company. One way to do that, as I noted, would be by imitating Google’s “20 percent” program. But a more radical way would be for Microsoft to make Bing a “skunkworks” much like Lockheed Martin’s original “skunkworks,” Xerox’s Palo Alto Research Center (PARC), AT&T’s Bell Labs, GM’s Saturn Motors—or Microsoft’s own XBox. That’s precisely what SEO guru Rand Fishkin (CEO of SEOmoz) suggests Microsoft needs to do to “get serious” in an interview with Affilorama:

I think Google[‘s search market share] could be reduced from like 85% to like 75%, and you could see Microsoft, basically Bing taking over 25%. I don’t think they’ll get more than that. I don’t think they have the ability to do it. Until or unless they are willing to do with Bing what they did with Xbox.

So Microsoft had, you know, the game market was well established – Sony competing head to head with Nintendo and other players like Neo Geo coming in and this kind of thing and how is Microsoft going to win this? They didn’t know the first thing about it, you know, they weren’t in this field. So what they did with XBox is they made it a startup. They didn’t even put it on Microsoft campus, they made it a different team of people who were only reporting to Xbox people, they basically built a separate company. The fact that it was owned by Microsoft just means that they get the benefits of the cash and the relationships. That’s extremely powerful. The fact that they’re unwilling to do this with search tells me they’re not serious about it. Right? So you might hear like Steve Ballmer and other executives from Microsoft say like “search is very important to us, we’re really serious about it”. I think it’s like “serious to them” and I’m using air quotes here, like serious to them in the same way that Google says “competing with Microsoft Office is serious to us”. It’s just sort of like, “Oh yeah?! You’re going to fight us there, well we’re going to fight you on this front!” Like, serious my ass. I just don’t see it.

If they do get serious and spin it out, I’ll be interested – I’ll be very interested if it becomes its own startup, if it becomes like its own XBox, that kind of thing, that could be exciting – that could be interesting.

I’m pleased to welcome Brooke Oberwetter back to the TLF after a 2.5-year stint working for The Man. Make no mistake about it: she’s a hard-core TechLiberationista, having worked as a policy analyst at the Competitive Enterprise Institute and a research assistant at the Cato Institute. She’s now a freelance writer in Washington, DC. (In fact, she lives just down the street from me on the Yuppie Frontier of Shaw!)

Brooke achieved international celebrity as “The Jefferson 1” after she was arrested in a non-violent, silent iPod-toting flashmob celebration of Thomas Jefferson’s birthday at the Jefferson Memorial on April 13, 2008.  I was there that night to see the petty tyranny of the State, that Coldest of all Cold Monsters, in action. I can only say that we couldn’t have asked for a better or more articulate martyr for the cause of Liberty. See for yourself what happened:

One of the more puzzling changes in Apple’s newly released Mac OS X Snow Leopard operating system is that it now reports file sizes and storage capacity in base 10 units instead of base 2 units.

Until now, operating systems have always displayed file sizes in base 2 units. When measured this way, a gigabyte is 1024³ bytes (1,073,741,824 bytes).

But when measured in base 10 units, a gigabyte is taken to mean 10⁹ bytes (1,000,000,000 bytes). It’s not surprising that hard drive manufacturers generally make a practice of slapping a base 10 measurement on the outside of the product box, presumably to make the total number of gigabytes (or terabytes) appear larger.

Apple’s switch to base 10 measurement was obviously an attempt to put an end to consumer confusion. It probably seemed like an easy way to eliminate pesky calls to customer support from users wondering why the 250 GB hard drive on their new MacBook was showing less than 250 GB total capacity when measured from inside the operating system.
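The arithmetic behind those support calls is simple to sketch. Using the 250 GB MacBook drive from the example above:

```python
# A drive marketed as "250 GB" holds 250 * 10^9 bytes (base 10).
marketed_bytes = 250 * 10**9

# An OS reporting in base 2 units divides by 1024^3 bytes per "GB",
# so the same drive appears smaller on screen.
base2_gb = marketed_bytes / 1024**3

print(f"{base2_gb:.1f} GB")  # roughly 232.8 GB, not 250
```

The "missing" 17 GB or so never existed; the two sides were simply using different definitions of the same unit.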

In fact, the consumer confusion resulting from the hardware and software industries’ inconsistency of usage has been a problem for years, even resulting in a number of class-action lawsuits.

There have been attempts to deal with the problem. Over a decade ago, the International Electrotechnical Commission created a number of new binary prefixes in IEC 60027-2. Under this system, 1024³ bytes is a gibibyte (or GiB). While this might seem like an elegant solution to an engineer or an etymologist, it fails to make things any clearer for most consumers.

There’s no easy solution to the problem, but it’s clear that all this nonsense should have been avoided in the first place.

I blame Congress.

While they have spent their time debating and legislating all manner of things clearly outside the scope of their constitutional authority, they have neglected their simple enumerated obligations. Article I, Section 8 of the Constitution gives Congress the power “…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures…”

They dropped the ball.

Polish designer Jacek Utko acknowledged that, in the long run, nothing can save the newspaper as a print medium, but he makes a pretty good case that newspapers’ ability to stay afloat while figuring out how to make the transition to digital media depends heavily on shaking up the graphic design and layout of papers.

If nothing else, this should remind us all that innovation and entrepreneurship aren’t just about technical improvements or better business savvy, but aesthetics, too! The art of commercial culture is like the oxygen we breathe: all around us but something we scarcely notice.

The Wall Street Journal reports today that student loan borrowing for college “in the 2008-09 academic year grew about 25% over the previous year, to $75.1 billion,” with the average student borrowing $13,172 to pay for college. So it should come as an enormous relief that one Internet start-up, StraighterLine, has essentially made the university fully virtual, offering classes for just $99/month. While this may seem like a boon for students, especially the millions of Americans for whom even community college tuition seems an insurmountable obstacle to climbing up the economic ladder, such “e-Learning” offerings are already, predictably, coming under attack by entrenched interests in “Big Ed” (the professoriat!) as the “media-software–publishing–E-learning-complex.”

In Washington Monthly, Kevin Carey explains why “The next generation of online education could be great for students—and catastrophic for universities.” In a nutshell, the story is the same basic theme of Chris Anderson’s book Free!: digital distribution of information will ultimately drive costs down to zero. Carey shows how universities are essentially facing the same sorts of pressure from disruptive innovation as newspapers—except with more capital costs:

Colleges are caught in the same kind of debt-fueled price spiral that just blew up the real estate market. They’re also in the information business in a time when technology is driving down the cost of selling information to record, destabilizing lows.

In combination, these two trends threaten to shake the foundation of the modern university, in much the same way that other seemingly impregnable institutions have been torn apart. In some ways, the upheaval will be a welcome one. Students will benefit enormously from radically lower prices—particularly people like Solvig who lack disposable income and need higher learning to compete in an ever-more treacherous economy. But these huge changes will also seriously threaten the ability of universities to provide all the things beyond teaching on which society depends: science, culture, the transmission of our civilization from one generation to the next.

Whether this transformation is a good or a bad thing is something of a moot point—it’s coming, and sooner than you think.


Interesting piece by Farhad Manjoo of Slate today entitled “So Gmail Was Down. Get Over It.” Manjoo notes that Google’s Gmail service went down briefly this week — for an hour and a half — and that led to a lot of people “freaking out” over the downtime. He asks: “Google’s e-mail service works 99.9 percent of time. Why do we freak out during the other 0.1 percent?”

That’s a good question, but I actually didn’t hear all that many people bitching about it this time around. In fact, I am rather surprised how little I heard about this incident. I think that’s because many of us are gradually growing accustomed to a world in which communications networks and digital devices deliver something less than the holy grail of “five 9s” uptime. That was the standard for telephony and computing in the world I grew up in: 99.999% was the magic number that network engineers aspired to and that many of us in the public generally demanded.
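The gap between those standards is easy to quantify as allowed downtime per year, a quick back-of-the-envelope sketch:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes(availability):
    """Minutes of downtime per year at a given availability fraction."""
    return MINUTES_PER_YEAR * (1 - availability)

print(f"five 9s (99.999%): {downtime_minutes(0.99999):.1f} min/yr")  # ~5.3 minutes
print(f"three 9s (99.9%):  {downtime_minutes(0.999):.1f} min/yr")    # ~525.6 minutes
```

In other words, “three 9s” tolerates nearly nine hours of outage a year, roughly a hundred times what the old telephony standard allowed — which puts Gmail’s ninety-minute hiccup in perspective.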

Today, however, we settle for something less. As Manjoo’s piece about Gmail suggests, we’ll settle for “three 9s,” as in 99.9% reliability. And sometimes we’ll settle for far less than that. Why is that? I think Robert Capps has part of the answer in his recent Wired essay, “The Good Enough Revolution: When Cheap and Simple Is Just Fine.” Capps points out that the modern Digital Age has seen the “triumph of what might be called Good Enough tech. Cheap, fast, simple tools are suddenly everywhere.”

Gotta love The Onion… [Make sure to keep a close eye on the messages on the Twitter pages. And I like the “E-Mom’s” advice to “Just make sure you spell everything wrong and swear a lot” to fool your kids. Great stuff.]


Facebook, Twitter Revolutionizing How Parents Stalk Their College-Aged Kids

Microsoft is making a major push to integrate social networking tools like Facebook and Twitter into its Bing search engine: users will soon be able to “Ping” search results they like to their friends directly from Bing. Back in January, in “Google, the Innovator’s Dilemma and the Future of Search & Web Ads,” I talked about the implications of the history of search (as recounted by the WSJ):

Microsoft missed its opportunities to get into paid search not because it was “dumb,” “uninnovative” or a “bad” company, but for the same sorts of reasons that big, highly successful and even particularly innovative companies fail.  The reasons companies generally succeed in mastering “adaptive” innovation of the technologies behind their established business models are the very reasons why such great companies struggle to encourage or channel the “disruptive” innovation that renders their core technologies and business models obsolete.  This dynamic was described brilliantly in Harvard Business School professor Clayton Christensen’s classic 1997 book The Innovator’s Dilemma:  When New Technologies Cause Great Firms to Fail

Let’s hope that Microsoft—as well as Yahoo!—have carefully studied the vast literature produced by business schools in the wake of Christensen’s book about how big companies can avoid the Innovator’s Dilemma by promoting—and capitalizing on—radical innovation from within.  Indeed, this seems to be precisely what has guided Google’s own strategy as it has grown from “disruptive innovator” to become the very sort of behemoth that cannot easily escape the Dilemma, even if corporate managers are fully aware of the problem on a theoretical level.  If Google can do it, Microsoft should be able to, too.  But let’s also not discount the possibility that, no matter how hard Google’s management might try to retain the innovative culture of a start-up, the giant can’t do that well enough to prevent its own apparent market dominance from being disrupted by new upstart innovators in search and advertising technologies.

My prediction seems to be coming true: Microsoft, with less to lose and without a huge installed user base to worry about annoying by violating Google’s “Prime Directive” of elegant simplicity, may have an easier time introducing “disruptive” innovations to search than Google. Of course, it’s unlikely that any one feature will prove the “killer app” that suddenly causes Bing’s market share to explode—and Google’s to plummet—but a steady stream of such nifty features could convince many users to switch to Bing.

At 29, I’m old enough to remember when Microsoft seemed as cool as Google does today. Hell, I remember being thrilled as a sophomore in high school by Bill Gates’ 1995 book The Road Ahead and the accompanying CD-ROM (which included, as I recall, a tour of Gates’s ultra-futuristic home).  If Microsoft can “get its mojo back,” the company could truly become a web services provider to rival Google.  We’d all benefit from having more choices in search engines, advertising platforms and related tools. And, driving each other to “build a better mousetrap,” the two companies could lead us down the “Road Ahead” from Search 2.0 to Search 3.0 and beyond. So here’s to hoping that Redmond can solve the “Innovator’s Dilemma” with tools like Google’s “20 percent” time that free engineers to innovate!

A number of conservative blogs have picked up on reports that the Obama administration is looking to data mine users on social networking sites. Reports CNS News:

Anyone who posts comments on the White House’s Facebook, MySpace, YouTube and Twitter pages will have their statements captured and permanently archived by the federal government, according to a plan that the White House is now seeking a contractor to carry out.

Whenever government is collecting information about private citizens, we should be concerned. But this controversy smells a lot like privacy fear-mongering, even though it involves government. If you post a comment to an “official” Obama administration page on a social networking site, it seems only natural that it’s fair game for data mining. The same goes if you post a video response on a publicly accessible site.

If you’re posting controversial statements online under your real name for the public to see, what do you expect will happen? Anybody in the world who has an Internet connection can log your postings, so why shouldn’t government officials be able to do the same? Until government starts pressuring Facebook or MySpace to hand over data that’s being collected on an involuntary basis, I don’t see a whole lot here to worry about.

This controversy, and the flap over flag@whitehouse.gov from a few weeks back, raise another interesting question: should Congress reexamine the Presidential Records Act (PRA) of 1978? This is the law that governs Presidential record-keeping. According to some commentators, if the administration solicits data on its critics, it is obligated under the PRA to retain that data indefinitely. I haven’t read the law, but at first glance it appears that it may have some serious deficiencies. This is hardly surprising, of course, given that the Internet — let alone social networks — didn’t even exist when the PRA was enacted in 1978.