One of the more puzzling changes in Apple’s newly released Mac OS X Snow Leopard operating system is that it now reports file sizes and storage capacity in base 10 units instead of base 2 units.

Until now, operating systems have always displayed file sizes in base 2 units. When measured this way, a gigabyte is 1024³ bytes (1,073,741,824 bytes).

But when measured in base 10 units, a gigabyte is taken to mean 10⁹ bytes (1,000,000,000 bytes). It’s not surprising that hard drive manufacturers generally make a practice of slapping a base 10 measurement on the outside of the product box, presumably to make the total number of gigabytes (or terabytes) appear larger.

Apple’s switch to base 10 measurement was obviously an attempt to put an end to consumer confusion. It probably seemed like an easy way to eliminate pesky calls to customer support from users wondering why the 250 GB hard drive on their new MacBook was showing less than 250 GB total capacity when measured from inside the operating system.
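For anyone who wants to check the math, here is a quick back-of-the-envelope sketch in Python (illustrative only, not anything from Apple’s code) of why a drive sold as 250 GB reports only about 233 GB when the operating system counts in base 2:

```python
# Illustrative arithmetic only: the same drive measured with two different rulers.
advertised_gb = 250
total_bytes = advertised_gb * 10**9        # 250,000,000,000 bytes, as on the box

base2_gb = total_bytes / 1024**3           # divide by 1,073,741,824
print(f"{advertised_gb} GB (base 10) is about {base2_gb:.1f} GB in base 2 units")
# prints: 250 GB (base 10) is about 232.8 GB in base 2 units
```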

In fact, the consumer confusion resulting from the hardware and software industries’ inconsistency of usage has been a problem for years, even resulting in a number of class-action lawsuits.

There have been attempts to deal with the problem. Over a decade ago, the International Electrotechnical Commission created a number of new binary prefixes in IEC 60027-2. Under this system, 1024³ bytes is a gibibyte (or GiB). While this might seem like an elegant solution to an engineer or an etymologist, it fails to make things any clearer for most consumers.
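To see how the two conventions label the very same number of bytes, here is a rough sketch (the helper names are mine, not part of IEC 60027-2 or any operating system):

```python
def si_label(n_bytes: float) -> str:
    """Base-10 (SI) prefixes: kB, MB, GB, TB (what drive makers advertise)."""
    for unit in ("bytes", "kB", "MB", "GB", "TB"):
        if n_bytes < 1000 or unit == "TB":
            return f"{n_bytes:.1f} {unit}"
        n_bytes /= 1000

def iec_label(n_bytes: float) -> str:
    """Base-2 (IEC 60027-2) prefixes: KiB, MiB, GiB, TiB."""
    for unit in ("bytes", "KiB", "MiB", "GiB", "TiB"):
        if n_bytes < 1024 or unit == "TiB":
            return f"{n_bytes:.1f} {unit}"
        n_bytes /= 1024

size = 250 * 10**9            # a "250 GB" drive
print(si_label(size))         # 250.0 GB
print(iec_label(size))        # 232.8 GiB
```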

There’s no easy solution to the problem, but it’s clear that all this nonsense should have been avoided in the first place.

I blame Congress.

While they have spent their time debating and legislating all manner of things clearly outside the scope of their constitutional authority, they have neglected their simple enumerated obligations. Article I, Section 8 of the Constitution gives Congress the power “…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures…”

They dropped the ball.

Polish designer Jacek Utko acknowledges that, in the long run, nothing can save the newspaper as a print medium, but he makes a pretty good case that newspapers’ ability to stay afloat while they figure out how to make the transition to digital media depends heavily on shaking up the graphic design and layout of papers.

If nothing else, this should remind us all that innovation and entrepreneurship aren’t just about technical improvements or better business savvy, but about aesthetics, too! The art of commercial culture is like the oxygen we breathe: all around us but something we scarcely notice.

The Wall Street Journal reports today that student loan borrowing for college “in the 2008-09 academic year grew about 25% over the previous year, to $75.1 billion,” with the average student borrowing $13,172 to pay for college. So it should come as an enormous relief that one Internet start-up, StraighterLine, has essentially made the university fully virtual, offering classes for just $99/month. While this may seem like a boon for students, especially the millions of Americans for whom even community college tuition seems an insurmountable obstacle to climbing the economic ladder, such “e-Learning” offerings are already, predictably, coming under attack from entrenched interests in “Big Ed” (the professoriat!), who deride them as the “media-software–publishing–E-learning-complex.”

In Washington Monthly, Kevin Carey explains why “The next generation of online education could be great for students—and catastrophic for universities.” In a nutshell, the story is the same basic theme of Chris Anderson’s book Free!: digital distribution of information will ultimately drive costs down to zero. Carey shows how universities are essentially facing the same sorts of pressure from disruptive innovation as newspapers—except with more capital costs:

Colleges are caught in the same kind of debt-fueled price spiral that just blew up the real estate market. They’re also in the information business in a time when technology is driving down the cost of selling information to record, destabilizing lows. In combination, these two trends threaten to shake the foundation of the modern university, in much the same way that other seemingly impregnable institutions have been torn apart. In some ways, the upheaval will be a welcome one. Students will benefit enormously from radically lower prices—particularly people like Solvig who lack disposable income and need higher learning to compete in an ever-more treacherous economy. But these huge changes will also seriously threaten the ability of universities to provide all the things beyond teaching on which society depends: science, culture, the transmission of our civilization from one generation to the next. Whether this transformation is a good or a bad thing is something of a moot point—it’s coming, and sooner than you think.


Interesting piece by Farhad Manjoo of Slate today entitled “So Gmail Was Down. Get Over It.” Manjoo notes that Google’s Gmail service went down briefly this week — for an hour and a half — and that led to a lot of people “freaking out” over the downtime. He asks: “Google’s e-mail service works 99.9 percent of the time. Why do we freak out during the other 0.1 percent?”

That’s a good question, but I actually didn’t hear all that many people bitching about it this time around. In fact, I am rather surprised how little I heard about this incident. I think that’s because many of us are gradually growing accustomed to a world in which communications networks and digital devices deliver something less than the holy grail of “five 9s” uptime. That was the standard for telephony and computing in the world I grew up in: 99.999% was the magic number that network engineers aspired to and that many of us in the public generally demanded.

Today, however, we settle for something less. As Manjoo’s piece about Gmail suggests, we’ll settle for “three 9s,” as in 99.9% reliability. And sometimes we’ll settle for far less than that. Why is that? I think Robert Capps has part of the answer in his recent Wired essay, “The Good Enough Revolution: When Cheap and Simple Is Just Fine.” Capps points out that the modern Digital Age has seen the “triumph of what might be called Good Enough tech. Cheap, fast, simple tools are suddenly everywhere.”
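As a back-of-the-envelope check (my own arithmetic, not Manjoo’s or Capps’s), here is roughly how much downtime each of those standards actually allows over a year:

```python
# Rough arithmetic: annual downtime permitted by each uptime target.
HOURS_PER_YEAR = 365 * 24   # 8,760 hours

for label, uptime in [("five 9s", 0.99999), ("three 9s", 0.999)]:
    downtime = HOURS_PER_YEAR * (1 - uptime)
    print(f"{label}: {uptime:.3%} uptime allows about {downtime:.1f} hours down per year")

# five 9s: 99.999% uptime allows about 0.1 hours down per year (roughly 5 minutes)
# three 9s: 99.900% uptime allows about 8.8 hours down per year
# A 90-minute Gmail outage fits easily within a "three 9s" budget.
```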

Gotta love The Onion… [Make sure to keep a close eye on the messages on the Twitter pages. And I like the “E-Mom’s” advice to “Just make sure you spell everything wrong and swear a lot” to fool your kids. Great stuff.]

Facebook, Twitter Revolutionizing How Parents Stalk Their College-Aged Kids

Microsoft is making a major push to integrate social networking tools like Facebook and Twitter into its Bing search engine: users will soon be able to “Ping” search results they like to their friends directly from Bing. Back in January, in “Google, the Innovator’s Dilemma and the Future of Search & Web Ads,” I talked about the implications of this history of search (from the WSJ):

Microsoft missed its opportunities to get into paid search not because it was “dumb,” “uninnovative” or a “bad” company, but for the same sorts of reasons that big, highly successful and even particularly innovative companies fail. The reasons companies generally succeed in mastering “adaptive” innovation of the technologies behind their established business models are the very reasons why such great companies struggle to encourage or channel the “disruptive” innovation that renders their core technologies and business models obsolete. This dynamic was described brilliantly in Harvard Business School professor Clayton Christensen’s classic 1997 book The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail… Let’s hope that Microsoft—as well as Yahoo!—has carefully studied the vast literature produced by business schools in the wake of Christensen’s book about how big companies can avoid the Innovator’s Dilemma by promoting—and capitalizing on—radical innovation from within. Indeed, this seems to be precisely what has guided Google’s own strategy as it has grown from “disruptive innovator” to become the very sort of behemoth that cannot easily escape the Dilemma, even if corporate managers are fully aware of the problem on a theoretical level. If Google can do it, Microsoft should be able to, too. But let’s also not discount the possibility that, no matter how hard Google’s management might try to retain the innovative culture of a start-up, the giant can’t do that well enough to prevent its own apparent market dominance from being disrupted by new upstart innovators in search and advertising technologies.

My prediction seems to be coming true: Microsoft, with less to lose and without a huge installed user base to worry about annoying by violating Google’s “Prime Directive” of elegant simplicity, may have an easier time introducing “disruptive” innovations to search than Google. Of course, it’s unlikely that any one feature will prove the “killer app” that suddenly causes Bing’s market share to explode—and Google’s to plummet—but a steady stream of such nifty features could convince many users to switch to Bing.

At 29, I’m old enough to remember when Microsoft seemed as cool as Google does today. Hell, I remember being thrilled as a sophomore in high school by Bill Gates’ 1995 book The Road Ahead and the accompanying CD-ROM (which included, as I recall, a tour of Gates’s ultra-futuristic home).  If Microsoft can “get its mojo back,” the company could truly become a web services provider to rival Google.  We’d all benefit from having more choices in search engines, advertising platforms and related tools. And, driving each other to “build a better mousetrap,” the two companies could lead us down the “Road Ahead” from Search 2.0 to Search 3.0 and beyond. So here’s to hoping that Redmond can solve the “Innovator’s Dilemma” with tools like Google’s “20 percent” time that free engineers to innovate!

A number of conservative blogs have picked up on reports that the Obama administration is looking to data mine users on social networking sites. Reports CNS News:

Anyone who posts comments on the White House’s Facebook, MySpace, YouTube and Twitter pages will have their statements captured and permanently archived by the federal government, according to a plan that the White House is now seeking a contractor to carry out.

Whenever government is collecting information about private citizens, we should be concerned. But this controversy smells a lot like privacy fear-mongering, even though it involves government. If you post a comment to an “official” Obama administration page on a social networking site, it seems only natural that it’s fair game for data mining. The same goes if you post a video response on a publicly accessible site.

If you’re posting controversial statements online under your real name for the public to see, what do you expect will happen? Anybody in the world who has an Internet connection can log your postings, so why shouldn’t government officials be able to do the same? Until government starts pressuring Facebook or MySpace to hand over data that’s being collected on an involuntary basis, I don’t see a whole lot here to worry about.

This controversy, and the flap over flag@whitehouse.gov from a few weeks back, raise another interesting question: should Congress reexamine the Presidential Records Act (PRA) of 1978? This is the law that governs Presidential record-keeping. According to some commentators, if the administration solicits data on its critics, it is obligated under the PRA to retain that data indefinitely. I haven’t read the law, but at first glance it appears that it may have some serious deficiencies. This is hardly surprising, of course, given that the Internet — let alone social networks — didn’t even exist when the PRA was enacted in 1978.

Berin has already done a fine job tearing apart this latest effort by 10 activist groups to break the Internet by imposing burdensome regulation or punishing legal liability on Internet operators for the crime of trying to deliver relevant advertising, which is what actually pays for the content and services given away to users for free. To that, I would add my deep disappointment that the Electronic Frontier Foundation (EFF) chose to join this cabal. After all, the other members of the coalition are frequently heard calling for regulation of one variety or another. But EFF has always prided itself on avoiding online regulatory schemes, or so it claims. That’s what makes it so surprising that they chose to jump on this bandwagon for an Internet industrial policy in the name of “protecting privacy.”

EFF’s embrace of regulation is particularly inconsistent given their excellent filing in the FCC’s “Child Safe Viewing Act” proceeding this summer. As I’ve previously noted, this proceeding raises the specter of “convergence-era content regulation,” with Congress authorizing the FCC to look into “advanced blocking controls” for “wired, wireless, and Internet” platforms. EFF’s comments rightly stressed the dangers of expanded content controls or Internet regulation, and noted the many “less-restrictive means” available to the public that provide compelling alternatives to government regulation: “Blocking technologies are widely available in the market and do not require further government support.” And EFF has been instrumental over the years in making the case in courts for applying the less-restrictive means test and strict scrutiny when it comes to government efforts to regulate speech.

Why, then, does EFF take the diametrically opposite position when privacy concerns enter the picture?

A few gems from George Gilder’s 1989 masterpiece Microcosm: The Quantum Revolution in Economics and Technology as I work my way through the book:

Predatory Pricing: Gilder details how early microchip manufacturers created wholly new markets and put Say’s Law into action, with supply creating its own demand. Not only did these companies introduce new technologies, but they created demand by slashing the prices of those technologies by multiple orders of magnitude (10-10,000x) even before they figured out how to lower production costs enough to make even a small profit. While such practices would later give rise to charges of “predatory pricing” and “dumping,” Gilder explains:

Selling below cost is the crux of all enterprise.  It regularly transforms expensive and cumbersome luxuries into elegant mass products.  It has been the genius of American industry since the era when Rockefeller and Carnegie radically reduced the prices of oil and steel. (122)

The Learning Curve: Gilder explains the dynamic by which prices drop so consistently in innovative new industries:

Early in the life of a product, uncertainty afflicts every part of the process. An unstable process means energy use per unit will be at its height. Both fuels and materials are wasted. High informational entropy in the process also produces high physical entropy. The benefits of the learning curve largely reflect the replacement of uncertainty with knowledge. The result can be a production process using less materials, less fuel, less reworks, narrower tolerances, and less supervision, overcoming entropy of all forms with information. This curve, in all its implications, is the fundamental law of economic growth and progress. (125)
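Gilder’s point here is about information displacing entropy, but for readers who want a number to hang on it, the textbook quantitative form of a learning curve (Wright’s experience-curve formula, offered as my own illustration rather than anything from Microcosm) looks like this:

```python
import math

def unit_cost(first_unit_cost: float, cumulative_units: float, progress_ratio: float) -> float:
    """Wright's experience curve: every doubling of cumulative output multiplies
    unit cost by progress_ratio (e.g. 0.7 means a 30% cost drop per doubling)."""
    return first_unit_cost * cumulative_units ** math.log2(progress_ratio)

# Illustrative numbers only: a $100 first unit on an aggressive 70% curve.
for n in (1, 10, 1_000, 1_000_000):
    print(f"unit {n:,}: ${unit_cost(100.0, n, 0.70):.2f}")

# unit 1: $100.00
# unit 10: $30.58
# unit 1,000: $2.86
# unit 1,000,000: $0.08   (the kind of multi-order-of-magnitude drop Gilder describes)
```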

The Curve of the Mind: Gilder explains the broader implications of the Learning Curve for the competitiveness of the market economy, and how easily yesterday’s giants can become tomorrow’s easy prey.

I ponder Canadian health care and directions for U.S. reform on the Convergence Law Institute Blog here.