Tim’s thoughtful analysis of the slow adoption of the IPv6 protocol turned my mind to a long-standing topic of interest: the illusory value of elegance in technology. A corollary: In technology, as in life, revolutions are rightly rare and usually only visible in hindsight.
The IPv6 transition is a good example of the difference between policy and implementation. This transition raises all sorts of broad policy issues, given its potential costs and the potential for disruption. For certain kinds of network applications, carefully tuned to use existing Internet infrastructure, the transition will be difficult and costly. In some cases, things may just stop working. At the least, those who work on Internet applications and infrastructure will have to learn all the minute details of the new system and its implementation, a surprisingly deep pool of knowledge, while their IPv4 experience fades into irrelevance. These are no small things.
When naive engineers (and those who think like them) drive policy, their recommendations are often to scrap existing systems and start anew with something more elegant, something that eliminates “cruft” and the like. It’s a fun engineering task to go back to first principles and start over with what we know now that we may not have known when creating earlier standards. It is a rewarding intellectual exercise.
But “muddling through,” as in other domains, is often the best choice in tech policy.
A few examples:
- Windows is nothing if not a prime example of muddling through. While many techies are drawn to more elegant systems (e.g., BeOS and Linux, which I would argue is only marginally more elegant than Windows as it is commonly used), most businesses and users just want something that works, something that causes the least disruption.
- The original MacOS far outlived any technical justification for its myriad shortcomings and yet was a key part of the successful transition to the more elegant OS X. The need to muddle through undoubtedly made the road to a new Mac OS (including the Copland detour) far lengthier and more complicated than it otherwise would have been. But the benefits were also great. The first release of Rhapsody, for example, was terrible; OS X 10.2 (“Jaguar”) marked the beginning of a period of excellence. Jaguar’s key difference from Rhapsody? Lots and lots of integrated legacy code and features.
- As Tim points out, the continued reign of the x86 processor family in the face of more elegant RISC designs outraged tech-heads for years. As so often happens, though, as technology evolved, x86 grew to embrace RISC designs, which now, in effect, lie at the heart of the platform. So the engineers were right, in a sense, that CISC was a dead end over the long run, but scrapping its most popular implementation and starting over from a blank slate wasn’t the best solution either.
- Another wrinkle to the processor story: Intel and HP, you may recall, pumped billions of dollars into the ill-fated Itanium platform, designed to be an out-and-out replacement for x86, piggybacking on the “inevitable” transition to 64-bit computing. While the chips had an x86 compatibility mode, it was never on par with actual x86 chips. AMD, meanwhile, bolted simple 64-bit capabilities atop the existing x86 instruction set, changing things as little as possible. That design is now dominant, and Itanium (dubbed “Itanic” by the tech press) is at best a niche product for Intel.
- QWERTY. The debate over whether QWERTY is a benefit over Dvorak-style keyboard layouts still rages after decades of disagreement. The evidence is unclear as to which is faster, but Slashdot and other tech sites brim with Dvorak adherents certain of the superiority of their layout. Accepting their claims arguendo: Dvorak, while perhaps the better engineering solution, is a terrible human solution. Many of us can barely touch-type in QWERTY as it is; being forced to learn another system, even a marginally superior one, would be a disaster.
- HTTP: It is a stateless protocol and, for a variety of reasons, is particularly ill-suited to applications like streaming audio and video. And yet. It passes through firewalls unmolested and benefits from a huge variety of excellent implementations in more or less every development language and environment.
- The granddaddy of them all: The C programming language. No one should have to program in C. It’s worth repeating: No one should have to program in C. For your first several years programming in C, shooting yourself in the foot, repeatedly, is inevitable. Your programs will break; your code will become unmaintainable, requiring frequent refactoring; compiler upgrades will wreak havoc on more or less everything; you will expose serious security holes; and corner cases you have not considered will rear their heads long after your code has entered production. Higher-level languages are easier, faster to develop in, include trigger locks, and run fast enough on nearly all modern hardware. But people still program in C. Why? There’s an enormous amount of highly tuned, well-designed C code extant that has survived the test of time. C gets you close (enough) to the metal for when you need to eke out every last bit of performance. C works consistently across basically every platform. And a lot of people know how to program in C. No one starting from a blank slate would create this snake of a language, but given how things have evolved, we cannot, and should not, ditch it.
A few more examples that I will name but not discuss: the original Netscape codebase, GIFs, POP3, serial ports, VGA, FTP, quirky HTML, the DOC file format and its WMF cruft (though MS is trying real hard), MP3 encoding, etc. The list goes on and on. Readers should add any that occur to them in the comments.
Edmund Burke was understandably wary of policies based on abstract ideas rather than experience. This differs only a bit, perhaps, from Lawrence Lessig’s “code is law.” What engineers see as cruft is often wisdom, the fruit of evolution, and likely to be better suited to human needs than revolutionary approaches. Perhaps this accounts for much of the success of open-source software, which I and others have long argued is rarely innovative but rather iterative. That is not, necessarily, a dismissal or a slight, only a positivist description.
As much as it is disdained in the left-ish policy world and in technology circles, muddling through is frequently the best, most efficient approach. It is a modest approach, one devoted to solving real-world problems and standing on the shoulders of those who solved previous problems. It is not to be dismissed so blithely by those peddling revolutionary solutions.