Open Standards vs. IBM – Remembering the MicroChannel Architecture

December 9, 2004

Yesterday’s news that IBM sold its PC division to Lenovo got me all sentimental. I worked at IBM’s PC support headquarters in Research Triangle Park, NC for a bit about ten years ago, back when IBM was going through some turbulent times. The future of the mainframe business was definitely looking grim, but its PC division – the future of computing – had already lost its dominance in the PC market and was struggling to sell its OS/2 operating system. And I remember some jaded folks there that blamed IBM’s fall (at least partly) on an obscure proprietary technical feature called MicroChannel (MCA) – an example of how a closed standard can be bad for business.

IBM had reached 40% market share by 1985. But its open (non-IP-protected) architecture meant that the PC was easily “cloned.” According to this site, Compaq was “the first with an 80386-based machine in 1986. IBM attempted to re-establish control over the PC platform in 1987 with a homegrown replacement for the DOS operating system, OS/2, and the introduction of the PS/2 based on the proprietary MicroChannel architecture. Neither had the desired effect. By 1995, IBM’s share had fallen to 7.3% behind Compaq at 10.5%, and in 2003, IBM (6%) was a distant third behind Dell (16.3%) and HP (16.9%).” (What is a bus? It’s the interface between a computer’s CPU and its expansion cards and their associated devices; MCA was a 32-bit bus, while the industry-standard ISA was 16-bit.)

The problem, as this site documents, was that MCA, while technologically superior to the industry standard ISA bus, was not what the market demanded.
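To see what “technologically superior” means in bus terms, peak transfer rate is roughly bus width times clock rate. A minimal sketch of that arithmetic, using the commonly cited clock figures for these buses (ISA about 8.33 MHz, MCA 10 MHz, EISA 8.33 MHz, original PCI 33 MHz; treat the exact numbers as illustrative):

```python
def peak_bandwidth_mb_s(width_bits: int, clock_mhz: float) -> float:
    """Theoretical peak transfer rate: bus width in bytes times clock in MHz."""
    return (width_bits / 8) * clock_mhz

# Width (bits) and clock (MHz) for the buses discussed in the text.
buses = {
    "ISA":  (16, 8.33),
    "MCA":  (32, 10.0),
    "EISA": (32, 8.33),
    "PCI":  (32, 33.0),
}

for name, (width, clock) in buses.items():
    print(f"{name}: {peak_bandwidth_mb_s(width, clock):.1f} MB/s peak")
```

So MCA’s 32-bit path roughly doubled ISA’s theoretical throughput, and the later open buses matched or exceeded it; real-world rates were lower, but the ordering holds.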

“Not only was it proprietary and subject to licensing requirements, but it relied upon an unwieldy and expensive registration process. Anyone marketing an MCA card had to register it with IBM, because each device required a unique ID number and compatibility certification. Nobody could make Micro Channel machines without IBM’s permission, and nobody could make Micro Channel cards without going through IBM for certification and registration.” Here a dominant company came out with a great product, and the market slapped it down for demanding roughly ten percent royalties and an arduous registration process. The industry soon developed EISA (32-bit) and later PCI (32-bit, with a 64-bit extension) – the market responded.

Now software standards are gaining a lot of attention, especially email authentication (the FTC even held a conference on this – see CEI’s comments). Proprietary standards may have a role here, but the market will likewise punish those that go the proprietary route unnecessarily. As an example, the first Windows mail products used the proprietary Microsoft Mail standard instead of implementing the open POP mailbox standard. But the market ignored Microsoft Mail, so Microsoft released Outlook Express, which can use the open POP or IMAP standards.

There are many proponents of government regulation who would prefer government intervention in the standards-setting process, either through direct control at the outset or after the fact via antitrust enforcement. This would add unnecessary and costly political control to the technology development process. Standards setting will never be a perfect process – consensus is hard to reach, and it will proceed more slowly than some would like. But the market works here: companies have incentives to offer even products protected by intellectual property law as open standards. See Microsoft’s Sender ID, a domain-level email authentication scheme (similar to caller ID on phones). Although Microsoft owns the patent on it, the company is offering a royalty-free license – getting rid of spam is good business, it seems, and so are open standards.
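Sender ID builds on the SPF idea: a domain publishes a DNS record listing which machines may send its mail, and receivers check the connecting IP against it. A minimal conceptual sketch, assuming a hypothetical record for example.com (a real verifier fetches the record from DNS and handles many more mechanisms, such as include:, a, mx, and ~all):

```python
import ipaddress

def check_sender(record: str, sender_ip: str) -> str:
    """Return 'pass' if sender_ip falls in an ip4: range listed in the
    record, 'fail' if the record ends in -all, otherwise 'neutral'."""
    ip = ipaddress.ip_address(sender_ip)
    for term in record.split():
        if term.startswith("ip4:"):
            # Authorized sending range published by the domain owner.
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return "pass"
        elif term == "-all":
            # Hard default: anything not matched above is unauthorized.
            return "fail"
    return "neutral"

# Hypothetical SPF-style record published for example.com:
record = "v=spf1 ip4:192.0.2.0/24 -all"
print(check_sender(record, "192.0.2.55"))   # prints 'pass' (authorized host)
print(check_sender(record, "203.0.113.9"))  # prints 'fail' (spoofed source)
```

The “caller ID” analogy holds up: the check doesn’t judge the message’s content, only whether the machine delivering it is one the claimed domain has vouched for.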
