At yesterday’s Gov 2.0 Summit, "rogue archivist" Carl Malamud gave a great speech about what’s wrong with government IT and what should be done about it.
"If our government is to do the jobs with which we have entrusted it, … the machinery of our government must first be made to work properly."
Malamud describes a government IT landscape that is a “vast wasteland of contracts that lie fallow inside this beltway” because of agency capture by special interests and proposes three steps to fix government IT:
- Finish the opengov revolution – create and enforce bulk data standards, release more government data using those standards, and update the Freedom of Information Act for the Internet age to require that any data released in response to a FOIA request is also posted online for anyone to access (others have already taken up this cause)
- Create a National Scanning Initiative – Spend at least $250 million per year (a third of what the Smithsonian currently receives from the Federal government) for a decade to put all of the works housed at the Smithsonian, the National Archives, the Library of Congress, the National Library of Medicine, and the Government Printing Office online
- Create a Computer Commission with authority to conduct agency-by-agency reviews and change projects from relying on over-designed custom systems to ones based on open-source building blocks and judicious use of commercial off-the-shelf components
O’Reilly’s Jim Stogdill believes that Malamud’s speech is an implicit recognition that Federal IT projects are just too big for the typical top-down IT development process and the better approach is “structuring incentives, policies, and ecosystems to encourage the complex to emerge from the simple.” This approach is basically the Unix philosophy, which is best summarized as "Design programs to do only a single thing, but to do it well, and to work together well with other programs."
One big problem with most government software projects is that they’re developed without any thought of having those systems interact with other systems. As a result, data formats are typically proprietary, and importing or exporting data is effectively impossible. But if federal IT projects were developed more in line with the Unix philosophy, as smaller, modular, interoperable systems, they would be more manageable, and problems in one component would not jeopardize the others.
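To make the Unix philosophy concrete, here is a minimal sketch using a hypothetical dataset (`spending.csv` and its contents are invented for illustration). Each standard tool does one small job, and because the data lives in an open, line-oriented format, the tools compose freely through pipes:

```shell
# Hypothetical open-format dataset: one record per line, comma-delimited
printf 'agency,contracts\nGPO,12\nNARA,7\nLOC,31\n' > spending.csv

# Each tool does one thing well; the pipe composes them:
#   tail  -> drop the header row
#   sort  -> order numerically by the second field, descending
#   head  -> keep the top record
#   cut   -> extract just the agency name
tail -n +2 spending.csv | sort -t, -k2,2 -nr | head -n 1 | cut -d, -f1
# → LOC
```

None of these tools knows anything about the others, yet together they answer a question no single one was designed for; a proprietary binary format would break the chain at the first pipe.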
And as Stogdill points out, only a few companies are able to deal with the complexity of the Federal Acquisition Regulation and the scale typical of most government projects. Breaking things into smaller components and open-sourcing the code developed on all new projects would enable many more companies to compete for these contracts.
The Obama administration is the first to have a federal Chief Information Officer. I only hope he was listening to Malamud’s speech.