E-Government & Transparency

Last week, Joe Lieberman and others introduced a bill in the Senate to reauthorize the E-Government Act of 2002. In my new paper on online government transparency I explain how most agencies can likely comply with the Act simply by putting their regulatory dockets online, even though those dockets may remain largely inaccessible to the public. For example, the FCC’s online docketing system, about which I’ve been griping lately, is probably up to par as far as the Act goes.

The good news is that the reauthorization bill includes an amendment that aims to make federal websites more accessible. It reads in part:

Not later than 1 year after the date of enactment of the E-Government Reauthorization Act of 2007, the Director [of OMB] shall promulgate guidance and best practices to ensure that publicly available online Federal Government information and services are made more accessible to external search capabilities, including commercial and governmental search capabilities. The guidance and best practices shall include guidelines for each agency to test the accessibility of the websites of that agency to external search capabilities. … Effective on and after 2 years after the date of enactment of the E-Government Reauthorization Act of 2007, each agency shall ensure compliance with any guidance promulgated[.]

The purpose of these changes is to make federal sites more easily indexed by commercial search engines, such as Google, which are what most citizens use to find information. Some agencies have already begun looking into this. That is great in itself, but what really interests me here is the notion of “best practices” guidelines with which agencies must comply. This could be the Trojan horse that gets XML into federal sites. Once data is available in a structured format, third parties can use it to create different (and likely better) user interfaces for the data, as well as interesting mashups.
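To make that point concrete, here is a minimal sketch of the kind of reuse structured data would enable. The XML schema, docket number, filer names, and URLs below are all invented for illustration; no agency publishes dockets in exactly this format, but any consistent structure would let a third party build its own views of the data with a few lines of code.

```python
# A hypothetical, minimal example of consuming structured docket data.
# The schema and every value in it are made up for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_DOCKET = """\
<docket id="07-52" agency="FCC">
  <title>Broadband Industry Practices</title>
  <filing date="2007-06-15" type="comment">
    <filer>Example Advocacy Group</filer>
    <url>http://example.gov/dockets/07-52/filings/1234.pdf</url>
  </filing>
  <filing date="2007-07-02" type="reply-comment">
    <filer>Example Carrier, Inc.</filer>
    <url>http://example.gov/dockets/07-52/filings/5678.pdf</url>
  </filing>
</docket>
"""

def summarize(xml_text):
    """Print a simple alternative view of a docket: its filings in date order."""
    docket = ET.fromstring(xml_text)
    print(docket.get("agency"), docket.get("id"), "-", docket.findtext("title"))
    for filing in sorted(docket.findall("filing"), key=lambda f: f.get("date")):
        print(f'  {filing.get("date")}  {filing.get("type"):<15} '
              f'{filing.findtext("filer")}')

if __name__ == "__main__":
    summarize(SAMPLE_DOCKET)
```

Nothing about this is sophisticated, and that is the point: once the format is predictable, building better interfaces and mashups on top of it becomes trivial.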

I hope OMB will take this opportunity to revamp its e-government efforts. Regulations.gov, a site it manages along with EPA, does not offer XML. (I’ve talked about this before here.) It also does abysmally on search engines, perhaps because it is built with outdated frames markup. A quick check shows Google last indexed the site in January. I sincerely hope this kick-starts things.
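To see why frames are such a problem for search, consider what a crawler finds when a site’s top-level page is nothing but a frameset: almost no indexable text and few, if any, crawlable links. The sketch below is only a rough illustration of that idea, not a description of how Google or any other engine actually crawls, and the commented-out call refers to the frames-based site as it existed in 2007.

```python
# A rough, illustrative check of how little content a frames-based front page
# exposes to a crawler. Not a model of any real search engine's behavior.
import urllib.request
from html.parser import HTMLParser

class FramesetDetector(HTMLParser):
    """Count <frame> tags and visible text characters in an HTML page."""

    def __init__(self):
        super().__init__()
        self.frame_srcs = []
        self.text_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag == "frame":
            self.frame_srcs.append(dict(attrs).get("src", ""))

    def handle_data(self, data):
        self.text_chars += len(data.strip())

def check(url):
    html = urllib.request.urlopen(url, timeout=30).read().decode("utf-8", "replace")
    parser = FramesetDetector()
    parser.feed(html)
    print(f"{url}: {len(parser.frame_srcs)} frame(s), "
          f"{parser.text_chars} characters of indexable text on the top page")

# check("http://www.regulations.gov/")  # the 2007-era, frames-based site
```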

When Congress delegates its lawmaking authority to unelected regulators, a certain amount of accountability is lost. To make up for this, the Administrative Procedure Act requires regulators to act openly and transparently. They must make the rules they are considering publicly available, must take comments from the public, and must consider those comments when adopting final rules. As I explain in my new paper, making something publicly available in the twenty-first century means putting it online. But merely putting documents online is not enough to be truly transparent. The public has to be able to find and access the documents easily, and ideally to use them in the sort of innovative ways the state of the art allows.

In this installment of my series looking at the FCC’s website, we’ll take a look at the Commission’s online docket system. So what’s wrong with it?

Continue reading →

As promised, here is the first in a series of posts looking at the usefulness of the FCC website. Others, including Michael Marcus and Cynthia Brumfield, have already catalogued just how badly the site has fallen into disrepair. (In fact, our own James Gattuso blogged today about the FCC site, which prompted me to finally kick off the series.) I’ve had lots of time to think about this while researching my new paper on government transparency and the Internet, so here’s my contribution to the general piling-on.

First, let’s look at search. Given the ever-increasing amount of data online, search is the web’s killer app: it doesn’t matter how much useful data is available if no one can find it. The FCC offers a search bar at the top left of its site. So what does this box search? According to the FCC site:

Search Scope: The FCC Search Engine searches throughout the FCC’s web site, including the Electronic Document Management System (EDOCS), but does not collect information from the FCC’s other databases and electronic filing systems such as the Electronic Comment Filing System (ECFS). Information is collected from web pages and many types of documents including Word, WordPerfect, Acrobat, Excel, and ASCII Text, and is constantly updated.

Right off the bat this tells us that the FCC houses several disparate databases (eight, according to Brumfield), and that not all of them are searched by its main search box. Most notably, its regulatory docket system (ECFS) is not searched. (More on this in a future post.)

If you search for Kevin Martin, this is what you get:

Continue reading →

Regulations.gov, the federal government’s centralized regulatory docketing system that I examine in my new transparency paper, recently won an award from Government Computer News for “combining vision and IT innovations with an attention to detail and a willingness to collaborate.” The result of that award-winning combination, however, is not impressing everyone. A few days later the Congressional Research Service issued a report cataloguing the site’s shortcomings. (Another great dissection of Regulations.gov came from BNA, which reported that “Cornell students studying human-computer interaction, when asked to evaluate the E-Rulemaking Web site’s public interface in early 2006, rated it ‘absolutely horrific[.]'”)

What strikes me is that a product many consider unsatisfactory is being hailed as a success. Despite the hard work that many civil servants no doubt expended trying to make Regulations.gov a useful site, one has to admit it is confusing and difficult to use. OMB often cites increased traffic in its reports to Congress (PDF) as a measure of success, and increased web traffic was also mentioned in the GCN story about the award.

Looking at traffic, however, is tallying output, not outcomes; measuring activity, not results. One could conceivably build a website so unnavigable that it results in the number of “web hits” quadrupling because users have such a hard time finding what they need or because they have to click through many links before getting to what they want. Also, a total traffic number is difficult to judge. Are 150 million “hits” a good thing? Relative to what? Who knows.

Instead, what I’d like to know is whether Regulations.gov is making it easier for citizens to find and comment on regulatory proceedings. I see from the site’s “What’s New” section (I’d link to it, but I can’t because the site uses 1990s-style frames technology) that they conduct a regular “customer satisfaction survey.” I’d like to see those results published on the web. That sounds to me like a much better measurement of the site’s effectiveness.

Continue reading →

If you’d like to get a flavor for the sort of impact that one (tenacious) citizen can have on making government data more transparent, check out this Google Tech Talk by one of my personal heroes, Carl Malamud. (I write about his exploits in my new paper on online transparency.) He talks about cajoling the ITU into putting its standards online and forcing the SEC to put its public information online, and about his new project to live-stream and archive video of all congressional and agency hearings in Washington. He’s a real inspiration.

I’ve been laboring for a few months on a paper about government transparency on the Internet, and I’m happy to say that it’s now available as a working paper. In it I show that a lot of government information that is supposed to be publicly available is only nominally so, because it’s not online. When data does make it online, it’s often useless; it’s as if the .gov domain had a prohibition on XML and reliable search.

First I look at independent third parties (such as GovTrack.us) that are doing yeoman’s work picking up the slack where government fails and making data available online in flexible formats. Then I look at still other third parties that are taking the liberated data and using it in mashups (such as MAPLight.org) and crowdsourcing (such as our own Jim Harper’s WashingtonWatch.com). Mashups of government data help highlight otherwise hidden connections, and crowdsourcing makes light work of sifting through mountains of data. If I may corrupt Eric Raymond’s Linus’s Law for a moment: “Given enough eyeballs, all corruption is shallow.” In the coming days I plan to write a bunch more on how online tools can shed light on government, including a series dissecting the FCC’s website (not for the squeamish).
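To give a flavor of what such a mashup does, here is a toy sketch that joins two tiny, invented datasets, one of bills and their sponsors and one of campaign contributions, to flag connections neither dataset shows on its own. Every record, name, and number below is hypothetical; real projects such as MAPLight.org work from far richer data and do this far more carefully.

```python
# A toy "mashup": cross-reference bills with contributions to their sponsors.
# All data here is invented purely for illustration.
bills = [
    {"bill": "H.R. 1000", "topic": "telecom", "sponsor": "Rep. A"},
    {"bill": "H.R. 2000", "topic": "energy",  "sponsor": "Rep. B"},
]

contributions = [
    {"recipient": "Rep. A", "industry": "telecom", "amount": 50_000},
    {"recipient": "Rep. B", "industry": "health",  "amount": 10_000},
]

def sponsor_industry_overlap(bills, contributions):
    """Yield bills whose sponsor took money from the industry the bill affects."""
    by_recipient = {}
    for c in contributions:
        by_recipient.setdefault(c["recipient"], []).append(c)
    for b in bills:
        for c in by_recipient.get(b["sponsor"], []):
            if c["industry"] == b["topic"]:
                yield b["bill"], b["sponsor"], c["amount"]

for bill, sponsor, amount in sponsor_industry_overlap(bills, contributions):
    print(f"{bill}: sponsor {sponsor} received ${amount:,} from the affected industry")
```

The joining logic is trivial; the hard part, and the reason the paper dwells on formats, is getting both datasets out of government systems in a form a program can read at all.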

I believe opening up government to online scrutiny is immensely important. If we’re going to hold government accountable for its actions, we need to know what those actions are. The Sunlight Foundation has been doing fantastic work on this front and I would encourage you to visit them and especially their Open House Project blog. I would also encourage you to send me any comments you might have on my paper as I’m still perfecting it before I submit it to journals.

ITIF E-voting Report

September 19, 2007

Ars reports on a new ITIF report critiquing the push for voter-verified paper trails in electronic voting. It’s a good summary overall, but I found this part of it wanting in a couple of respects:

Castro makes his arguments against paper trails largely by ignoring the different role that paper has played (and may again play) during balloting. Currently, paper-only balloting is being suggested as a stop-gap solution for situations where the alternative is the use of electronic systems with recognized flaws; very few propose paper as a long-term solution. But a substantial fraction of the report is dedicated to enumerating the flaws of all-paper systems. Meanwhile, it attempts to use those same flaws to paint any attempts at using paper in any context—including cases where paper would create a supplemental record in electronic voting—as being equally flawed.

The report’s approach to people who oppose electronic voting systems is equally clumsy. A lot of the opposition to electronic voting is not focused on the concept itself, but rather some of the clearly flawed implementations of these systems. Instead of recognizing this distinction, Castro simply paints opponents as paranoid luddites: “Many opponents of electronic voting machines are motivated by a distrust of technology, anger at election results, and conspiracy theories about voting companies.” That sort of language pervades the report; concerns regarding the independence and robustness of voting machine validation apparently doesn’t exist. In Castro’s mind, the only opposition results from ill-informed paranoia: “Because some people do not understand that voting machines must undergo independent testing, they fear that a voting machine may steal their vote.”

I don’t know of any rigorous polling on the subject, but I think it’s an overstatement to say that “very few propose paper as a long-term solution.” Personally, I think it’s at the very least an open question whether e-voting will ever be secure enough to be trusted, even with a paper trail. A paper trail is certainly a big step in the right direction, and there are strong arguments for making computerized ballot-marking machines available to the disabled. But given the large dangers and small benefits of e-voting, I think it’s a mistake to assume that paper is just a stopgap solution.

Relatedly, I think it’s a little bit misleading to say that e-voting critics are against “clearly flawed implementations” rather than e-voting itself. E-voting critics—including those like Ed Felten and Avi Rubin who support e-voting with a paper trail—emphasize that paperless e-voting fails for fundamental, systematic reasons. They advocate paper trails because they believe we can’t depend on the correctness of software systems that will always be vulnerable to hacking. I guess in some sense the lack of a paper trail is a “flawed implementation,” but I think it makes more sense to say that paperless e-voting is inherently insecure.

With that said, Ars is certainly right that it’s silly to paint e-voting critics as paranoid cranks. Ed Felten has a good post that ends thus:

The real worst-case scenario isn’t divergent paper and electronic records — with their attendant litigation and political discord. The real worst case is an attack or error that never even comes to the attention of election officials or the public, because there isn’t an independent way of catching problems.

That’s exactly right.

Holt Bill Pushed Back Again

September 8, 2007

I can’t say I’m too disappointed that a House vote on the Holt bill has been pushed back once again. Apparently the proximate cause was two Democrats on the rules committee—which normally votes along party lines—bucking the leadership and threatening to vote against bringing the legislation to the floor unless further changes were made. It’s becoming increasingly clear that new rules won’t be ready in time for 2008, which means our focus should be on getting the rules right in 2010 and 2012. And although the Holt bill is a step in the right direction, it certainly leaves substantial room for improvement. Hence, I found the comments of the dissident committee members reassuring:

Slaughter quickly indicated she didn’t like the bill, and raised questions about the quality of the new paper ballot machines.

“I am very much concerned that we are passing this law that you have to have it by a certain date,” Slaughter said during the hearing, “when experts tell us there is not a machine that will do this right.”

In an interview, Slaughter said New York election authorities would have trouble getting equipment to replace their lever-pull machines in time for the deadline mandated in the bill.

She wasn’t the only one to express concerns. Rep. Alcee Hastings, a Democrat from Florida, said the bill didn’t go far enough.

“I need to be persuaded. Otherwise I would do something that I have not done since I have been here, and that is vote against a proposed rule,” Hastings said, according to a transcript. “If we ain’t gonna fix it all, then we oughtn’t fix something that ain’t a fix and is not an assurance that we have done the best we can. This isn’t good enough for me.”

These are precisely the questions House members ought to be asking: are these deadlines feasible, and will this legislation fix the problem or will it require the next Congress to come back and deal with the problem yet again? The obvious compromise is to completely drop the new requirements for 2008 in exchange for more robust requirements (e.g. source code disclosure and no thermal printers) in 2010 and beyond. I don’t know if that’s where Slaughter and Hastings are headed, but at least they’re asking some good questions.

E-Voting Guidelines

September 6, 2007

Threat Level reports that the Election Assistance Commission will soon be collecting comments on the latest draft of new e-voting security guidelines.

When Schools Matter

September 5, 2007

The other thing to say about Paul Graham’s essay is that success at founding startups seems like almost the worst possible metric for judging the value of an Ivy League education. At least the way Graham tells it, to succeed at a startup you have to be reasonably smart, extremely dedicated, and willing to break a few rules in order to find a new way of doing things. Intelligence is useful for getting into a good college as well, but the other major criteria are almost exactly the opposite. To get into an Ivy League school, you need to be good at following rules, sucking up to grown-ups, and performing activities that look good on your resume whether or not you’re actually interested in them. Getting into an elite school is also helped by access to good tutors, career counselors, and test-prep coaching, and in some cases by parents willing to make five-figure donations to their alma maters. The sort of intense, deep, and sustained interest in a single subject that is essential to success at a startup is hard to convey on a college application or a resume.

Now, the thing is, the skills required to get into an elite school actually are useful in a lot of high-status careers. Becoming a good doctor, for example, involves achieving proficiency in a lot of different aspects of medical practice. You don’t really care whether your doctor is capable of devoting months of intense effort to a hard technical problem, as long as he can correctly diagnose your condition and competently administer the remedy.

So I think the question of whether someone’s Ivy League background matters is largely a function of what qualities you’re looking for. An Ivy League degree is a good signal for the kinds of qualities that allow people to get admitted to Ivy League schools. If you’re in a business in which those qualities aren’t important, as Graham is, then it obviously doesn’t make sense to pay attention to where someone went to school.