Articles by Jerry Brito

Jerry is a senior research fellow at the Mercatus Center at George Mason University, and director of its Technology Policy Program. He also serves as adjunct professor of law at GMU. His web site is jerrybrito.com.


Is Frontline Wireless having a Keyser Söze moment? After convincing the FCC to largely accept its plan for public safety spectrum in the 700 MHz band, the well-connected startup may be saying “poof,” and just like that, be gone. According to Jeffrey Silva in RCR Wireless:

The future of Frontline Wireless L.L.C., the Silicon Valley-backed and politically connected startup that spent months positioning itself to bid big in the upcoming 700 MHz auction, has suddenly become shrouded in mystery. “Frontline is closed for business at this time. We have no further comment,” Frontline said in a statement.

Ars Technica has more, as does MRT Magazine, though not much more, since it’s all pretty mysterious. The most significant potential impact of the news, as MRT notes, is that “If no one bids on the D Block, the spectrum would be returned to the FCC, which could reauction the spectrum with different rules.” That could be a good or a bad thing depending on the new rules.

Anyone know anything else?

Commerce Secretary Carlos Gutierrez issued this statement on Friday:

The TV Converter Coupon Program opened as scheduled on January 1, and is off to a great start. Americans have begun requesting coupons that will help them get the converter boxes needed for when our television signals change on February 17, 2009. With these coupons, the federal government will defray $40 of the cost of an eligible converter, which is expected to cost between $50 and $70. The demand for coupons is strong. We’ve taken requests from every state for nearly 1.9 million coupons from more than one million households.

The demand is strong? Really? For something that’s free? You’re kidding.

Let’s see, 1.9 million coupons requested at $40 a pop is $76 million of taxpayer money out the door in just four days. As Secretary Gutierrez says, “off to a great start” indeed. At this “great” pace it’s good to know the coupon fund totals $1 billion.

What are you waiting for? Get your piece of the American dream here.

In my recent paper on e-transparency and in other forums I’ve been critical of the federal government’s Regulations.gov website for not offering XML feeds. Well, last week the site began offering an RSS feed for the site. You can see it here.

It looks like it’s a feed of every new proposed rule that is added to the site. Each item has a few elements, including a title, a link to the proposed rule’s page on Regulations.gov, a date, and a category that corresponds to the issuing agency. This is a big step in the right direction and I congratulate the folks who are making this happen. There’s a long way to go, though, in making the most of the technology, and I’d like to offer a few suggestions.
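Going by that description, a single item in the feed presumably looks something like this (a hypothetical sketch; the element values are my invention, not taken from the actual feed):

```xml
<item>
  <title>Proposed Rule: Hypothetical Spectrum Allocation</title>
  <link>http://www.regulations.gov/...</link>
  <pubDate>Mon, 07 Jan 2008 12:00:00 EST</pubDate>
  <category>FCC</category>
</item>
```

The category element carrying the agency name is what makes the per-agency filtering discussed below possible.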

First off, the site isn’t yet offering feeds by agency. A feed of all proposed rulemakings in the government is less valuable to me personally than a feed of just FCC rules. On the other hand, one could argue a complete feed is even more valuable because a third party could easily parse out the different agencies and offer individual agency feeds. (Anyone interested in helping a poor, code-impaired guy with a lazyweb request?) Still, individual agency feeds (in addition to a complete feed) should be pretty easy to make available.
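For what it’s worth, the parsing such a third party would do is nearly trivial. Here’s a sketch in Python, assuming each feed item carries the issuing agency in its category element as described above (the sample feed is invented for illustration; a real script would fetch the live feed instead):

```python
import xml.etree.ElementTree as ET

# A toy feed mimicking the structure described above. In practice you
# would fetch the real feed, e.g. with urllib.request.urlopen(...).
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>New Proposed Rules</title>
  <item><title>Rule A</title><category>FCC</category></item>
  <item><title>Rule B</title><category>EPA</category></item>
  <item><title>Rule C</title><category>FCC</category></item>
</channel></rss>"""

def items_for_agency(feed_xml, agency):
    """Return titles of feed items whose category matches the given agency."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if item.findtext("category") == agency]

print(items_for_agency(SAMPLE_FEED, "FCC"))  # ['Rule A', 'Rule C']
```

A third party could run something like this on a schedule and republish each agency’s items as its own feed.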

Second, there is no description element. In your RSS reader all you get is the title and a link. You have to click the feed item to get to the web page that describes the regulation, etc. Why not include the Federal Register notice right in the feed?

Finally, and this is my dream scenario, why not offer a feed for each rulemaking? You could subscribe to a rulemaking and be instantly alerted anytime a new document is filed in its docket. Why not also include the documents themselves as enclosures in the feed?

The “what’s new” section of Regulations.gov (which I can’t link to because the site uses dynamic frames!) says that there is more to come in the next few weeks. It says, “The all-new Regulations.gov 2.0 will be launched shortly featuring a powerful new search engine and a re-designed homepage that makes searching, commenting and accessing other site features quicker and easier.” I sure hope so, and I commend the Regs.gov team for their hard work. I also hope they adopt Google’s sitemap protocol to make keyword searches work from anywhere on the web.
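For reference, Google’s sitemap protocol just asks a site to publish a simple XML file listing its URLs, which crawlers can then index without having to navigate frames. A minimal example (the entries shown are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.regulations.gov/</loc>
    <lastmod>2008-01-07</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

A file like this, generated automatically from the docket database, would go a long way toward making the site’s contents findable from the open web.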

The more government information is available online, and the easier it is to access it, the more accountable we can hold government.

The Senate Homeland Security and Governmental Affairs Committee held a hearing today on “E-Government 2.0: Improving Innovation, Collaboration, and Access.” Written testimony from the witnesses is available here. Because the Senate doesn’t make the audio or video of its hearings available on its own site, I made sure to capture it; it’s available here as an MP3 for your listening pleasure.

The impetus for the hearing is the reauthorization bill for the E-Government Act that, as I wrote about earlier, includes new requirements for federal websites that would make them more easily indexed by commercial search engines such as Google. Joe Lieberman chaired the hearing, and the witnesses were Karen Evans, Administrator of the Office of Electronic Government and Information Technology at OMB; John Needham of Google; Ari Schwartz of CDT; and a clean-shaven Jimmy Wales of Wikipedia. Here are some highlights from the hearing:

Continue reading →

Obama on e-transparency

by Jerry Brito on November 14, 2007

Today Sen. Barack Obama gave a speech at Google where he laid out his tech policy platform. (Platform here in PDF; speech soon available here and here.) There’s much not to like, including a net neutrality regulatory agenda and support for media ownership restrictions, but I’d like to focus on the positive aspect of his speech. In the arena of technology-aided government transparency, Obama laid out a terrific set of ideas that every candidate, Republican or Democrat, should be able to adopt. From his speech:

To seize this moment, we have to use technology to open up our democracy. It’s no coincidence that one of the most secretive Administrations in history has favored special interests and pursued policies that could not stand up to sunlight. As President, I’ll change that. I’ll put government data online in universally accessible formats. I’ll let citizens track federal grants, contracts, earmarks, and lobbyist contacts. I’ll let you participate in government forums, ask questions in real time, offer suggestions that will be reviewed before decisions are made, and let you comment on legislation before it is signed. And to ensure that every government agency is meeting 21st century standards, I’ll appoint the nation’s first Chief Technology Officer. (Emphasis mine.)

I hope whoever becomes president can carry out these technically simple but socially powerful reforms. Mr. Obama doesn’t even have to wait to be president to do something about this. He successfully teamed up with Sen. Tom Coburn to bring us the Federal Funding Accountability and Transparency Act. There’s no reason why he shouldn’t try for an encore with a “government data online in universally accessible formats” bill. Heck, adding one sentence to the E-Government Act reauthorization bill I wrote about yesterday might just do the trick.

Last week, Joe Lieberman and others introduced a bill in the Senate to reauthorize the E-Government Act of 2002. In my new paper about online government transparency I explain how most agencies are likely in compliance with the Act simply by putting their regulatory dockets online, even though those dockets may be largely inaccessible to the public. For example, the FCC’s online docketing system, about which I’ve been griping lately, is probably up to par as far as the Act goes.

The good news is that the reauthorization bill includes an amendment that aims to make federal websites more accessible. It reads in part:

Not later than 1 year after the date of enactment of the E-Government Reauthorization Act of 2007, the Director [of OMB] shall promulgate guidance and best practices to ensure that publicly available online Federal Government information and services are made more accessible to external search capabilities, including commercial and governmental search capabilities. The guidance and best practices shall include guidelines for each agency to test the accessibility of the websites of that agency to external search capabilities. … Effective on and after 2 years after the date of enactment of the E-Government Reauthorization Act of 2007, each agency shall ensure compliance with any guidance promulgated[.]

The purpose of these changes is to make federal sites more easily indexed by commercial search engines, such as Google, which are what most citizens use to find information. Some agencies have begun looking into this already. That is great in itself, but what really interests me here is the notion of “best practices” guidelines with which the agencies must comply. This could be the Trojan Horse that gets XML into federal sites. Once data is available in a structured format, third parties can use it to create different (and likely better) user interfaces for the data, as well as interesting mashups.

I hope OMB will take this opportunity to revamp their e-gov efforts. Regulations.gov, a site they manage along with EPA, does not offer XML. (I’ve talked about this before here.) It also does abysmally on search engines, perhaps because they use outdated frames markup. A quick check shows Google last indexed the site in January. I sincerely hope this kick-starts things.

The brilliant Fake Steve Jobs has a great post on Google’s announcement of its new Open Handset Alliance. You should go read it right now because it’s all priceless, but I love this particular bit about openness:

Finally, has anyone else noticed the way Google is kind of desperately grasping at straws lately? They spend years trying to do something other than search and nothing works. Then, despite their big brains and IQ tests, they get totally blindsided by Facebook and have to gin up this ridiculous OpenSocial thing. Just like with this phone thing, they round up all the losers in that social networking space to form some dumbass alliance. You know how it looks? It looks weak. Companies don’t form alliances and consortia when they’re winning. Also, whenever you see companies start talking about being “open,” it means they’re getting their ass kicked. You think Google will be forming an OpenSearch alliance any time soon, to help also-rans in search get a share of the spoils? Me neither.

I love that Kevin Martin put out a press release (PDF, because the FCC has apparently never heard of HTML) praising the Open Handset Alliance. So will we now see a press release each time a communications company announces vaporware?

The New York Times reports on Attributor, a company tackling the broad re-use of copyrighted material online:

The company has developed software that identifies an electronic “fingerprint” for a particular piece of material — an article, a picture, a video. Then it hunts down any place across the Web where a significant chunk of that work has been copied, with or without permission. When the use is unauthorized, Attributor’s software can automatically send a message to the site’s operators, demanding a link back to the original publisher’s site, a share of revenue from any ads on the page, or a halt to the copying.

No word on whether the software also calculates whether unauthorized uses it finds are nevertheless fair uses. That aside, this sort of searching technology should help placate the fears of content owners over the sort of orphan works legislation I’ve proposed.
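The general technique behind this kind of fingerprinting is presumably something like shingle hashing: break a document into overlapping word sequences, hash each one, and compare the sets of hashes. Here’s a toy sketch of that idea in Python — my illustration of the general approach, not Attributor’s actual algorithm:

```python
import hashlib

def fingerprint(text, k=5):
    """Hash every k-word 'shingle'; the set of hashes is the fingerprint."""
    words = text.lower().split()
    shingles = [" ".join(words[i:i + k]) for i in range(len(words) - k + 1)]
    return {hashlib.md5(s.encode()).hexdigest() for s in shingles}

def overlap(candidate, original):
    """Fraction of the candidate's shingles that also appear in the original."""
    fc, fo = fingerprint(candidate), fingerprint(original)
    return len(fc & fo) / len(fc) if fc else 0.0

original = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "as one blogger put it the quick brown fox jumps over the lazy dog"

# A high overlap score suggests a significant chunk was copied.
print(overlap(copied, original))
```

A crawler would compute fingerprints for pages across the web and flag any page whose overlap with a client’s work crosses some threshold; deciding whether that use is infringing (or fair) is the part no hash set can do for you.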

When Congress delegates its authority to make laws to unelected regulators, a certain bit of accountability is lost. To make up for this, the Administrative Procedure Act requires regulators to act openly and transparently. They must make publicly available the rules they are considering, must take comments from the public, and must consider these in adopting final rules. As I explain in my new paper, making something publicly available in the Twenty-First Century means putting it online. But merely putting documents online is not enough to be truly transparent. The public has to be able to easily find and access the documents and hopefully also be able to use them in the sort of innovative ways the state of the art allows.

In this installment of my series looking at the FCC’s website, we’ll take a look at the Commission’s online docket system. So what’s wrong with it?

Continue reading →

As promised, here is the first in a series of posts looking at the usefulness of the FCC website. Others, including Michael Marcus and Cynthia Brumfield, have already catalogued just how much in disrepair the site is. (In fact, our own James Gattuso blogged today about the FCC site, which prompted me to finally kick off the series.) I’ve had lots of time to think about this while researching my new paper on government transparency and the Internet, Hack, Mash & Peer (http://www.mercatus.org/Publications/pubID.4397/pub_detail.asp), so here’s my contribution to the general piling-on.

First, let’s look at search. Given the ever-increasing amount of data online, search is the web’s killer app. If you can’t find it, it doesn’t matter how much useful data is available online. The FCC offers a search bar at the top left of its site. So what does this box search? According to the FCC site:

Search Scope: The FCC Search Engine searches throughout the FCC’s web site, including the Electronic Document Management System (EDOCS), but does not collect information from the FCC’s other databases and electronic filing systems such as the Electronic Comment Filing System (ECFS). Information is collected from web pages and many types of documents including Word, WordPerfect, Acrobat, Excel, and ASCII Text, and is constantly updated.

Right off the bat this tells us that the FCC houses several disparate databases (eight, according to Brumfield), and that not all of them are searched by its main search box. Most notably, its regulatory docket system (ECFS) is not searched. (More on this in a future post.)

If you search for Kevin Martin, this is what you get:

Continue reading →