Jerry Brito and WCITLeaks co-creator Eli Dourado have a conversation about the recent World Conference on International Telecommunications (WCIT), a UN treaty conference that delved into questions of Internet governance.
In the lead-up to WCIT—which was convened to review the International Telecommunication Regulations (ITRs)—access to preparatory reports and proposed modifications to the ITRs was limited to International Telecommunication Union (ITU) member states and a few other privileged parties. Internet freedom advocates worried that the member states would use WCIT as an opportunity to exert control over the Internet. Frustrated by the lack of transparency, Brito and Dourado created WCITLeaks.org, which publishes leaked ITU documents from anonymous sources.
In December, Dourado traveled to Dubai as a member of the U.S. delegation and got an insider’s view of the politics behind international telecommunications policy. Dourado shares his experiences of the conference, what its failure means for the future of Internet freedom, and why the ITU is not as neutral as it claims.
New York Law School professor James Grimmelmann eulogizes Aaron Swartz, the open information and internet activist who recently committed suicide in the face of a computer trespass prosecution.
Grimmelmann describes Swartz’s journey from “wunderkind prodigy who came out of nowhere when he was 14” to “classic activist-organizer,” paying special attention to the ideas that motivated his work. According to Grimmelmann, Swartz was primarily concerned with power held by the wrong people and with how to overcome it through community organizing. Swartz was dedicated to his personal theory of change and believed that people who know how to use computers have a duty to undermine the closed-access system from within.
It was this ardent belief that led Swartz to surreptitiously download academic articles from JSTOR. Grimmelmann closely analyzes the case, providing a balanced account of both the prosecution’s and Swartz’s views of the issue. Grimmelmann additionally suggests possible policy reforms brought to light by Swartz’s case.
This isn’t anything innovative, but part of my strategy for improving government transparency is to give public recognition to the political leaders who get ahead on transparency and public disapprobation to those who fall behind. So I have a Cato Institute report coming out Monday that assesses how well government data is being published. (Oversight data, that is: reflecting deliberations, management, and results.)
I went ahead and previewed it on the Cato blog last night. The upshot? I find that President Obama lags House Republicans in terms of data transparency.
Neither is producing stellar data, but Congress’s edge is made more acute by the strong transparency promises the president made as a campaigner in 2008, which remain largely unrealized. My pet peeve is that there is no machine-readable government organization chart, not even at the agency and bureau level. The House is showing modest success and promising signs, with some well-structured data at docs.house.gov and good potential at beta.congress.gov.
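To show what I mean by that pet peeve, here is a minimal sketch of what a machine-readable organization chart could look like, down to the bureau level. The structure and identifiers are entirely hypothetical; nothing like this is published officially today, which is the complaint.

```python
# A minimal sketch of a machine-readable federal organization chart.
# All identifiers and structure here are hypothetical illustrations;
# no such official data set exists.

org_chart = {
    "id": "treasury",  # hypothetical stable identifier
    "name": "Department of the Treasury",
    "type": "department",
    "units": [
        {"id": "treasury/irs", "name": "Internal Revenue Service",
         "type": "bureau", "units": []},
        {"id": "treasury/mint", "name": "United States Mint",
         "type": "bureau", "units": []},
    ],
}

def walk(unit, depth=0):
    """Print every unit in the hierarchy; trivial once the data exists."""
    print("  " * depth + f"{unit['name']} ({unit['id']})")
    for child in unit["units"]:
        walk(child, depth + 1)

walk(org_chart)
```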
I hustled to get these grades out before the election, and maybe there are one or two marginal voters whom this study might sway. How it might sway them is an open question, and I’ve had some interesting reactions to the release of the study, such as: Is this electioneering? Shouldn’t there be an assessment of Romney on transparency?
Elections are coming up, and though we’re well into the 21st century, we still can’t vote online. This archived episode discusses the future of voting.
Joseph Hall, Senior Staff Technologist at the Center for Democracy and Technology and a former postdoctoral researcher at the UC Berkeley School of Information, discusses e-voting. Hall explains the often muddled differences between electronic and internet voting and the security concerns raised by each. He also discusses the benefits and costs of different voting systems, the limits of meaningful recounts with digital voting systems, and why internet voting can be a bad idea.
Perry Keller, Senior Lecturer at the Dickson Poon School of Law at King’s College London and author of the recently released paper “Sovereignty and Liberty in the Internet Era,” discusses how the internet affects the relationship between the state and the media. According to Keller, media has played a formative role in the development of the modern state, and as media evolves, the way the state governs must change as well. That does not mean there is a one-size-fits-all solution, however. As Keller demonstrates with real-world examples from the U.S., U.K., E.U., and China, the way new media is governed can differ radically depending on the local legal and cultural environment.
Scott Shackelford, assistant professor of business law and ethics at Indiana University, and author of the soon-to-be-published book Managing Cyber Attacks in International Law, Business, and Relations: In Search of Cyber Peace, explains how polycentric governance could be the answer to modern cybersecurity concerns.
Shackelford originally began researching collective action problems in physical commons, including Antarctica, the deep sea bed, and outer space, where he discovered the efficacy of polycentric governance in addressing these issues. Noting the similarities between these communally owned resources and the Internet, Shackelford was drawn to the idea of polycentric governance as a solution to the collective action problems he identified in the online realm, particularly when it came to cybersecurity.
Shackelford contrasts this bottom-up form of governance, characterized by self-organization and networked regulation at multiple levels, with the increasingly state-centric approach prevailing in forums like the International Telecommunication Union (ITU). Analyzing the debate between Internet sovereignty and Internet freedom through the lens of polycentric regulation, Shackelford reconceptualizes both cybersecurity and the future of Internet governance.
Vinton Cerf, one of the “fathers of the internet,” discusses what he sees as one of the greatest threats to the internet—the encroachment of the United Nations’ International Telecommunication Union (ITU) into the internet realm. ITU member states will meet this December in Dubai to update international telecommunications regulations and consider proposals to regulate the net. Cerf argues that, as the face of telecommunications is changing, the ITU is attempting to justify its continued existence by expanding its mandate to include the internet. Cerf says that the business model of the internet is fundamentally different from that of traditional telecommunications, and as a result, the ITU’s regulatory model will not work. In place of top-down ITU regulation, Cerf suggests that open multi-stakeholder processes and bilateral agreements may be better solutions to the challenges of internet governance.
A small but growing collection of companies has formed a coalition that will push the federal government to establish a standard system by which agencies categorize their data. …
“Our members understand that if the government identified its data elements in consistent ways, there would be vast new opportunities for the tools that they are building,” Executive Director Hudson Hollister said.
Early supporters include Microsoft and data analysis and management firms Level One Technologies, Teradata, and BrightScope. I’m on their Board of Advisors. One of their early priorities will be to pass H.R. 2146, the DATA Act.
(Here’s a nit I can’t help but pick: The Post says the coalition “aims to standardize ‘big data.'” No. It’s just data.)
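To put the coalition’s point in concrete terms, here is a toy illustration with made-up records and field names: when agencies label the same data element inconsistently, every tool needs its own translation layer just to add three numbers together.

```python
# Toy illustration with hypothetical records: three agencies reporting
# the same data element, an award amount, under three different labels.
reports = [
    {"agency": "A", "award_amt": 125000},
    {"agency": "B", "awardAmount": 98000},
    {"agency": "C", "amount_of_award": 43000},
]

# Without a government-wide standard, every data consumer must maintain
# a mapping like this just to compare or total the figures.
ALIASES = ("award_amt", "awardAmount", "amount_of_award")

def award_amount(record):
    """Return the award amount regardless of which label the agency used."""
    for key in ALIASES:
        if key in record:
            return record[key]
    raise KeyError("no recognizable award-amount field")

print(sum(award_amount(r) for r in reports))  # 266000
```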
The Cato Institute’s jobs page has a new posting. If you have the right mix of data/technical skillz, public policy knowledge, love of freedom, and vim, this could be your chance to advance the ball on government transparency! [Added: For more background on Cato’s transparency work, see this and this.]
Data Curator, Center on Information Policy Studies
The Cato Institute seeks a data curator to support its government data transparency program. The successful candidate will perform a variety of functions that translate government documents and activities into semantically rich, machine-readable data. Major duties will include reading legislative documents, marking them up with semantic information, and identifying opportunities for automated identification and extraction of semantic information in documents. The candidate will also oversee the data entry process and train and supervise others to perform data entry. The ideal candidate will have a college degree, preferably in computer science and/or political science, and experience using XML, RDFa, and regular expressions. Attention to detail is a must, and an understanding of U.S. federal legislative, spending, and regulatory processes is preferred.
Applicants should send their resume, cover letter, and a short writing sample to:
Jim Harper
Director of Information Policy Studies
Cato Institute
1000 Massachusetts Ave. NW
Washington, DC 20001
Fax (202) 842-3490
Email: jharper@cato.org
Here’s an exclusive insider tip for TechLiberationFront readers. Don’t send your application by fax! That would send the wrong signal…
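For readers wondering what “marking up legislative documents with semantic information” might look like in practice, here is a minimal sketch. The regular expression and the “cato:billCitation” property are hypothetical illustrations, not Cato’s actual tooling or schema.

```python
import re

# Hypothetical sketch of one duty in the posting: automatically finding
# bill citations (e.g., "H.R. 2146") in text and wrapping them in
# RDFa-style semantic markup. Neither the pattern nor the
# "cato:billCitation" property is Cato's actual schema.
BILL_CITATION = re.compile(r"\b(H\.\s?R\.|S\.)\s?(\d+)\b")

def mark_up(text):
    """Wrap each bill citation in a semantic <span> element."""
    return BILL_CITATION.sub(
        r'<span property="cato:billCitation">\1 \2</span>', text
    )

print(mark_up("One of their early priorities will be to pass H.R. 2146."))
```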
Paying close attention to language can reveal what’s going on in the world around you.
Note the simple but important differences between the phrases “open government” and “open government data.” In the former, the adjective “open” modifies the noun “government.” Hearing the phrase, one would rightly expect a government that’s more open. In the latter, “open” and “government” modify the noun “data.” One would expect the data to be open, but whether the government is open is left unanswered. The data might reveal something about government, making government more open, or it might not.
Recent public policies have stretched the label “open government” to reach any public sector use of [open] technologies. Thus, “open government data” might refer to data that makes the government as a whole more open (that is, more transparent), but might equally well refer to politically neutral public sector disclosures that are easy to reuse, but that may have nothing to do with public accountability.
It’s a worthwhile formal articulation and reminder of a trend I’ve noted in passing once or twice.
There’s nothing wrong with open government data, but the heart of the government transparency effort is getting information about the functioning of government. I think in terms of a subject-matter trio—deliberations, management, and results—data about which makes for a more open, more transparent government. Everything else, while entirely welcome, is just open government data.