In a New York Times op-ed this weekend titled “You Can’t Say That on the Internet,” Evgeny Morozov, author of The Net Delusion, worries that Silicon Valley is imposing a “deeply conservative” “new prudishness” on modern society. The cause, he says, is “dour, one-dimensional algorithms, the mathematical constructs that automatically determine the limits of what is culturally acceptable.” He proposes that some form of external algorithmic auditing be undertaken to counter this supposed problem. Here’s how he puts it in the conclusion of his essay:
Quaint prudishness, excessive enforcement of copyright, unneeded damage to our reputations: algorithmic gatekeeping is exacting a high toll on our public life. Instead of treating algorithms as a natural, objective reflection of reality, we must take them apart and closely examine each line of code.
Can we do it without hurting Silicon Valley’s business model? The world of finance, facing a similar problem, offers a clue. After several disasters caused by algorithmic trading earlier this year, authorities in Hong Kong and Australia drafted proposals to establish regular independent audits of the design, development and modifications of computer systems used in such trades. Why couldn’t auditors do the same to Google?
Silicon Valley wouldn’t have to disclose its proprietary algorithms, only share them with the auditors. A drastic measure? Perhaps. But it’s one that is proportional to the growing clout technology companies have in reshaping not only our economy but also our culture.
It should be noted that in a Slate essay this past January, Morozov had also proposed that steps be taken to root out lies, deceptions, and conspiracy theories on the Internet. Morozov was particularly worried about “denialists of global warming or benefits of vaccination,” but he also wondered how we might deal with 9/11 conspiracy theorists, the anti-Darwinian intelligent design movement, and those who refuse to accept the link between HIV and AIDS.
To deal with that supposed problem, he recommended that Google “come up with a database of disputed claims” to weed out such things. The other option, he suggested, “is to nudge search engines to take more responsibility for their index and exercise a heavier curatorial control in presenting search results for issues” that someone (he never says who) determines to be conspiratorial or anti-scientific in nature.
Taken together, these essays can be viewed as a preliminary sketch of what could become a comprehensive information control apparatus instituted at the code layer of the Internet.