With China’s Internet filtering back in the spotlight, this is as good a time as any to rewatch Clay Shirky’s excellent TED talk on the political implications of the ongoing media revolution—with a fascinating case study of a recent episode in the People’s Republic.
Two points that probably deserve emphasis. The first is that the explosion of user-generated content in one sense makes the control of search engines even more important for a regime that’s trying to limit access to politically inconvenient information. You can block access to Amnesty International, and you can even try to play whack-a-mole with all the mirrors that pop up, but when the ideas you’re trying to suppress can essentially crop up anywhere, a strategy that relies on targeting sites is going to be hopeless. The search engine is a choke point: You can’t block off access to every place where someone might talk about the Tiananmen massacre, but if you can lock down people’s capacity to search for “Tiananmen massacre,” you can do the next best thing, which is to make it very difficult for people to find those places. There are always innumerable workarounds for simple text filters (“Ti@n@nm3n”), but if people are looking for pages, the searchers and the content producers need to converge on the same workaround, by which point the authorities are probably aware of it as well and able to add it to the filter. It’s the same reason people who want to shut down illegal BitTorrent traffic have to focus on the trackers.
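The dynamic is easy to see in miniature. Here’s a toy sketch (illustrative only, not any real filter’s implementation) of the static-blacklist approach and why the workaround race favors the censor once a workaround spreads widely enough to be searchable:

```python
# A toy keyword blacklist of the sort the passage describes.
BLOCKED = {"tiananmen massacre"}

def is_blocked(text: str) -> bool:
    """Naive filter: case-insensitive substring match against the blacklist."""
    t = text.lower()
    return any(term in t for term in BLOCKED)

# The filter catches the plain phrase...
assert is_blocked("Remember the Tiananmen massacre")

# ...but a trivial respelling slips through.
assert not is_blocked("Remember the Ti@n@nm3n massacre")

# The catch: for the workaround to be *findable*, searchers and writers
# must converge on the same spelling -- and once it's that widespread,
# the censor just adds it to the list.
BLOCKED.add("ti@n@nm3n")
assert is_blocked("Remember the Ti@n@nm3n massacre")
```

The asymmetry is the whole point: evading the filter is easy, but evading it *while remaining searchable* hands the censor the same search term you handed your audience.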
The second point, however, is that social media also erodes the value of the search engine as a choke point, because it transforms the community itself into the search engine. For many broad categories of question I might want answered, I will get better information more rapidly by asking Twitter than by asking Google. Marshall McLuhan called media “the extensions of man,” because they amplify and extend the function of our biological nervous systems: The screen as prosthetic eye, the speaker as prosthetic ear, the book or the database as external memory storage. The really radical step is to make our nervous systems extensions of each other—to make man the extension of man. That’s hugely more difficult to filter effectively because it makes the generation of the medium’s content endogenous to the use of the medium. You can ban books on a certain topic because a static object gives you a locus of control; a conversation is a moving target. Hence, as Shirky describes, China just had to shut down Twitter on the Tiananmen anniversary, because there was no feasible way to filter it in real time.
An analogy to public key encryption might be apt here. The classic problem of secure communications was that you needed a secure channel to transmit the key: The process of securing your transmission against attack was itself a point of vulnerability. You had to openly agree to a code before you could start speaking in code. The classic problem of free communication is analogous: the censors can see the method by which you’re attempting to evade censorship. Diffie-Hellman key exchange solves the security problem because an interactive connection between sufficiently smart systems lets you negotiate an idiosyncratic set of session keys without actually transmitting them. A conversation can similarly negotiate its own terms; given sufficient ingenuity, I can make it clear to a savvy listener that I intend for us to discuss Tiananmen in such-and-such a fashion, and the most you can do with any finite set of forbidden terms and phrases is slow the process down slightly.
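For readers who haven’t seen it, the Diffie-Hellman trick is worth sketching, since the whole analogy rests on it. This is a deliberately toy version (small Mersenne prime, illustrative parameters; real deployments use standardized 2048-bit groups or elliptic curves), but the structure is the real thing: both sides end up with the same secret, and an eavesdropper who sees every message never learns it.

```python
import secrets

# Toy public parameters, known to everyone -- including the eavesdropper.
p = 2**127 - 1  # a Mersenne prime (toy-sized for illustration)
g = 3           # public base

# Each party picks a private exponent that never leaves their machine.
a = secrets.randbelow(p - 3) + 2  # Alice's secret
b = secrets.randbelow(p - 3) + 2  # Bob's secret

# Each transmits only g^secret mod p, in the clear.
A = pow(g, a, p)  # Alice -> Bob
B = pow(g, b, p)  # Bob -> Alice

# Each combines the *other's* public value with their *own* secret:
# (g^b)^a = (g^a)^b = g^(ab) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob  # same key, never sent over the wire
```

Recovering the shared key from `A`, `B`, `g`, and `p` alone requires solving the discrete logarithm problem, which is what makes watching the channel useless; the key is a product of the interaction rather than a payload within it. That is exactly the property the passage attributes to conversation.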
This is a big part of why, pace folks like Tim Wu, I’ll still allow myself to get into the spirit of ’96 every now and again. They can, to be sure, resolve to shut down Twitter and try to throw enough people in jail to intimidate folks into “self-discipline,” as they charmingly term it. But the strategies of control available become hugely more costly when the function of the medium is less to connect people with information than to connect them to each other.