Book Review: Kevin Kelly’s What Technology Wants

November 7, 2010

It’s appropriate that Kevin Kelly’s new book, What Technology Wants, was published in the same year as Jaron Lanier’s You Are Not a Gadget.  Although Kelly and Lanier are on opposite sides of the Internet optimist vs. pessimist divide, they come at the issue of technology’s impact on society in thoughtful, but at times quite controversial, ways.   I found both books to be remarkably interesting, but also, at times, deeply troubling.

For example, Lanier’s book, which I reviewed here in January, contained an excellent critique of the extreme varieties of quixotic techno-utopianism, which he labels “cybernetic totalism.” Lanier was taking on the belief by some extreme digital age optimists that a “hive mind” or “Noosphere” is coming about. He made a strong case for appreciating individuality and stressed caution when it comes to embracing technology in an over-zealous or quasi-religious fashion.  But Lanier’s critique was too sweeping and his worldview too morose.  He unfairly indicts the entire digital generation and wrongly claims most modern culture is moribund and little more than “a petty mashup of preweb culture.”

Kelly’s What Technology Wants is basically You Are Not a Gadget in reverse.  Kelly does a nice job placing modern technological advances in a more reasonable context, but he is also guilty of some of that kooky “noosphere” thinking Lanier nicely critiqued in his book.

Pining for Life in The Matrix?

My reservations with What Technology Wants lie mostly in its bookends, which strive to prove that “the technium” – “the greater, global, massively interconnected system of technology vibrating around us” (p. 11) — is a “force” or even a living “organism” (p. 198) that has a “vital spirit” (p. 41) and which “has its own wants” (p. 15) and “a noticeable measure of autonomy.” (p. 13) “The technium is whispering to itself,” he says. (p. 14)  At times, Kelly even seems to be longing for humanity’s assimilation into the machine or The Matrix. “We can think of technology as our extended body,” he says. (p. 44)  He speaks repeatedly of human-machine “symbiosis.” “We are now symbiotic with technology” (p. 37)  and, apparently, that symbiotic bonding can get pretty intense as “humans are the reproductive organs of technology.” (p. 296)  Sounds a little kinky, but what the hell does that even mean?

I’m not going to sugarcoat what I think of these theories.  Balderdash is one word that comes to mind.  Much of what Kelly sputters in the opening and closing sections of the book sounds like quasi-religious kookiness by a High Lord of the Noosphere.  It’s a bit like the enviro-extremists who proselytize about Gaia theories.  Nutty stuff.

Occasionally Kelly steps back and asks: “Aren’t I assigning way too much consciousness to inanimate objects, and by doing so giving them more power over us than they have, or should have?” (p. 15)  Uh, yes, Kevin!  And when he makes comments about machines like, “standing between it and its power outlets, you can clearly feel its want,” (p. 17) it makes me want to go re-read Daniel H. Wilson’s entertaining little book, How to Survive a Robot Uprising!  In Kelly’s view, Skynet is self-aware, or at least gradually on its way to sentience, and that’s not necessarily such a bad thing. Resistance is futile.

Beautiful Redemption: Kelly’s Well-Reasoned Case for Technological Experimentation & Change

But what’s so peculiar about Kelly’s book is how the meat of it – Parts 2 and 3 – largely ignores the theories advanced in the bookends of Parts 1 and 4.  Instead, in the middle, Kelly pens a magisterial treatment of the practical – and inevitable – advance of technology.  Had Kelly published the eight chapters in those two sections as a stand-alone volume, I would be prepared to say it was my favorite book about technology since Gilder’s Microcosm or Pool’s Technologies of Freedom.

What Kelly presents there is an extraordinarily balanced, and decidedly non-kooky, view of technology and technological change.  I simply cannot recommend those chapters (especially chapters 10 & 12) in strong enough terms.  Gone is most of the Gaia-like talk of the technium as a living organism.  Kelly instead focuses on explaining to us in plain terms the progression of technology in our lives and how we’ve come to cope with it.  He notes, for example, that:

Over the centuries, societies have declared many technologies to be dangerous, economically upsetting, immoral, unwise, or simply too unknown for our good. The remedy to this perceived evil is usually a form of prohibition.  The offending innovation may be taxed severely or legislated to narrow purposes or restricted to the outskirts or banned altogether. (p. 240)

But banning technology never works, Kelly argues, largely because humans adapt and embrace new tools and developments. “[H]istory shows that it is very hard for a society as a whole to say no to technology for very long.” (p. 241) “Prohibitions are in effect postponements” and  “wholesale prohibitions simply do not work to eliminate a technology that is considered subversive or morally wrong.  Technologies can be postponed but not stopped.” (p. 243)

Importantly, Kelly doesn’t turn a blind eye to the downsides of technology.  In fact, he is refreshingly candid about the trade-offs we face:

If we examine technologies honestly, each one has its faults as well as its virtues. There are no technologies without vices and none that are neutral.  The consequences of a technology expand with its disruptive nature.  Powerful technologies will be powerful in both directions – for good and bad.  There is no powerfully constructive technology that is not also powerfully destructive in another direction, just as there is no great idea that cannot be greatly perverted for great harm. …  This should be the first law of technological expectation: The greater the promise of a new technology, the greater its potential for harm as well. (p. 246)

Quite right.  Kelly then goes on to masterfully discuss the dangers of applying the “precautionary principle” to technological advancement.  The “precautionary principle,” you will recall, basically states that since every technological advance poses a danger or risk, we should demand that proponents of that change prove no harm will come from it before allowing it to go forward.  The problem with that logic, Kelly correctly argues, is that because “every good produces harm somewhere… by the strict logic of an absolute Precautionary Principle no technologies would be permitted.” (p. 247-8)   Under such a regime, progress becomes impossible because trade-offs are considered unacceptable.  Of course, it doesn’t help that “when it comes to risk aversion, we are not rational,” Kelly notes. (p. 248)

In its effort to be “safe rather than sorry,” precaution becomes myopic. It tends to maximize only one value: safety.  Safety trumps innovation.  The safest thing to do is to perfect what works and never try anything that could fail, because failure is inherently unsafe. … In general the Precautionary Principle is biased against anything new. (p. 249-50)

“This is exactly the wrong thing to do,” Kelly goes on to argue. “These technologies are inevitable. And they will cause some degree of harm. …  Yet their most important consequences — both positive and negative — won’t be visible for generations.” (p. 261)

Thus, we must learn to “count on uncertainty” and appreciate the benefits of ongoing experimentation and evolutionary dynamism.  “Even though we’ve learned to expect unintended consequences from every innovation, the particular unintended consequences are rarely foreseen,” he notes.  “Because of the inherent uncertainties in any model, laboratory, simulation, or test, the only reliable way to assess a new technology is to let it run in place.” (p. 251)

This doesn’t mean humans shouldn’t try to foresee problems associated with new technologies or address them preemptively. But that can be done without resisting new technologies or technological change altogether. “The proper response to a lousy technology is not to stop technology or to produce no technology,” Kelly argues. “It is to develop a better, more convivial technology.” (p. 263)

Kelly’s formulation is remarkably similar to the “bad speech / more speech principle” from the field of First Amendment policy and jurisprudence.  That principle states that the best solution to the problem of bad speech (such as hate speech or seditious talk) is more speech to counter it, not censorship.  That’s the same principle Kelly wants us to embrace when it comes to technology: Don’t seek to ban or restrict it; find ways to embrace it, soften its blow, or counter it with new and better technology.

I think that’s a beautiful principle and I applaud Kevin Kelly’s formulation and defense of it.

Conclusion: Tear Off the Bookends & You’ll Have a Great Book!

What, then, are we to make of the two books Kelly has penned here?   I remain extremely torn.  I feel that the opening and closing portions of What Technology Wants are almost too silly to be taken seriously.  Yet, the meat in the middle is absolutely beautiful, inspiring, and enlightening stuff.

So, here’s what you do:  Wait for the paperback version of What Technology Wants to be released (since it’ll be far easier for you to rip apart), and then tear off pages 1-70 and 270-360. What you’ll be left with is a terrific 200-page book that I can wholeheartedly recommend!

Despite its flaws, however, Kevin Kelly’s What Technology Wants is easily one of the most important information technology policy books of 2010 and will likely be in my top 3 when I compile my next “best info-tech books” list.  We’ll be talking about What Technology Wants for many years to come.

__________

[Note: In a forthcoming post, I’ll have a bit more to say about Kelly’s book and views as I contrast his thinking with Ted Kaczynski (aka “the Unabomber”) and show how Kelly and Kaczynski help us define the extremes of the Internet optimist vs. pessimist spectrum. In the meantime, I encourage you to buy Kelly’s remarkably interesting book, visit his excellent website for more views and discussion about the book, and then listen to this podcast that Kelly did with our own Jerry Brito.  It’s a terrifically interesting conversation about a remarkably interesting book.]

  • Tim Wu

    Despite its flaws, I absolutely loved What Technology Wants. The man is deep, and writes from a kind of lived experience that is not easy to replicate.

  • Larry

    Adam, for your on-going optimist v. pessimist virtual debate, consider Star Trek: The Next Generation. The Federation represents technology optimism; the Borg represent technology pessimism.

  • Anne Collier (netfamilynews.org)

    Precisely what we're seeing in a subset of this debate, the online safety field, as you well know, Adam: “Over the centuries, societies have declared many technologies to be dangerous, economically upsetting, immoral, unwise, or simply too unknown for our good. The remedy to this perceived evil is usually a form of prohibition.” We're also seeing the myopia and, not surprisingly, given who the “beneficiaries” of precaution are, an even greater resistance to uncertainty. And what I see here, in this field, more than anywhere is that the precaution not only feeds on itself but, I would argue, increases young people's risk by banning social media at home and in school and leaving kids on their own right when and where, as USC Prof. Henry Jenkins argues, they could use some of the life literacy having adults in their experiences can provide. I agree with you and Kelly that we need not to try to ban or restrict social media but “find ways to embrace it, soften its blow, or counter it with new and better technology” and parenting and use of technology in education for the very reason that young people find it so compelling and are – according to Jenkins and other social-media researchers – learning so much in their use of it. I'm no utopian about kids & tech (I'm a parent of teens), but I'd like to see a little less fear of the uncertainty that combination represents – for my and all kids' sake.

  • Pingback: The 10 Most Important Info-Tech Policy Books of 2010