Ryan Radia brought to my attention this excellent Slate piece by Vaughan Bell entitled, “Don’t Touch That Dial! A History of Media Technology Scares, from the Printing Press to Facebook.” It touches on many of the themes I’ve discussed here in my essays on techno-panics, fears about information overload, and the broader optimist vs. pessimist battle throughout history regarding the impact of new technologies on culture, life, and learning. “These concerns stretch back to the birth of literacy itself,” Bell rightly notes:
Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain. From a historical perspective, what strikes home is not the evolution of these social concerns, but their similarity from one century to the next, to the point where they arrive anew with little having changed except the label.
Quite right. And Bell’s essay reminds us of this gem from the great Douglas Adams about how bad we humans are at putting technological change in perspective:
Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. Anything invented after you’re thirty-five is against the natural order of things.
So true, and I wish I had remembered it before I wrapped up my discussion of “adventure windows” in the review of Jaron Lanier’s new book, You Are Not a Gadget, which I published last night. As I noted in that essay:
Our willingness to try new things and experiment with new forms of culture—our “adventure window”—fades rapidly after certain key points in life, as we gradually get set in our ways. Many cultural critics and average folk alike always seem to think the best days are behind us and the current good-for-nothing generation and their new-fangled gadgets and culture are garbage.
Perhaps most important is Bell’s indictment of the science, or complete lack thereof, behind the chicken-littleism:
These fears have also appeared in feature articles for more serious publications: Nicholas Carr’s influential article “Is Google Making Us Stupid?” for the Atlantic suggested the Internet was sapping our attention and stunting our reasoning; the Times of London article “Warning: brain overload” said digital technology is damaging our ability to empathize; and a piece in the New York Times titled “The Lure of Data: Is It Addictive?” raised the question of whether technology could be causing attention deficit disorder. All of these pieces have one thing in common—they mention not one study on how digital technology is affecting the mind and brain. They tell anecdotes about people who believe they can no longer concentrate, talk to scientists doing peripherally related work, and that’s it.

Imagine if the situation in Afghanistan were discussed in a similar way. You could write 4,000 words for a major media outlet without ever mentioning a relevant fact about the war. Instead, you’d base your thesis on the opinions of your friends and the guy down the street who works in the kebab shop. He’s actually from Turkey, but it’s all the same, though, isn’t it?

There is, in fact, a host of research that directly tackles these issues. To date, studies suggest there is no consistent evidence that the Internet causes mental problems. If anything, the data show that people who use social networking sites actually tend to have better offline social lives, while those who play computer games are better than nongamers at absorbing and reacting to information with no loss of accuracy or increased impulsiveness.