Planning for Hypothetical Horribles in Tech Policy Debates

August 6, 2013

In a recent essay here, “On the Line between Technology Ethics vs. Technology Policy,” I made the argument that “We cannot possibly plan for all the ‘bad butterfly-effects’ that might occur, and attempts to do so will result in significant sacrifices in terms of social and economic liberty.” It was a response to a problem I see at work in many tech policy debates today: With increasing regularity, scholars, activists, and policymakers are conjuring up a seemingly endless parade of horribles that will befall humanity unless “steps are taken” to preemptively head off all the hypothetical harms they can imagine. (This week’s latest examples involve the two hottest technopanic topics du jour: the Internet of Things and commercial delivery drones. Fear and loathing, and plenty of “threat inflation,” are on vivid display.)

I’ve written about this phenomenon at even greater length in my recent law review article, “Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle,” as well as in two lengthy blog posts asking the questions, “Who Really Believes in ‘Permissionless Innovation’?” and “What Does It Mean to ‘Have a Conversation’ about a New Technology?” The key point I try to get across in those essays is that letting such “precautionary principle” thinking guide policy poses a serious threat to technological progress, economic entrepreneurialism, social adaptation, and long-run prosperity. If public policy is guided at every turn by the precautionary mindset, then innovation becomes impossible because of fear of the unknown; hypothetical worst-case scenarios trump all other considerations. Social learning and economic opportunities become far less likely under such a regime. In practical terms, it means fewer services, lower quality goods, higher prices, diminished economic growth, and a decline in the overall standard of living.

Indeed, if we live in constant fear of the future and become paralyzed by every boogeyman scenario that our creative little heads can conjure up, then we’re bound to end up looking as silly as this classic 2005 parody from The Onion, “Everything That Can Go Wrong Listed.” It joked that “A worldwide consortium of scientists, mathematicians, and philosophers is nearing the completion of the ambitious, decade-long project of cataloging everything that can go wrong.” The goal of the project was to create a “catalog of every possible unfortunate scenario” such that “every hazardous possibility will be known to man.” Here is the hilarious fake snippet from the imaginary page 55,623 of the project:

[Image: snippet of The Onion’s list of everything that can go wrong]

I loved the story’s concluding quote from obviously fake Popular Science writer Brian Dyce, who said:

“Within a decade, laypeople might be able to log onto the Internet or go to their public library and consult volumes listing the myriad things that could go wrong,” Dyce said. “It could prove a very valuable research tool or preventative stopgap. For example, if you’re shopping for a car, you can prepare yourself by boning up on the 98,627 bad things that could happen during the purchasing process. This project could have deep repercussions on the way people make decisions, and also the amount of time they spend locked in their bedrooms.”

So, in the spirit of keeping people locked in their bedrooms, cowering in fear of hypothetical horribles, I have started a list of things we must all live in fear of and plan for! (I actually pulled most of these from articles and essays in my Evernote files that I tagged with the words “fear,” “panic,” and “dread.” I have collected more things than I can count.) Anyway, please feel free to add your own suggestions down below in the comments.

  • Without beefed-up cybersecurity regulations, we’ll face an “electronic Pearl Harbor.”
  • Without pervasive NSA & law enforcement snooping, we face “the next 9/11.”
  • An unfiltered Internet experience will lead the next generation to become nymphomaniacs and sex-starved freaks.
  • Social networking sites are a “predators’ playground” where sex perverts prey on children.
  • Twitter and texting will lead to the end of reading and/or long-form writing.
  • Personalized digital services will lead to an online echo-chamber (“filter bubbles”) and potentially even the death of deliberative democracy.
  • Robots are going to take all our jobs and then turn us into their slaves.
  • 3D printing will destroy manufacturing jobs and innovation.
  • Strong crypto will just let the bad guys hide their secrets and nefarious plots from us.
  • Bitcoin will just lead to every teenager buying illegal drugs online.
  • Hackers will hijack my car’s electronic systems and force it to drive off a bridge with me inside.
  • Hackers are just going to remotely hack all those new medical devices I might use and give me a heart attack or aneurysm.
  • Hackers are just going to remotely hack my home and all its “smart devices” and then shut down all my stuff or spy on me.
  • Geolocation technology is only going to empower perverts and stalkers to harass women.
  • Targeted online ads just brainwash us into buying things we don’t need and will lead to massive discrimination.
  • Big Data and the “quantified self” movement are just going to lead to massive social and economic discrimination.
  • Violent video games are teaching our kids to be killers and will lead to a massive spike in murders and violent crime.
  • Facebook is a “monopoly” and “public utility” from which there is no escape if you want to have an online existence.
  • Google Glass will mean everybody will just take pictures of me naked in the gym locker room.
  • Wearable technology will lead to a massive peer-to-peer Panopticon.
  • Commercial drones are going to fall from the sky and kill us (if they don’t zap us with lasers or death rays first).

Hey, it could all happen, right?! Therefore, as The Onion proposed, we must “catalog every possible unfortunate scenario” such that “every hazardous possibility will be known to man” and then plan, plan, PLAN, P-L-A-N accordingly!

Alternatively, we could realize that, again and again, humans have shown the remarkable ability to gradually adapt to new technologies and assimilate them into their lives through trial-and-error experimentation, the evolution of norms, and the development of coping mechanisms. It’s called resiliency. It happens. We live, we learn, we move on.
