Driverless Cars, Privacy & Security: Event Video & Talking Points

October 20, 2014

Last week, it was my pleasure to speak at a Cato Institute event on “The End of Transit and the Beginning of the New Mobility: Policy Implications of Self-Driving Cars.” I followed Cato Institute Senior Fellow Randal O’Toole and Marc Scribner, a Research Fellow at the Competitive Enterprise Institute. They provided a broad and quite excellent overview of all the major issues at play in the debate over driverless cars. I highly recommend you read the excellent papers that Randal and Marc have published on these issues.

My role on the panel was to do a deeper dive into the privacy and security implications of not just the autonomous vehicles of our future, but also the intelligent vehicle technologies of the present. I discussed these issues in greater detail in my recent Mercatus Center working paper, “Removing Roadblocks to Intelligent Vehicles and Driverless Cars,” which was co-authored with Ryan Hagemann. (That article will appear in a forthcoming edition of the Wake Forest Journal of Law & Policy.)  I’ve embedded the video of the event down below (my remarks begin at the 38:15 mark) as well as my speaking notes. Again, please consult the longer paper for details.


The privacy & security implications of self-driving cars are already driving public policy concerns because of the amount of data they collect. Here are a few things we should keep in mind as we consider new regulations for these technologies:

1)      Security & privacy are relative concepts with amorphous boundaries

  • Not everyone places the same value on security & privacy; these judgments are highly subjective
  • Some people are hyper-cautious about security or hyper-sensitive about their privacy; others are risk-takers or are just somewhat indifferent (or pragmatic) about these things

2)      Security & privacy norms can and often do evolve very rapidly over time

  • With highly disruptive technologies, we tend to panic first but then move to a new plateau with new ethical and legal baselines
  • [I’ve written about this in my recent law review articles on privacy and security]
  • The familiar cycle at work: initial resistance, gradual adaptation, eventual assimilation
  • This was true for the first cars a century ago; true today as well

3)      For almost every perceived privacy or security harm, there is a corresponding consumer benefit that may outweigh the feared harm

  • We see this reality at work with the broader Internet & we will see it at work with intelligent vehicles
  • Ex: Compare vehicle telematics to locational tracking technologies for smartphones
  • In both contexts, locational tracking raises rather obvious privacy considerations
  • But locational tracking also offers many benefits, and some services (e.g., real-time traffic data) could not exist without it
  • “tracking” concerns may dissipate for cars as they did for smartphones (but not evaporate!)

4)      As it pertains to intelligent vehicle technologies, today’s security & privacy concerns are not the same as yesterday’s, and they will not be the same as tomorrow’s either.

  • Today’s “intelligent vehicle” technology privacy issues may be more concerning than tomorrow’s issues for fully autonomous vehicles
  • today’s on-board EDRs & telematics may cause more privacy concerns for us as drivers than tomorrow’s technologies
  • ex: concerns about tailored insurance & automated law enforcement
  • That may lead to some privacy concerns in the short-term (or fears of “discrimination”)
  • BUT… What happens when cars are no longer a final good but merely a service for hire? (i.e., What happens when we combine Sharing Economy w/ self-driving cars?)
  • Car of future = robotic chauffeur (like Uber + Zip Car)
  • Old privacy concerns will evolve rapidly; security likely to become bigger concern

5)      To be successful, any security & privacy solutions must take these realities into account, and those solutions must also accommodate the need to balance many different values and interests simultaneously.

  • There are no silver bullet solutions to privacy & security problems
  • + it will be difficult for law to keep up with pace of innovations
  • Therefore, we need a flexible, “layered approach” with many different solutions

We need “simple rules for a complex world” (Richard Epstein)

  • Contracts / enforce Terms of Service
  • Common law / torts / products liability
  • see excellent new Brookings paper by John Villasenor: “when confronted with new, often complex, questions involving products liability, courts have generally gotten things right. . . . Products liability law has been highly adaptive to the many new technologies that have emerged in recent decades, and it will be quite capable of adapting to emerging autonomous vehicle technologies as the need arises.”
  • liability norms & insurance standards will evolve rapidly as cars move from final good to service
  • “least-cost avoider” implications (the more you know, the more responsible you become)

Privacy & Security “by design” (“Baking-in” best practices)

  • Data collection minimization
  • Limit sharing w 3rd parties
  • Transparency about all data collection and use practices
  • Clear consent for new uses
  • see Future of Privacy Forum best practices for intelligent vehicle tech providers
  • this is already happening (GAO report noted 10 smart car tech makers already doing so)
  • Hopefully some firms compete on privacy & exceed these standards for those who want it
  • And hopefully privacy & security advocates develop tools to better safeguard these values, again for those who want more protection

 Query: But shouldn’t there be some minimal standards? Federal or state regulation?

  • Things are moving too quickly; hard for law to keep pace w/o limiting innovation opportunities
  • The flexible approach and methods I just listed are better suited to evolve with the cases and controversies that pop up along the way
  • it is better to utilize a “wait and see” strategy & see if serious & persistent problems develop that require regulatory remedies; don’t lead with preemptive, precautionary controls
  • “permissionless innovation” should remain our default policy position
  • Ongoing experimentation should be permitted not just with technology in general, but also with privacy and security solutions and standards
  • In sum… avoid One Size Fits All solutions

6)      Special consideration should be paid to government actions that affect user privacy

  • Whereas many of the privacy and security concerns involving private data collection can be handled using the methods discussed previously, governmental data collection raises different issues
  • Private entities cannot fine, tax, or imprison us since they lack the coercive powers governments possess.
  • Moreover, although it is possible to ignore or refuse to be a part of various private services, the same is not true for governments, whose grasp cannot be evaded.
  • Thus, special protections are needed against misuse of these technologies by law enforcement agencies and officials.
  • When government seeks access to privately-held data collected from these technologies, strong constitutional and statutory protections should apply.
  • We need stronger 4th Amendment constraints
  • Courts should revisit the “third-party doctrine,” which holds that individuals sacrifice their Fourth Amendment interest in their personal information when they divulge it to a third party, even if that party has promised to safeguard the data.
