Existential Risk & Emerging Technology Governance

August 5, 2020

“The world should think better about catastrophic and existential risks.” So says a new feature essay in The Economist. Indeed it should, and that includes existential risks associated with emerging technologies.

The primary focus of my research these days is broad-based governance trends for emerging technologies. In particular, I have spent the last few years attempting to better understand how and why “soft law” techniques have been tapped to fill governance gaps. As I noted in this recent post compiling my writing on the topic:

soft law refers to informal, collaborative, and constantly evolving governance mechanisms that differ from hard law in that they lack the same degree of enforceability. Soft law builds upon and operates in the shadow of hard law. But soft law lacks the same degree of formality that hard law possesses. Despite many shortcomings and criticisms, compared with hard law, soft law can be more rapidly and flexibly adapted to suit new circumstances and address complex technological governance challenges. This is why many regulatory agencies are tapping soft law methods to address shortcomings in traditional hard law governance systems.

As I argued in recent law review articles as well as my latest book, despite its imperfections, soft law has an important role to play in filling governance gaps that hard law struggles to address. But there are some instances where soft law simply will not cut it. As I noted in Chapter 7 of my new book, there may be very legitimate existential threats out there that we should be spending more time addressing because the scope, severity, and probability of the risk are all significant. Hard law solutions will still be needed in such instances, even if they may be challenged by many of the same factors that are fueling the shift toward soft law for other sectors or issues.

Of course, we are immediately confronted with a definitional challenge: What exactly counts as an “existential risk”? I argue that it is important that we spend more time discussing this question because far too many people today throw around the term “existential risk” when referencing risks that are nothing of the sort. For example, increased social media use may indeed be a threat to data security and personal privacy, but those risks are not “existential” in the way that chemical or nuclear weapons proliferation threatens our existence. This gets to the heart of the matter: the root of “existential” is existence. By definition, an existential risk must have some direct bearing on humanity’s ability to survive. Efforts to elevate lesser risks into existential ones cheapen the very meaning of the term.

This shouldn’t be controversial, but somehow it is. Countless pundits today want to suggest that almost every new technological development might somehow pose an existential threat to humanity. But it just isn’t the case. That does not mean their concerns are unimportant, or undeserving of some government attention. It simply means that we need to take risk prioritization more seriously. If everything is an existential risk, then nothing is an existential risk. We must have some sort of ranking of risks if we hope to have a rational conversation about how to use scarce societal resources to address matters of public concern.

These issues are discussed at far greater length in the sections of my book (pgs. 228-240) embedded below. How should society deal with “killer robots” or the accelerated development of genetic editing capabilities? What kind of coordinated compliance regime might help address rogue actors who seek to use new technological capabilities for nefarious purposes? What can we learn from past global enforcement efforts for chemical and nuclear weapons? These are just some of the questions I take on in this section of the book and plan to spend more time addressing in the coming years. Scan these pages to see my initial thoughts on these matters, but I am really just scratching the surface of a massively complicated topic. I’ll have much more to say in the coming months and years.
