Last week the Federalist Society’s Regulatory Transparency Project released a podcast Adam and I recorded with FCC Chairman Pai:

Tech Roundup 9 – COVID-19 and the Internet: A Conversation with Ajit Pai

A few highlights: Chairman Pai’s legacy is still being written, but I suspect one of his lasting marks on the agency will be his integration of more economics and engineering into the FCC’s work.

He points out that in recent decades, the FCC’s work has focused on the legal and policy aspects of telecommunications. My take: much of the dysfunctional legalism and regulatory arcana that has built up in communications law exists because Congress refuses to give the FCC a clean slate. Instead, communications laws have piled on top of communications laws for 80 years. The regulatory thicket gives attorneys and insiders undue power in telecom policy. With the creation of the Office of Economics and Analytics and the Engineering Honors Program, Chairman Pai is building institutions within the FCC that shift some expertise and resources to economists and engineers.

We also discussed Marc Andreessen’s It’s Time to Build essay, a thought-provoking polemic (Adam has a response) that offers a challenge:

[T]o everyone around us, we should be asking the question, what are you building? What are you building directly, or helping other people to build, or teaching other people to build, or taking care of people who are building? If the work you’re doing isn’t either leading to something being built or taking care of people directly, we’ve failed you, and we need to get you into a position, an occupation, a career where you can contribute to building.

As we discuss in the podcast, the FCC has outperformed most public institutions on this front. The FCC in the past few years has untangled itself from the nonstop legal trench warfare of net neutrality regulation–an immense waste of time–to focus on making it faster and easier to build networks. As a result, the US is seeing impressive increases in network investment, coverage, and capacity relative to peer countries.

The COVID-19 crisis has been a stress test for the FCC and the broadband industry, and we’re grateful the Chairman took the time to discuss the agency, industry trends, and more with us.

Recently, a group of Republican senators announced plans to introduce the COVID-19 Consumer Data Protection Act of 2020 to address privacy concerns related to contact-tracing and other pandemic-related apps. The bill will reinvigorate many of the ongoing debates over a potential federal data privacy framework.

Even before being officially introduced, the bill has faced criticism from some groups for failing to sufficiently protect consumers. But a more regulatory approach that might appear protective on the surface also has consequences. The European Union’s (EU) General Data Protection Regulation (GDPR) has made it more complicated to develop compliant contact-tracing apps and to run charitable responses that might need personal information. Ideally, data privacy policy addressing specific COVID-19 concerns should provide enough certainty to enable innovative responses while preserving civil liberties. Policymakers should approach this area in a way that enables consumers to choose which options work best for their own privacy preferences rather than dictating a one-size-fits-all set of privacy standards.

A quick review of the current landscape of the data privacy policy debate

Unlike the EU, the United States has taken an approach that only creates privacy regulation for specific types of data. Specific frameworks address those areas that consumers would likely consider the most sensitive and expect increased protection, such as financial information, health information, and children’s information. In general, this approach has allowed new and innovative uses of data to flourish.

Following various scandals and data breaches, and the expansive regulatory requirements of the EU’s GDPR, policymakers, advocates, consumers, and tech companies have begun to question whether the United States should follow Europe’s lead, create a different federal data protection framework, or maintain the status quo. In the absence of federal action, states such as California have passed their own data privacy laws. The California Consumer Privacy Act (CCPA) became effective in January (you may remember a flurry of emails notifying you of privacy policy changes) and is set to become enforceable July 1. Without a federal framework, a growing collection of state laws could take the United States from an innovation-enabling, hands-off approach to a disruptive patchwork, creating confusion for both consumers and innovators. A patchwork means that some beneficial products might not be available in all states because of differing requirements, or that the most restrictive parts of a state’s law might become the de facto national rule. To avoid this scenario, a federal framework would provide certainty to innovators creating beneficial uses of data such as contact-tracing apps (and to the consumers who use them) while also clarifying avenues for redress and any necessary checks to prevent harm.

Questions of Enforcement in the Data Privacy Debate

One key roadblock to achieving a federal privacy framework is the question of how such rules should be enforced. Some of the early criticism of the potential COVID-19 data privacy bill has centered on its anticipated lack of additional enforcement.

Often the choices for data privacy enforcement are portrayed as a false dichotomy between the status quo and an aggressive private right of action, with neither side willing to give way. In reality, as I discuss in a new primer, there is a wide range of options for potential enforcement. Policymakers should build on the advantages of the current flexible approach that has allowed American innovation to flourish. This also provides a key opportunity to improve certainty for both innovators and consumers when it comes to new uses of data. More precautionary and regulatory approaches could increase costs and discourage innovation by burdening innovative products with the need for pre-approval. Ideally, a policy framework should preserve consumers’ and innovators’ ability to make a wide range of privacy choices while still providing redress in the case of fraudulent claims or other wrongful action.

There are tradeoffs in all approaches. Current Federal Trade Commission (FTC) enforcement has led to concerns about the use of consent decrees and the need for clarity. A new agency to govern data privacy could be a massive expansion of the administrative state. State attorneys general might interpret and enforce federal privacy law differently if not given clear guidance from the FTC or Congress. A private right of action could not only deter potentially harmful innovation but also prevent consumers from receiving beneficial products because of concerns about litigation risk. I discuss each of these options and tradeoffs in more detail in the new primer mentioned earlier.

Policymakers should look to the success of the current approach and modify and increase enforcement to improve it, rather than pursue other options that could lead to some of the more pronounced consequences of intervention.

Conclusion

As we are seeing play out during the current crisis, all privacy regulation inevitably comes with tradeoffs. We should be cautious of policies that presume that privacy should always be the preferred value and instead look to address the areas of harm while allowing a wide range of preferences. When it comes to questions of enforcement and other areas of privacy legislation, policymakers should look to preserve the benefits of the American approach that has given rise to a great deal of innovation that could not have been predicted or dictated.

I really liked this new essay, “Innovation is thriving in the fight against Covid-19,” by Norman Lewis over at Spiked, a UK-based publication. In it, he makes several important points similar to themes discussed in my book launch essay last week (“Evasive Entrepreneurialism and Technological Civil Disobedience in the Midst of a Pandemic”). Lewis begins by noting that:

There is nothing like a crisis to concentrate the mind. And the Covid-19 catastrophe has certainly done this. It has speeded up latent trends and posed new questions. The issue of our technologically informed capacity to solve problems is just one example.

He continues on to argue:

a crisis like Covid-19 will necessarily pose new urgent questions that could not have been anticipated. New initiatives will rise to meet these. Pre-existing skills, knowledge, technologies and attitudes will always be the starting point of new problem-solving quests. Where and how we focus attention will, in part, be based on prior cultural assumptions and existing technologies, and also on the novelty of the problem to be solved.

Lewis discusses how innovative minds are pushing back against archaic business models, onerous regulatory barriers, and outdated government rules. As he nicely summarizes:

Unimagined solutions are being pushed while a more open attitude towards experimentation, risk-taking and side-stepping onerous and costly regulation is starting to emerge. Human needs are breaking down yesterday’s precautionary approaches.

That last line really resonated with me because it’s a major theme that runs throughout my new book, “Evasive Entrepreneurs and the Future of Governance: How Innovation Improves Economies and Governments.” As I summarized in my book launch essay:

Eventually, people take notice of how regulators and their rules encumber entrepreneurial activities, and they act to evade them when public welfare is undermined. Working around the system becomes inevitable when the permission society becomes so completely dysfunctional and counterproductive.

This was happening before the coronavirus outbreak, but the crisis has supercharged the phenomenon. Evasive entrepreneurs are taking advantage of the growth of new devices and platforms that let citizens circumvent (or perhaps just ignore) public policies that limit innovative efforts. These include common tools like smartphones, computers, and various new interactive platforms, as well as more specialized technologies like cryptocurrencies, private drones, immersive technologies (like virtual reality), 3D printers, the “Internet of Things,” and sharing economy platforms and services. But that list just scratches the surface; the public is increasingly using these new technological capabilities to assert itself and push back against laws and regulations that defy common sense and hold back progress.

Lawmakers and regulators need to consider a balanced response to evasive entrepreneurialism that is rooted in the realization that technology creators and users are less likely to seek to evade laws and regulations when public policies are more in line with common sense. Yesterday’s heavy-handed approaches that are rooted in the Precautionary Principle will need to be reformed to make sure progress can happen. 

Read my book to find out more!

 

[Co-authored with Walter Stover]

Artificial intelligence (AI) systems have grown more prominent in both their use and their unintended effects. Just last month, the LAPD announced that it would end its use of a predictive policing system known as PredPol, which had drawn sustained criticism for reinforcing policing practices that disproportionately affect minorities. Such incidents of machine learning algorithms producing unintentionally biased outcomes have prompted calls for ‘ethical AI’. However, this approach focuses on technical fixes to AI and ignores two crucial components of undesired outcomes: the subjectivity of the data fed into and out of AI systems, and the interaction between the actors who must interpret that data. When considering regulation of artificial intelligence, policymakers, companies, and other organizations using AI should therefore focus less on the algorithms and more on data and how it flows between actors, to reduce the risk of misdiagnosing AI systems. To be sure, applying an ethical AI framework is better than discounting ethics altogether, but an approach that focuses on the interaction between human and data processes is a better foundation for AI policy.

The fundamental mistake underlying the ethical AI framework is that it treats biased outcomes as a purely technical problem. If that were true, then fixing the algorithm would be an effective solution, because the outcome would be fully defined by the tools applied. In the case of landing a man on the moon, for instance, we can tweak the telemetry of the rocket according to well-defined physical principles until the man is on the moon. In the case of biased social outcomes, the problem is not well defined. Who decides what an appropriate level of policing is for minorities? What sentence lengths are appropriate for which groups of individuals? What is an acceptable level of bias? An AI system is simply a tool that transforms input data into output data, but it is people who give meaning to that data at both steps, in the context of their understanding of these questions and of what appropriate measures of such outcomes are.

Continue reading →

Here’s yesterday’s full launch event video for the release of my new book, Evasive Entrepreneurs and the Future of Governance: How Innovation Improves Economies and Governments. My thanks to Matthew Feeney, Director of the Project on Emerging Technologies at the Cato Institute, for hosting the discussion and sorting through audience questions. The video is embedded below, and some of the topics we discussed are listed beneath it:

* innovation culture
* charter cities, innovation hubs & competitive federalism
* the pacing problem
* technological determinism
* innovation arbitrage
* existential risk
* the Precautionary Principle vs. Permissionless Innovation
* responsible innovation
* drones, facial recognition & surveillance tech
* why privacy & cybersecurity bills never pass
* regulatory accumulation
* applying Moore’s Law to government
* technological civil disobedience
* 3D printing
* biohacking & the “Right to Try” movement
* technologies of resistance
* “born free” technologies vs. “born in captivity” tech
* regulatory capture
* agency threats & “regulation by raised eyebrow”
* soft law vs. hard law
* autonomous systems & “killer robots”!

[Originally published on the Cato Institute blog.]

A pandemic is no time for bad governance. As the COVID-19 crisis intensified, bureaucrats and elected officials slumbered. Government regulations prevented many in the private sector from helping with response efforts. The result was a sudden surge of evasive entrepreneurialism and technological civil disobedience. With institutions and policies collapsing around them, many people took advantage of cutting-edge technological capabilities to evade public policies that were preventing practical solutions from emerging.

Examples were everywhere. Distilleries started producing hand sanitizers to address shortages while average folks began sharing do-it-yourself sanitizer recipes online. The Food and Drug Administration (FDA) looked to modify hand sanitizer guidelines quickly to allow for it, but few really cared because those rules weren’t going to stop them. Gray markets in face masks, medical face shields, and respirators developed. Some people and organizations worked together to make medical devices using off-the-shelf hardware and open source software. More simply, others just fired up sewing machines to make masks. Then, faced with an emerging public health consensus, federal guidance shifted dramatically: where ordinary people had previously been instructed not to buy or use masks, within a matter of days the policy reversed and everyone was encouraged to make and wear cloth protective masks. Continue reading →

My latest book, Evasive Entrepreneurs and the Future of Governance: How Innovation Improves Economies and Governments, is now live. Here’s the launch essay and online launch event. Also, here’s a summary of 10 major arguments advanced in the book. I will have more to say about the book in the coming weeks, but here is a list of 13 key terms discussed in the text. This list appears at the end of the introduction to the book:

  1. Compliance paradox: The situation in which heightened legal or regulatory efforts fail to reverse unwanted behavior and instead lead to increased legal evasion and additional enforcement problems.
  2. Demosclerosis: Growing government dysfunction brought on by the inability of public institutions to adapt to change, especially technological change.
  3. Evasive entrepreneurs: Innovators who do not always conform to social or legal norms.
  4. Free innovation: Bottom-up, noncommercial forms of innovation that often take on an evasive character. Free innovation is sometimes called “grassroots” or “household” innovation or “social entrepreneurialism.” Even though it is typically noncommercial in character, free innovation often involves regulatory entrepreneurialism and technological civil disobedience.
  5. Innovation arbitrage: The movement of ideas, innovations, or operations to jurisdictions that provide legal and regulatory environments most hospitable to entrepreneurial activity. It can also be thought of as a form of jurisdictional shopping and can be facilitated by competitive federalism.
  6. Innovation culture: The various social and political attitudes and pronouncements toward innovation, technology, and entrepreneurial activities that, taken together, influence the innovative capacity of a culture or nation.
  7. Pacing problem: A term that generally refers to the inability of legal or regulatory regimes to keep up with the intensifying pace of technological change.
  8. Permissionless innovation: The general notion that “it’s easier to ask forgiveness than it is to get permission.” As a policy vision, it refers to the idea that experimentation with new technologies and innovations should generally be permitted by default.
  9. Precautionary principle: The practice of crafting public policies to control or limit innovations until their creators can prove that they will not cause any harm or disruptions.
  10. Regulatory entrepreneurs: Evasive entrepreneurs who set out to intentionally challenge and change the law through their innovative activities. In essence, policy change is part of their business model.
  11. Soft law: Informal, collaborative, and constantly evolving governance mechanisms that differ from hard law in that they lack the same degree of enforceability.
  12. Technological civil disobedience: The technologically enabled refusal of individuals, groups, or businesses to obey certain laws or regulations because they find them offensive, confusing, time-consuming, expensive, or perhaps just annoying and irrelevant.
  13. Technologies of freedom: Devices and platforms that let citizens openly defy (or perhaps just ignore) public policies that limit their liberty or freedom to innovate. Another term with the same meaning is “technologies of resistance.”

I’m pleased to announce that the Cato Institute has just published my latest book, Evasive Entrepreneurs and the Future of Governance: How Innovation Improves Economies and Governments. Here’s my introductory launch essay about the book as well as the online launch event. And here’s a list of 13 key terms used throughout the book.

In coming days and weeks I will be occasionally blogging about different arguments made in the 368-page book, but here’s a quick summary of some of the key points I make in the book. These ten passages are pulled directly from the text:

  1. “the freedom to innovate is essential to human betterment for each of us individually and for civilization as a whole. That freedom deserves to be taken more seriously today.”
  2. “Entrepreneurialism and technological innovation are the fundamental drivers of economic growth and of the incredible advances in the everyday quality of life we have enjoyed over time. They are the key to expanding economic opportunities, choice, and mobility.”
  3. “Unfortunately, many barriers exist to expanding innovation opportunities and our entrepreneurial efforts to help ourselves, our loved ones, and others. Those barriers include occupational licensing rules, cronyism-based industrial protectionist schemes, inefficient tax schemes, and many other layers of regulatory red tape at the federal, state, and local levels. We should not be surprised, therefore, when citizens take advantage of new technological capabilities to evade some of those barriers in pursuit of their right to earn a living, to tinker with or try doing new things, or just to learn about the world and serve it better.”
  4. “Evasive entrepreneurs rely on a strategy of permissionless innovation in both the business world and the political arena. They push back against ‘the Permission Society,’ or the convoluted labyrinth of permits and red tape that often encumber entrepreneurial activities.” 
  5. “We should be willing to tolerate a certain amount of such outside-the-box thinking because entrepreneurialism expands opportunities for human betterment by constantly replenishing the well of important, life-enhancing ideas and applications.”
  6. “we should better appreciate how creative acts and the innovations they give rise to can help us improve government by keeping public policies fresh, sensible, and in line with common sense and the consent of the governed.”
  7. “Evasive entrepreneurialism is not so much about evading law altogether as it is about trying to get interesting things done, demonstrating a social or an economic need for new innovations in the process, and then creating positive leverage for better results when politics inevitably becomes part of the story. By acting as entrepreneurs in the political arena, innovators expand opportunities for themselves and for the public more generally, which would not have been likely if they had done things by the book.”
  8. “Dissenting through innovation can help make public officials more responsive to the people by reining in the excesses of the administrative state, making government more transparent and accountable, and ensuring that our civil rights and economic liberties are respected.”
  9. “In an age when many of the constitutional limitations on government power are being ignored or unenforced, innovation itself can act as a powerful check on the power of the state and can help serve as a protector of important human liberties.”
  10. “Lawmakers and regulators need to consider a balanced response to evasive entrepreneurialism that is rooted in the realization that technology creators and users are less likely to seek to evade laws and regulations when public policies are more in line with common sense.”

Continue reading →

[First published by AIER on April 20, 2020 as “Innovation and the Trouble with the Precautionary Principle.”]

In a much-circulated new essay (“It’s Time to Build”), Marc Andreessen has penned a powerful paean to the importance of building. He says the COVID crisis has awakened us to the reality that America is no longer the bastion of entrepreneurial creativity it once was. “Part of the problem is clearly foresight, a failure of imagination,” he argues. “But the other part of the problem is what we didn’t do in advance, and what we’re failing to do now. And that is a failure of action, and specifically our widespread inability to build.”

Andreessen suggests that, somewhere along the line, something changed in the DNA of the American people and they essentially stopped having the desire to build as they once did. “You don’t just see this smug complacency, this satisfaction with the status quo and the unwillingness to build, in the pandemic, or in healthcare generally,” he says. “You see it throughout Western life, and specifically throughout American life.” He continues:

“The problem is desire. We need to want these things. The problem is inertia. We need to want these things more than we want to prevent these things. The problem is regulatory capture. We need to want new companies to build these things, even if incumbents don’t like it, even if only to force the incumbents to build these things.”

Accordingly, Andreessen continues on to make the case to both the political right and left to change their thinking about building more generally. “It’s time for full-throated, unapologetic, uncompromised political support from the right for aggressive investment in new products, in new industries, in new factories, in new science, in big leaps forward.”

What’s missing from Andreessen’s manifesto is a concrete connection between America’s apparently dwindling desire to build these things and the political realities on the ground that contribute to that problem. Put simply, policy influences attitudes. More specifically, policies that frown upon entrepreneurial risk-taking actively disincentivize the building of new and better things. Thus, to correct the problem Andreessen identifies, we must first remove political barriers to productive entrepreneurialism, or else we will never get back to being the builders we once were. Continue reading →

The recently passed CARES Act included $500 million for the CDC to develop a new “surveillance and data-collection system” to monitor the spread of COVID-19.

There’s a fierce debate about how to use technology for health surveillance during the COVID-19 crisis. Unfortunately, this debate is happening in real time as governments and tech companies try to reduce infection and death while complying with national laws and norms related to privacy.

Technology has helped during the crisis and saved lives. Social media, chat apps, and online forums allow doctors, public health officials, manufacturers, entrepreneurs, and regulators around the world to compare notes and share best practices. Broadband networks, Zoom, streaming media, and gaming make stay-at-home orders much more pleasant and keep millions of Americans working remotely. Telehealth apps allow doctors to safely see patients with symptoms. Finally, grocery and parcel delivery from Amazon, Grubhub, and other app companies keep pantries full and serve as a lifeline for many restaurants.

The great tech successes here, however, will be harder to replicate for contact tracing and public health surveillance. Even the countries that had the tech infrastructure somewhat in place for contact tracing and public health surveillance are finding it hard to scale. Privacy issues are also significant obstacles. (On the Truth on the Market blog, FTC Commissioner Christine Wilson provides a great survey of how other countries are using technology for public health and analysis of privacy considerations. Bronwyn Howell also has a good post on the topic.) Let’s examine some of the strengths and weaknesses of the technologies.

Cell tower location information

Personal smartphones typically connect to the nearest cell tower, so cell networks record (roughly) where a smartphone is at a particular time. Mobile carriers are sharing aggregated cell tower data with public health officials in Austria, Germany, and Italy to provide mobility information.

This data is better than nothing for estimating district- or region-wide stay-at-home compliance, but the geolocation is imprecise (accurate only to a half-mile or so).

Cell tower data could also be used to enforce a virtual geofence around quarantined people. Taiwan, for instance, uses this data to enforce quarantines. If you leave a geofenced area, public health officials receive an automated notification that you have left home.
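
To make the mechanics concrete, here is a minimal sketch of how such a cell-tower geofence check might work. The phone IDs, tower IDs, and alerting hook are all hypothetical; a real system would run against a carrier’s signaling data rather than a hard-coded table.

```python
# Hypothetical sketch of a cell-tower geofence check for quarantine enforcement.
# The phone IDs, tower IDs, and alerting hook are invented for illustration;
# real systems would work from a mobile carrier's signaling data.

# Map each quarantined phone to the set of towers that cover the person's home.
QUARANTINE_ZONES = {
    "phone-001": {"tower-17", "tower-18"},
    "phone-002": {"tower-42"},
}

def inside_geofence(phone_id: str, serving_tower: str) -> bool:
    """Return True if the phone's current serving tower is within its allowed set."""
    allowed = QUARANTINE_ZONES.get(phone_id)
    if allowed is None:
        return True  # phone is not under quarantine
    return serving_tower in allowed

def notify_officials(phone_id: str, serving_tower: str) -> None:
    # Stand-in for the automated notification described above.
    print(f"ALERT: {phone_id} attached to {serving_tower}, outside its geofence")

# Example: the carrier reports phone-001 attaching to a distant tower.
if not inside_geofence("phone-001", "tower-90"):
    notify_officials("phone-001", "tower-90")
```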

Assessment: Ubiquitous, scalable. But: rarely useful and virtually useless for contact tracing.

GPS-based apps and bracelets

Many smartphone apps passively transmit precise GPS location to app companies at all hours of the day. Google and Apple have anonymized and aggregated this kind of information in order to assess stay-at-home order effects on mobility. Facebook reportedly is also sharing similar location data with public health officials.

As Trace Mitchell and I pointed out in Mercatus and National Review publications, this information is imperfect but could be combined with infection data to categorize neighborhoods or counties as high-risk or low-risk. 
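
As a rough illustration of that idea, the sketch below combines a county’s infection rate with an aggregated mobility index to assign a risk tier. The field names and thresholds are made up for the example, not drawn from our articles or any official methodology.

```python
# Illustrative only: combine aggregated mobility data with infection counts to
# sort counties into risk tiers. The field names and thresholds are invented.

counties = [
    # cases per 100k residents, mobility relative to a pre-pandemic baseline of 1.0
    {"name": "County A", "cases_per_100k": 250, "mobility_index": 0.9},
    {"name": "County B", "cases_per_100k": 40,  "mobility_index": 0.5},
]

def risk_tier(cases_per_100k: float, mobility_index: float) -> str:
    """Crude rule: many cases plus near-normal movement means higher risk."""
    if cases_per_100k > 100 and mobility_index > 0.7:
        return "high-risk"
    if cases_per_100k > 100 or mobility_index > 0.7:
        return "medium-risk"
    return "low-risk"

for county in counties:
    tier = risk_tier(county["cases_per_100k"], county["mobility_index"])
    print(county["name"], "->", tier)
```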

GPS data, before it’s aggregated by the app companies for public view, reveals precisely where people are (within meters). Individual data is a goldmine for governments, but public health officials will have a hard time convincing Americans, tech companies, and judges they can be trusted with the data.

It’s an easier lift in other countries where trust in government is higher and central governments are more powerful. Precise geolocation could be used to enforce quarantines.

Hong Kong, for instance, has used GPS wristbands to enforce some quarantines. Tens of thousands of Polish residents in quarantine must download a geolocation-based app and check in, which allows authorities to enforce quarantine restrictions. It appears that most people support the initiative.

Finally, in Iceland, one third of citizens have voluntarily downloaded a geolocation app to assist public officials in contact tracing. Public health officials call or message people when geolocation records indicate previous proximity with an infected person. WSJ journalists reported on April 9 that:

If there is no response, they send a police squad car to the person’s house. The potentially infected person must remain in quarantine for 14 days and risk a fine of up to 250,000 Icelandic kronur ($1,750) if they break it.

That said, there are probably scattered examples of US officials using GPS for quarantines. Local officials in Louisville, Kentucky, for example, are requiring some COVID-19-positive or exposed people to wear GPS ankle monitors to enforce quarantine.

Assessment: Aggregated geolocation information is possibly useful for assessing regional stay-at-home compliance. Individual geolocation information is not precise enough for effective contact tracing, though it is probably precise enough to be effective for quarantine enforcement. But: individual geolocation is invasive and, if not volunteered by app companies or users, raises significant constitutional issues in the US.

Bluetooth apps

Many researchers and nations are working on or have released some type of Bluetooth app for contact tracing, including Singapore, the Czech Republic, Britain, Germany, Italy, and New Zealand.

For people who use these apps, Bluetooth runs in the background, recording other Bluetooth users nearby. Since Bluetooth is a low-power wireless technology, it can really only “see” other users within a few meters. If you use the app for a while and later test positive for infection, you can register your diagnosis. The app will then anonymously notify everyone else using the app whom you came into contact with over the past several days (and, in some countries, public health officials). My colleague Andrea O’Sullivan wrote a great piece in Reason about contact tracing using Bluetooth.
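
For readers who want the logic spelled out, here is a stripped-down sketch of the proximity-logging and notification flow described above. Real protocols (including the Apple-Google design) rely on rotating cryptographic identifiers and a diagnosis server; the random tokens and in-memory set below are simplified stand-ins.

```python
# Simplified sketch of Bluetooth proximity logging and exposure notification.
# Real protocols use rotating cryptographic identifiers and a diagnosis server;
# the random tokens and in-memory set below are stand-ins for illustration.

import secrets
import time

RETENTION_SECONDS = 14 * 24 * 3600  # keep roughly two weeks of contacts

class Phone:
    def __init__(self) -> None:
        self.my_id = secrets.token_hex(8)  # stand-in for a rotating identifier
        self.seen = []  # list of (other_id, timestamp) pairs observed nearby

    def observe_nearby(self, other_id: str) -> None:
        """Called whenever Bluetooth 'sees' another app user within a few meters."""
        self.seen.append((other_id, time.time()))

    def check_exposure(self, positive_ids: set) -> bool:
        """Compare the local contact log against identifiers of confirmed cases."""
        cutoff = time.time() - RETENTION_SECONDS
        return any(oid in positive_ids and ts >= cutoff for oid, ts in self.seen)

# Two phones pass within a few meters of each other.
alice, bob = Phone(), Phone()
alice.observe_nearby(bob.my_id)
bob.observe_nearby(alice.my_id)

# Bob later tests positive and registers his identifier with the diagnosis list.
positive_identifiers = {bob.my_id}

# Alice's app periodically downloads the list and checks her own log.
print("Alice exposed:", alice.check_exposure(positive_identifiers))  # True
```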

These apps have benefits over other forms of public health tech surveillance: they are more precise than geolocation information and they are voluntary.

The problem is that, unlike geolocation apps, which have nearly 100% penetration among smartphone users, Bluetooth contact tracing apps have about 0% penetration in the US today. Further, these app creators, even governments, don’t seem to have the PR machine needed to gain meaningful public adoption. In Singapore, for instance, adoption is reportedly only 12% of the population, which is far too low to be very helpful.

Only a handful of institutions in the world could drive appreciable use of Bluetooth contact tracing: telecom and tech companies have big ad budgets, and they own the digital real estate on our smartphones.

That is why the news that Google and Apple are working on a contact tracing app is noteworthy. They have the budget and the ability to make their hundreds of millions of Android and iOS users aware of the contact tracing app. They could even go so far as to push a notification to the home screen of all users encouraging them to use it.

However, I suspect they won’t push it hard. It would raise alarm bells with many users. Further, as Dan Grover stated a few weeks ago about why US tech companies haven’t been as active as Chinese tech companies in using apps to improve public education and norms related to COVID-19:

Since the post-2016 “techlash”, tech companies in Silicon Valley have acted with a sometimes suffocating sense of caution and unease about their power in the world. They are extremely careful to not do anything that would set off either party or anyone with ideas about regulation. And they seldom use their pixel real estate towards affecting political change.

[Ed.: their puzzling advocacy of Title II “net neutrality” regulation is a big exception.]

Techlash aside, presumably US companies also aren’t receiving the government pressure Chinese companies are receiving to push public health surveillance apps and information. [Ed.: Bloomberg reports that France and EU officials want the Google-Apple app to relay contact tracing notices to public health officials, not merely to affected users. HT Eli Dourado]

Like most people, I have mixed feelings about how coercive the state and how pushy tech companies should be during this pandemic. A big problem is that we still have only an inkling about how deadly COVID-19 is, how quickly it spreads, and how damaging stay-at-home rules and norms are for the economy. Further, contact-tracing apps still need extensive, laborious home visits and follow-up from public health officials to be effective–something the US has shown little ability to do.

There are other social costs to widespread tech-enabled tracing. Tyler Cowen points out in Bloomberg that contact tracing tech is likely inevitable but would leave behind those without smartphones. That’s true, and it’s a major problem for the over-70 crowd, who are the least likely to have smartphones and the most vulnerable to COVID-19.

Because I predict that Apple and Google won’t push the app hard and I doubt there will be mandates from federal or state officials, I think there’s only a small chance (less than 15%) a contact tracing wireless technology will gain ubiquitous adoption this year (60% penetration, more than 200 million US smartphone users). 

Assessment: A Bluetooth app could protect privacy while, if volunteered, giving public health officials useful information for contact tracing. However, absent aggressive pushes from governments or tech companies, it’s unlikely there will be enough users to significantly help.

Health Passport

The chances of mass Bluetooth app use would increase if the underlying tech or API were used to create a “health passport” or “immunity passport”: a near-real-time medical certification that someone will not infect others. Politico reported on April 10 that Dr. Anthony Fauci, the White House point man on the pandemic, said the immunity passport idea “has merit.”

It’s not clear what limits Apple and Google will put on their API, but most APIs can be customized by other businesses and users. The Bluetooth app and API could feed into a health passport app, showing at a glance whether you are infected or have recently been near someone who is.
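
As a sketch of what “at a glance” could mean in practice, the snippet below reduces a recent test result and an exposure flag to a single status. The status labels, the 72-hour test validity window, and the inputs are my own assumptions, not a description of Apple’s or Google’s actual API.

```python
# Hypothetical health-passport status check. The labels, the 72-hour test
# validity window, and the inputs are illustrative assumptions only.

from datetime import datetime, timedelta
from typing import Optional

TEST_VALID_FOR = timedelta(hours=72)

def passport_status(last_negative_test: Optional[datetime],
                    recently_exposed: bool,
                    now: datetime) -> str:
    """Reduce test and exposure data to a single at-a-glance status."""
    if recently_exposed:
        return "EXPOSED: recent proximity to a confirmed case"
    if last_negative_test and now - last_negative_test <= TEST_VALID_FOR:
        return "CLEAR: recent negative test on record"
    return "UNKNOWN: no recent test result"

now = datetime.now()
print(passport_status(now - timedelta(hours=10), recently_exposed=False, now=now))
```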

For venues like churches and gyms, and operators like airlines and cruise ships, that need high trust from participants and customers, on-the-spot screening via blood test, temperature check, or Bluetooth app will likely gain traction.

There are the beginnings of a health passport in China, with QR codes and individual risk classifications from public health officials. Particularly for airlines, a favored industry in most nations, there could be public pressure for, and widespread adoption of, a digital health passport. Emirates Airlines and the Dubai Health Authority, for instance, last week required all passengers on a flight to Tunisia to take a COVID-19 blood test before boarding. Results came back in 10 minutes.

Assessment: A health passport integrates several types of data into a single interface. The complexity makes widespread use unlikely but it could gain voluntary adoption by certain industries and populations (business travelers, tourists, nursing home residents).

Conclusion

In short, tech could help with quarantine enforcement and contact tracing, but there are thorny questions about privacy norms, and it’s not clear US health officials have the capacity to do the home visits and phone calls needed to detect spread and enforce quarantines. All of these technologies have issues (privacy, penetration, or testing), and there are many unknowns about transmission and risk. The question is how far tech companies, federal and state officials, the American public, and judges are prepared to go.