How our outdated privacy laws doomed contact tracing apps

The pandemic has taught us a lot: how vulnerable we are to uncontrolled disease. How divided we are politically, even when it comes to protecting our health. How we have taken for granted the eradication of past diseases, such as polio and smallpox. How much we appreciate the simple freedom of eating in a restaurant or browsing a store. How much we rely on interaction with friends and family for our daily happiness.

I’m a privacy lawyer, so one of the lessons I’ve learned from the pandemic is about privacy and the failure of contact tracing apps. Last spring, when the disease started to spread rapidly, these apps were touted as a promising way to control it by tracking diagnoses and exposure through self-reporting and location tracking. Around this time, Apple and Google announced a joint effort to develop technology that government health departments could use to create apps for their communities, “with user privacy and security at the heart of the design.” While these apps have had mixed success around the world, they have been a huge failure in the United States. Indeed, despite the early hopes and multiple efforts to implement these applications in various states and localities, Americans have largely rejected them and they have played a minimal role in controlling the disease.

One of the main reasons for this failure is that people don’t trust tech companies or the government to collect, use, and store their personal data, especially when that data relates to their health and precise location. While Apple and Google committed to building privacy measures into the design, including opt-in enrollment, anonymity, use limitations, and storage of data only on a user’s own device, Americans just weren’t convinced. For example, a Washington Post / University of Maryland survey conducted shortly after the technology was announced found that 50% of smartphone users would not use a contact tracing app even if it promised to rely on anonymous tracking and reporting; 56% would not trust big tech companies to keep the data anonymous; and 43% would not even trust public health agencies and universities to do so. By June, Americans’ mistrust had grown, with a new survey showing that 71% of those polled would not use contact tracing apps, with privacy cited as the main reason.
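To see why the Apple-Google design could, at least in principle, protect privacy, it helps to look at the decentralized model it is built on: phones broadcast short-lived random identifiers over Bluetooth, and all exposure matching happens on the device itself. The Python sketch below illustrates that model conceptually; the key derivation, function names, and parameters here are simplified assumptions for illustration, not the actual Apple-Google Exposure Notification specification.

```python
import hashlib
import os

def daily_key() -> bytes:
    """Each phone generates a random key per day; the key never leaves
    the device unless the user tests positive and consents to share it."""
    return os.urandom(16)

def rolling_ids(key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive short-lived Bluetooth identifiers from the daily key.
    Nearby observers see only unlinkable, random-looking IDs."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Phone A broadcasts its rolling IDs; phone B stores the ones it
# hears -- locally, with no location attached.
key_a = daily_key()
heard_by_b = set(rolling_ids(key_a)[:10])  # B was near A briefly

# If A is later diagnosed and consents, only A's daily keys are
# published; no contact list or location history is uploaded.
published_diagnosis_keys = [key_a]

# B re-derives the IDs from the published keys and matches them
# against its own records ON THE DEVICE.
exposed = any(rid in heard_by_b
              for k in published_diagnosis_keys
              for rid in rolling_ids(k))
print("Possible exposure:", exposed)  # -> True
```

As the sketch suggests, the privacy-relevant design choice is where the matching happens: the server only ever sees the diagnosis keys of users who consent, never a map of who met whom or where. Even so, as the surveys show, this architecture did not translate into public trust.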

Privacy concerns weren’t the only reason these apps failed. As experts predicted, they also failed for other reasons, including insufficient testing, unreliable self-reports, and the wide and rapid spread of the disease. However, Americans’ response to these apps shows that privacy now plays a critical role in their decision-making. Contrary to the long-held argument that people say they care about privacy but behave as if they don’t (sometimes called the “privacy paradox”), Americans refused to use these apps largely for privacy reasons. Privacy really mattered.

Americans had good reason to be wary of data collection by these apps. In recent years, they have repeatedly been victims of data breaches and other privacy abuses (including by big tech companies) too numerous to mention. In many cases, the privacy laws of this country have failed to protect them from such abuses, either because the abuses fell outside the limited scope of those laws, or because the laws imposed insufficient sanctions or other remedies. The same dangers presented themselves with contact tracing apps. Indeed, as readers of this blog are likely aware, the United States does not have a baseline data protection law that would protect the sensitive data obtained through these apps.

While the United States has laws that protect certain data in certain market sectors, those laws have limited application here. In fact, no US law to my knowledge would clearly require that all data collected through COVID tracking apps be stored and transmitted securely, used only for COVID tracking purposes, and safely disposed of when no longer needed for that purpose. Without such protections, there can be no guarantee that such sensitive data will not end up in the hands of insurance companies, employers, creditors, identity thieves, or stalkers, to be used in ways that harm or discriminate against individuals.

For example, the Health Insurance Portability and Accountability Act (HIPAA) provides certain protections for our medical information, but only if the data is collected and used by a “covered entity,” that is, a health care provider such as a doctor or hospital, or by a “business associate” helping with health care functions. That was not the case here, since state and local health departments were the ones collecting and using the data. Regardless, in April, HHS had already announced that it was suspending HIPAA enforcement and sanctions for many covered entities engaged in “good faith” measures to combat COVID, making the question largely moot and suggesting that HHS viewed its own health privacy rules as ill-equipped to deal with a public health emergency.

Other American laws do not fare much better. The FTC Act allows the FTC to challenge, usually after the fact, “unfair or deceptive acts or practices” in commerce, including false statements about data privacy or security, or data practices that cause substantial harm to consumers without offsetting benefits. While this law has arguably the broadest application of any applicable US privacy law, it falls far short of providing the specific protections needed here: clear limits on how (and for how long) data collected through COVID tracking apps can be used, stored, and shared. Instead, in most cases, it allows businesses to decide for themselves what privacy protections to provide (or not), as long as they avoid deception and obvious forms of harm. Adding to the problem, the FTC Act does not allow civil penalties (needed to deter wrongdoing) except in limited circumstances.

If the apps are used by residents of particular states or localities, state or local laws may apply. However, a quick glance at the leading state law (the California Consumer Privacy Act, or CCPA) isn’t promising, as it doesn’t apply to the government agencies that create and use these apps. Even if the CCPA did apply, a law that protects residents of a single state hardly provides the privacy guarantees needed for widespread nationwide adoption of contact tracing apps.

Months into the pandemic, Congress attempted to fill this legal void with yet more narrow, situation-specific legislation. In May and June, when contact tracing apps had already been developed and deployed in some localities, several senators hastened to circulate bills to regulate the apps and the sensitive data they collect. Some of these bills had serious flaws and loopholes, and none of them made any headway in Congress.

So what can we learn from this experience? The first lesson is the obvious one, outlined above: privacy concerns fueled public distrust of these apps and helped ensure their failure. For years, proponents of strong privacy laws have argued, often to skeptical industry audiences, that strong protections are needed to maintain consumer confidence in the marketplace. Contact tracing apps provide a concrete example.

What happened here also reminds us that clear standards governing the use of data should be seen not only as a constraint, but also as a means of enabling responsible uses of data, including emergency uses. Properly crafted, a privacy law should govern and guide our day-to-day data practices, as well as how we use personal information to fight a pandemic.

Additionally, our experience here is a good illustration of the chaos and confusion we regularly face when trying to manage privacy in the United States: a patchwork of laws that leaves much of our data unprotected, uncertainty about which laws apply, hasty efforts to close the gaps in the heat of the moment, and eroded public confidence.

Together, all of these lessons bring us back to the same conclusion that was the subject of my previous blog post: we need a baseline federal privacy law that establishes clear and enforceable privacy rules across the whole marketplace, a law that protects our personal information in good times and in times of crisis.


Apple and Google are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions of this article are solely those of the author and are not influenced by any donation.
