As COVID-19 began to spread last spring, apps were developed to track cellphone signals and other data so that people in close proximity to those infected could be notified and quarantined. The novel coronavirus quickly overtook these efforts, becoming so widespread that individual exposure tracing could not contain it.
But the issues raised by digital contact tracing — around privacy, efficiency and fairness — have yet to be resolved, says Susan Landau, cybersecurity expert at Tufts. Indeed, a future public health crisis is likely to inspire calls to collect such information again.
“We have this infrastructure now, and there will be another pandemic,” says Landau, Bridge Professor of Cybersecurity and Policy at the Fletcher School and Tufts School of Engineering. “When another highly infectious respiratory disease begins to spread, we need to know how to design apps in such a way that their use is efficient, improves medical fairness and, of course, protects privacy.”
Landau discusses the challenges of digital contact tracing in her new book, People Count: Contact-Tracing Apps and Public Health.
Tufts Now spoke with her to understand how technology could be better deployed to protect public health during a pandemic while ensuring privacy. Here are five takeaways from the conversation.
The limitations of current technology must be recognized. As smart as they are, smartphones and their signals still can’t give us the information we might want. For example, the short-range Bluetooth signals used in some COVID-19 tracking apps may be affected by the environment in which the app is operating.
When such an app was tested in a simulation of a subway car, with people sitting next to one another in a large gymnasium, Bluetooth radio signals traveled as expected and could be used to identify people close to each other.
But when researchers in Dublin tested the apps on real trams, they found the signals bounced off the metal connectors between the cars, amplifying them and making them an unreliable guide to people’s actual proximity, Landau says.
Worse still, these digital methods of measuring proximity don’t capture other important information, like whether there’s a wall between two people, whether they’re indoors or outdoors, or whether they’re sitting quietly in masks or singing and shouting unmasked, Landau says. Such imprecision limits their effectiveness.
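The proximity problem Landau describes comes down to how apps turn Bluetooth signal strength into distance. One common approach (illustrative here, not any specific app’s formula) is a log-distance path-loss model, which assumes signal strength falls off predictably with distance. Reflections break that assumption: a bounced, amplified signal makes a faraway phone look nearby.

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Log-distance path-loss model (a textbook formula, not taken
    from any real contact-tracing app):
        rssi = tx_power - 10 * n * log10(d)
    solved for distance d in meters. tx_power_dbm is the expected
    signal strength at 1 m; path_loss_exp = 2.0 models free space.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

# A direct signal received at -71 dBm implies roughly 4 m of separation:
print(round(estimate_distance(-71), 1))  # 4.0

# If metal surfaces reflect and strengthen the same signal to -62 dBm,
# the model wrongly reports a much closer, "risky" contact:
print(round(estimate_distance(-62), 1))  # 1.4
```

The model has no way to know that the stronger reading came from a reflection rather than genuine closeness, which is why the Dublin tram measurements produced unreliable results.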
Additionally, if there is no widespread random testing of the population, no form of contact tracing will identify exposure to asymptomatic carriers who do not realize they are infected. In the case of COVID-19, people without symptoms have been responsible for much of the transmission.
Storage and sharing of location data and centralized collection of information should be avoided. In South Korea last year, a public database began sharing the gender, age and location of people diagnosed with COVID, and it became very easy to identify individuals, Landau explains. Location tracking can reveal a lot of private information, such as whether you visit a church when AA meetings are held there or whether you spend weekends in an area known for its LGBTQ nightlife.
Centralizing data collection with a government agency can also be problematic. During the current pandemic, information collected in the name of protecting public health has been shared with other agencies, such as the police, in some countries, Landau says. She argues this shouldn’t be done, in part because it will reduce people’s willingness to use the apps.
Apple, Google and a group of cryptographers in Europe and the United States have worked with epidemiologists to create systems that protect privacy and are decentralized. The Google Apple Exposure Notification system, for example, does not collect location information or share identifying information with centralized authorities.
Instead, it tracks proximity and alerts users if they were close to someone with COVID, without identifying the infected person or the location of the exposure. Landau prefers these approaches, which are the only types of contact tracing apps allowed in the European Union.
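The decentralized design Landau prefers can be illustrated with a toy sketch. This is not the actual Google Apple Exposure Notification protocol, which uses rotating cryptographic keys and measured Bluetooth metadata; the Phone class and its methods below are invented for illustration. But the core idea is the same: phones exchange random tokens, an infected user uploads only their own tokens, and matching happens entirely on each device, so no identity or location ever reaches a central authority.

```python
import secrets

class Phone:
    """Toy model of decentralized exposure notification (names invented)."""

    def __init__(self):
        self.own_tokens = []    # random IDs this phone has broadcast
        self.heard_tokens = []  # IDs heard from nearby phones, kept locally

    def broadcast(self):
        # Emit a fresh random token; real systems rotate identifiers
        # frequently so tokens can't be linked into a movement profile.
        token = secrets.token_hex(16)
        self.own_tokens.append(token)
        return token

    def hear(self, token):
        # Store tokens from nearby phones on the device only.
        self.heard_tokens.append(token)

    def check_exposure(self, published_tokens):
        # Matching happens on-device against the published list of
        # infected users' tokens; nothing about this phone is uploaded.
        return any(t in published_tokens for t in self.heard_tokens)

# Alice and Bob are near each other; Carol is elsewhere.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())

# Alice tests positive and uploads only her own random tokens.
server_list = set(alice.own_tokens)

print(bob.check_exposure(server_list))    # True  -> Bob is notified
print(carol.check_exposure(server_list))  # False -> Carol is not
```

Note what the server never learns: who Bob is, where the encounter happened, or even that an encounter occurred. Only Bob’s phone can conclude that one of the tokens it heard belongs to an infected user.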
What happens after you learn you’ve been exposed to a disease can be just as important as the notification itself. While apps that provide decentralized exposure notification are great for protecting privacy, they don’t rely on contact tracers, and thus lose some of the personal touch and social support that is central to traditional contact tracing carried out by people with extensive training, Landau explains.
Before even asking where an infected person has been and with whom, these contact tracers ask questions like, “How can I help you? Do you need food delivered? Do you have someone who can take care of you?”
That said, decentralized, privacy-protecting apps can be paired with that kind of support. In Switzerland, for example, if the app says you’ve been exposed and the health ministry agrees that you have to quarantine, “you get financial help to stay at home,” Landau says.
In Ireland, the app lets you opt in to sharing your phone number. “If you give your phone number, of course, you’re not anonymous, but if you’re exposed, you’re called by contact tracers,” who can help you figure out next steps, Landau says.
Future apps should consider the benefits of offering such support, rather than leaving people alone with information about potential exposures, she argues.
Adoption of apps should be voluntary, the information collected should only be used to protect public health, and the technology should be transparent. Coercion should have no part in contact tracing, Landau says, and information should never be shared with other government agencies. Governments should invest in evaluation before and during the deployment of contact tracing apps in various communities to ensure that they serve the entire population. And the public should be able to understand how the apps work.
“You have to be careful not to deploy resources that help communities that are already doing well and don’t help other communities,” she says.
Trust is essential for contact tracing to work. Undocumented immigrants may be reluctant to share their information for fear of government prosecution. Black Americans, whose communities have been abused by the medical establishment and who have been heavily monitored, may also be more suspicious of authorities than other groups, Landau says. “App adoption will not be the same in these different communities,” she says. “Cultural issues need to be taken into account.”
The first step to building trust is to protect user security, according to Landau. “False positives don’t protect user safety, especially in poor communities,” she says. “They tell people they can’t go to work when in fact they are fine.” To increase trust in an app, notification of potential exposure should be combined with free and easy-to-access testing, she says.
The computer scientists who developed the privacy-protecting apps “did a very good thing by moving the conversation from centralized apps to decentralized apps,” Landau says. “But what they didn’t do, because they didn’t have enough public health people in the room, was ask: are we solving the right problem?”
The biggest issue, she says, is fairness. That includes not only factors like who owns the latest smartphones, but also issues like how difficult it is to get an appointment for a COVID-19 test or vaccine — and, says Landau, how much outreach is being done in communities that are hesitant about vaccination, to address concerns and help people make informed decisions.
“Cryptographers should be congratulated for what they’ve done, but it’s really a lesson for us,” she says. “If you want to solve a public health problem or a social problem, you really need to involve the public health experts earlier to understand the issues.”