Researchers, governments and tech companies around the world are racing to create mobile apps to track coronavirus exposure. Dozens of these contact-tracing apps are under development or being debated worldwide.
These apps typically follow either a centralized or decentralized approach, roughly corresponding to the level of government control over the apps and the different kinds of technology deployed on mobile phones.
Decentralized apps are best exemplified by the joint Google and Apple API (sometimes referred to as “Gapple”) under development in the US, which will allow health agencies to build their own apps on top of it. Another prominent decentralized model is the Decentralized Privacy-Preserving Proximity Tracing (DP-3T) protocol, which European countries including Germany, Austria, Switzerland, Lithuania, Estonia, Finland and Ireland are developing.
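In rough terms, decentralized protocols like DP-3T have each phone broadcast short-lived random identifiers derived from a secret daily key, while matching against the keys published by confirmed cases happens entirely on the device. The sketch below is a heavily simplified illustration of that idea, not the actual DP-3T or Gapple key schedule; all function names and parameters are invented for the example:

```python
import hashlib
import os

def daily_key() -> bytes:
    """Generate a random per-day tracing key (never leaves the device
    unless the user tests positive and chooses to publish it)."""
    return os.urandom(32)

def ephemeral_ids(key: bytes, count: int = 96) -> list[bytes]:
    """Derive short-lived broadcast identifiers from the daily key.
    Real protocols use a proper KDF/PRF; SHA-256 stands in here."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(count)]

def match_exposures(heard: set[bytes], published_keys: list[bytes]) -> bool:
    """On-device check: re-derive IDs from keys published by confirmed
    cases and see whether any were observed locally via Bluetooth."""
    for key in published_keys:
        if heard.intersection(ephemeral_ids(key)):
            return True
    return False

# Example: phone A is later diagnosed and publishes its daily key;
# phone B had overheard one of A's broadcast IDs.
key_a = daily_key()
phone_b_heard = {ephemeral_ids(key_a)[10]}
print(match_exposures(phone_b_heard, [key_a]))  # True
```

The privacy property this buys is that the server only ever sees keys of users who voluntarily report a positive test; who met whom is computed locally.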
The centralized apps are best illustrated by the UK’s contact-tracing app developed by the National Health Service (NHS) technology group NHSX (although a recent report says the UK is considering using the Apple-Google model). Australia’s COVIDSafe app was modeled on a similar approach in Singapore. China, which has required citizens to use location and health status tracking apps since February, stands out as a dominant example of centralized app use. Another centralized example is Israel, which used its state intelligence service’s phone tracking technology, usually reserved for tracking terrorists, to trace Israelis diagnosed with COVID-19.
Neither approach is inherently good or bad, although the centralized approach, which typically puts the app’s development and control in the hands of a central government, gets lower marks from a privacy and security perspective than the decentralized approach. “Centralized approaches give authorities access to valuable data for risk modeling and analysis, in order to help them understand how the virus appears to be spreading,” Future of Privacy Forum Counsel Polly Sanderson tells CSO.
“Probably by default the decentralized approach seems to be, at least on paper, better,” David Grout, CTO for EMEA at FireEye, tells CSO, especially if the centralized app is not secured or encrypted. Conversely, a poorly constructed decentralized app is not better than a well-constructed centralized app.
In terms of the technology division, “There are those who are trying to use Bluetooth to get proximity alerts, and there are those that are trying to go whole hog and collect location data using GPS, seeing how close you come to other people,” Alan Woodward, professor of cybersecurity at the University of Surrey, tells CSO. Typically, the centralized approaches under development collect data using GPS; some combine GPS with Bluetooth proximity alerts, which notify app users when they have come into contact with someone who has tested positive for coronavirus during the previous two weeks.
“From that perspective, [a centralized approach is] obviously incredibly invasive in terms of privacy. Bluetooth is seen as less invasive,” Woodward says. Yet even with the less concerning Bluetooth approach, which can be done anonymously and usually requires data to be deleted after 14 days, privacy problems -- and heightened security needs -- can seep in depending on what data the app requires. “Even in the Bluetooth method, where people are asked to donate not only the facts and their identifiers when they test positive, but also for all those they came into contact with, that’s seen by some as a step too far because you can start to infer from that, start to build identity maps.”
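The 14-day retention rule mentioned above can be enforced with a simple rolling purge of the locally stored contact log. This is a minimal sketch under an assumed data layout, not any specific app’s code:

```python
from datetime import datetime, timedelta

# Retention window typical of Bluetooth tracing apps (illustrative).
RETENTION = timedelta(days=14)

def purge_old_contacts(log: list[dict], now: datetime) -> list[dict]:
    """Drop contact records older than the retention window, so the
    device holds only the encounters still relevant for notification."""
    cutoff = now - RETENTION
    return [entry for entry in log if entry["seen_at"] >= cutoff]

now = datetime(2020, 5, 15)
log = [
    {"id": "a1", "seen_at": now - timedelta(days=3)},   # kept
    {"id": "b2", "seen_at": now - timedelta(days=20)},  # purged
]
print(len(purge_old_contacts(log, now)))  # 1
```

Deleting on-device data on schedule is a technical safeguard; it does not by itself prevent the identity-mapping inference Woodward describes once contact data is uploaded.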
Contact-tracing app critical issues
Whether centralized or decentralized, several critical issues surround the development of all coronavirus apps. These typically fall into four categories: interoperability, functionality, durability and utility.
Interoperability: As countries adopt and modify different protocols to achieve different purposes, it’s virtually guaranteed that the apps will not interoperate with one another. Researchers at FireEye have delineated eight different protocols that are used in current or planned tracing apps.
Functionality: Few countries have fully launched their apps, and even fewer have tested the apps before launch. For example, Australia’s COVIDSafe app doesn’t work well on iPhones. For Bluetooth exchanges to happen, an iPhone would need to be unlocked with the app running in the foreground. That stumbling block forced Germany to jump from the centralized Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) project to the decentralized DP-3T approach. Insiders say that the UK app’s trial has already revealed its codebase is a “mess.”
Durability: Will these apps become permanent fixtures used for surveillance on citizens for other purposes long after the coronavirus crisis ends? “With the centralized model, there is...an inherent risk of mission creep whereby a government could always use the data for an unrelated purpose in the future,” Future of Privacy Forum’s Sanderson says. “This concern is heightened in jurisdictions which lack adequate data protection frameworks or adherence to fundamental human rights.”
This concern came to prominence in the UK last week when Matthew Gould, head of NHSX, informed Parliament's Human Rights Committee that UK citizens would not be able to delete the data collected by the tracing app. He also said that the government would keep the data in a "pseudonymized" way for use in future research. “That’s where I start to get really concerned. That is mission creep of the worst kind,” Woodward says.
Utility: The question of whether any of the apps around the world will be useful in stopping the spread of the coronavirus is unanswered. Part of the uncertainty stems from the timing of the apps' introductions. The apps are still in development, and the consensus holds that it will take at least one or two months for them to reach users. Even then, public health experts say that penetration of the apps needs to reach 60% or higher to be effective in stopping the spread of the coronavirus. Unless the use of the apps becomes mandatory, unlikely except in China, it could take many months to reach this level of penetration. Singapore, for example, launched its app in April and has only attained 20% penetration of the population.
Contact tracing privacy safeguards
Most security experts advocate for legal and technical safeguards to keep app data from being used for other purposes. “You can imagine a future reason why this may have to stay on your phone, and it has the potential to act just like China and become a generalized tracking app,” Woodward says. “That’s why we’ve got to have legislative protection in place that includes sunset clauses that require, as soon as it expires, the whole lot evaporates.”
Few countries have legislative frameworks that govern the apps' use or limit the parameters of the data collected, which could minimize the chance of mission creep. Both Australia and the UK are moving toward adopting privacy and other legislation that might limit the parameters of their apps.
In the US, Senators Roger Wicker (R-MS) and John Thune (R-SD) announced they would introduce the COVID-19 Consumer Data Protection Act. They say this bill will provide Americans with more transparency on how their data is being used and stored and hold businesses accountable to consumers if they use personal data in fighting COVID-19.
Legislation, however, is not the only government tool that can help stop mission creep. Sanderson points to an ethics board, established by the NHSX, creators of the UK’s app, as a viable path to achieving desirable policy goals on the issues that feed into mission creep.
At the moment, it is unclear who in the US government is overseeing the development of coronavirus tracing apps, although the $2 trillion stimulus bill passed in late March gave the Centers for Disease Control and Prevention (CDC) at least $500 million for a surveillance and data collection system. Unlike virtually all other countries, the US appears to be relying solely on the private sector in the form of the Google-Apple API to work with health authorities in devising tracing methods.
Contact tracing’s potential security risks
It’s too soon to tell what cybersecurity vulnerabilities these apps might present. With the possible exception of the Google-Apple API, the full source code for all the apps is still unavailable, something security researchers say they need for full transparency and analysis.
Moreover, most of the apps have yet to launch outside of India and Australia (and China), giving researchers little visibility into how they work. A few early reports by researchers regarding the COVIDSafe app in Australia and the Aarogya Setu app in India suggest major cybersecurity problems could lie ahead.
Australian developer Jim Mussared discovered issues in the COVIDSafe app that could allow a malicious person to track any user for an indefinite period of time. Mussared has informed the Australian health authorities of this flaw and other vulnerabilities he found. In India, ethical hacker Baptiste Robert, who goes by the name Elliot Alderson, discovered that Aarogya Setu is easily breached, although the Indian government denies his findings.
FireEye’s Grout says his team has yet to spot any major cybersecurity problems surrounding these apps but expects the main problems to emerge only once the apps are implemented. One item on his mind is that bug bounties should be established to challenge and test the oncoming wave of apps. “It might be a fair assumption that the application providers be open to the challenge,” Grout says.