Google and Apple are co-creating a contact-tracing tool. Can you trust it?

The tool has been built with several privacy safeguards, but are they enough to protect individual privacy?

In what can only be described as the most unlikely of partnerships, Apple and Google have teamed up to develop a contact-tracing tool that alerts people who have been in contact with someone with Covid-19.

This development comes amid governments' growing adoption of facial recognition and phone-tracking technology to try to contain the spread of the pandemic.

How contact-tracing works

Contact tracing is the process of identifying and isolating anyone who has come into contact with someone carrying a pathogen, such as the virus that causes Covid-19. Normally, someone who tests positive reports to a health authority where they've been and who they've recently been in contact with; the authority then alerts those people, who self-isolate or take similar precautions.

The process, digitized through an app, has proved useful in cutting transmission of the virus in East Asian territories like Singapore, South Korea, and Taiwan, where contact-tracing apps and tools have been voluntarily adopted.


Singapore’s app, TraceTogether, has been influential in the creation of Google and Apple’s tool, which is set to come out in mid-May. Google and Apple’s whitepaper outlines a scenario to show how the tool will work.

[Diagram: how Google and Apple's contact-tracing tool works]

Like TraceTogether, Apple and Google's tool will use Bluetooth to approximate your distance to other phones running the app; it does not collect GPS or Wi-Fi data. Nearby devices exchange anonymous, rotating IDs, which are stored locally on each phone and are only shared with a public-health authority (such as a hospital) if you consent after testing positive. The IDs also change every 15 minutes, making the data harder to de-anonymize.
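
To make that concrete, here's a rough Python sketch of the exchange: each phone broadcasts a random ID that it rotates every 15 minutes and keeps a local log of the IDs it hears. The names, structure, and 16-byte ID size here are our own simplification for illustration, not Google and Apple's actual code.

import secrets
import time

ROTATION_SECONDS = 15 * 60  # identifiers change roughly every 15 minutes

class Device:
    """Toy model of one phone in the exchange (a simplification, not the real tool)."""

    def __init__(self):
        self.current_id = secrets.token_bytes(16)  # random 16-byte identifier to broadcast
        self.rotated_at = time.time()
        self.observed = []                         # IDs heard nearby, kept only on this device

    def maybe_rotate(self):
        # Swap in a fresh identifier so observers can't link sightings over time.
        if time.time() - self.rotated_at >= ROTATION_SECONDS:
            self.current_id = secrets.token_bytes(16)
            self.rotated_at = time.time()

    def hear(self, broadcast_id):
        # Log what we heard; nothing leaves the phone at this stage.
        self.observed.append((broadcast_id, time.time()))

alice, bob = Device(), Device()
bob.hear(alice.current_id)   # the phones are near each other, so each logs the other's ID
alice.hear(bob.current_id)
alice.maybe_rotate()         # a no-op until 15 minutes have passed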

[Diagram: how Google and Apple's contact-tracing tool works]

Once the diagnosis is verified, the app shares the IDs that person's phone has been broadcasting, and any phone that has logged one of those IDs then displays a notification: a general statement saying you've been close to someone with Covid-19 and may have been exposed. The recipient of the notification can choose how to respond to this information.
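
In the same simplified model, the exposure check is just a lookup performed on the phone: the IDs published after a diagnosis are compared against the log of IDs the phone has heard. Again, this is an illustrative sketch with made-up data, not the actual implementation.

def check_exposure(observed_ids, published_ids):
    # The comparison happens on the phone itself; the local log is never uploaded.
    return any(seen in published_ids for seen in observed_ids)

# Made-up data: IDs Bob's phone logged, and IDs published after Alice's diagnosis.
bobs_log = [bytes.fromhex("aa" * 16), bytes.fromhex("bb" * 16)]
alices_published_ids = {bytes.fromhex("bb" * 16)}

if check_exposure(bobs_log, alices_published_ids):
    print("You may have been exposed to Covid-19.")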

Privacy safeguards are in place but insufficient to win trust

To their credit, Google and Apple have put in the effort to make this system private. For one thing, it is a completely voluntary process—you can opt in and leave anytime.

They’ve also attempted to make the tool as anonymous and decentralized as possible. As outlined in their cryptographic specifications, the tool uses “rolling proximity identifiers,” changing the ID every 15 minutes and making it harder (though not impossible) to identify an individual. The information is also stored locally rather than in a cloud or on a server, and the tool does not collect personally identifiable information or location data.
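
The rolling-identifier scheme can be sketched in a few lines of Python: derive a per-day key from a long-term secret that never leaves the phone, then derive a fresh ID for each 15-minute interval. The labels, encodings, and key-derivation steps below are stand-ins of our own; the published specification differs in its exact details.

import hashlib
import hmac
import os

tracing_key = os.urandom(32)  # long-term secret that never leaves the phone

def daily_key(day_number):
    # Derive a per-day key from the long-term secret (a stand-in for the spec's HKDF step).
    msg = b"daily" + day_number.to_bytes(4, "little")
    return hmac.new(tracing_key, msg, hashlib.sha256).digest()[:16]

def rolling_id(day_number, interval):
    # 96 fifteen-minute intervals per day; each one gets its own identifier.
    msg = b"rpi" + interval.to_bytes(1, "little")
    return hmac.new(daily_key(day_number), msg, hashlib.sha256).digest()[:16]

print(rolling_id(day_number=18365, interval=42).hex())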

However, as it currently stands, the system has a few weaknesses.

First, Bluetooth signals are not secure. They can be spoofed (for instance, by tying your phone to your dog or cat and letting it roam around the neighborhood) or harvested and replayed: if someone captured a broadcast ID and rebroadcast it in a different location, that ID would appear to be in two places at once. False contacts like these could undermine trust in the system and make people less likely to take alerts seriously.
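
A toy example makes the replay problem concrete: once an ID has been captured and rebroadcast, a phone across town ends up with the same log entry as a phone that was genuinely nearby, and both will match if that ID is later published after a diagnosis. The data here is invented for illustration.

def matches(log, published_ids):
    return any(seen in published_ids for seen in log)

rpi = bytes.fromhex("cc" * 16)      # identifier captured off the air at location A

log_near_patient = [rpi]            # phone that was genuinely nearby
log_across_town = [rpi]             # phone that only heard an attacker's rebroadcast

published_after_diagnosis = {rpi}
print(matches(log_near_patient, published_after_diagnosis))  # True: real contact
print(matches(log_across_town, published_after_diagnosis))   # True: false alarm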

Second, the criteria for what counts as exposure are not yet clear, and the system's effectiveness will depend on where that threshold is set: too sensitive, and people will be flooded with false alarms; not sensitive enough, and real exposures will be missed. It is also not clear how the tool will be vetted, if at all, and by whom.

And finally, its success hinges on adequate Covid-19 testing, which, in the U.S. at least, is currently lacking.

Can you trust this system?

There’s no getting around the fact that this tool will require significant user trust, which Apple and Google will have to earn. That trust will depend on how well the system resists abuse, whether by governments, third-party actors, or the companies themselves. As Jennifer Granick from the ACLU told Technology Review, “People will only trust these systems if they protect privacy, remain voluntary, and store data on an individual’s device, not a centralized repository.”

Ceinwen focused on digital privacy, censorship, and surveillance, and has interviewed leading figures in tech.