News broke last April, at the height of pandemic fears, that Google, Apple, and MIT were working together to build a smartphone app that would conduct contact tracing much faster and more efficiently than the old way of hiring humans to do the same thing.
Contact tracing is the laborious process by which human investigators backtrack the recent movements of a person who has tested positive for COVID-19, in order to locate all the people who might have come in contact with them.
Obviously, when performed by humans, this is a time-consuming and imperfect way to try to contain the virus, but it’s all we’ve had - until now. Several questions now face us. Will enough people use the new Google/Apple/MIT (GAM) app to make it worthwhile? After all, it’s voluntary (so far). In a perfect world, the process would no doubt be faster and more effective than human fumbling, but if, say, only 10% of diagnosed positive cases use it, what’s the point?
Some people are also asking how much of our data the new app will collect, store, and use, and in what manner. Will this trample privacy, or will the brain trust be able to gather useful data while maintaining boundaries? An even more basic question: which do we as a society value more -- health or privacy?
These are tricky questions without easy answers.
How the Contact Tracing App Works
The GAM app is a cross-platform effort that will be available to both iPhone and Android users. The plan is for the developers to build the API (application programming interface) and then let individual health agencies incorporate the contact tracing technology into their own apps.
The contact tracing technology actually resides in the phone’s operating system, so a user first needs to download and install the app, then update the OS to get started. Once you have opted in, the app goes to work by sending out a ping powered by a random Bluetooth identifier that reportedly cannot be linked to the phone owner’s actual identity.
No matter what happens to the data in the future, whether stolen by a hacker or included as part of a standard backup process at some point down the road, there should be no way to connect an individual’s identity to any other part of the data. More on that later.
Other phones with the app loaded do the same thing. Any time one phone gets within a set distance of another phone, the encounter is logged as a contact. If you receive a positive coronavirus test result, you enter that into the app and automated messages go out to anyone you have been in contact with.
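The decentralized scheme described above can be sketched in a few lines of Python. This is a toy model, not the actual GAM implementation: the class and method names are hypothetical, and real implementations derive rotating identifiers cryptographically from daily keys rather than generating independent random values. The key idea it illustrates is that only random identifiers ever leave the phone, and exposure matching happens locally on each device.

```python
import secrets


class Phone:
    """Toy model of one handset running an exposure-notification layer."""

    def __init__(self):
        self.my_ids = []        # random identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers received from nearby phones

    def broadcast_id(self):
        # Emit a fresh random identifier. Real systems derive rotating
        # identifiers from a daily key and change them every few minutes,
        # so observers cannot track one phone across time.
        rpi = secrets.token_hex(16)
        self.my_ids.append(rpi)
        return rpi

    def hear(self, rpi):
        # Log an identifier picked up from a phone within Bluetooth range.
        self.heard_ids.add(rpi)

    def report_positive(self):
        # On a positive test, the user consents to publishing only the
        # random identifiers -- never a name, phone number, or location.
        return list(self.my_ids)

    def check_exposure(self, published_ids):
        # Matching happens on the device, so no central server ever
        # learns who met whom.
        return any(rpi in self.heard_ids for rpi in published_ids)
```

In use, if Bob’s phone once heard an identifier that Alice later publishes after a positive test, Bob’s phone flags the exposure, while a phone that never came near Alice stays silent.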
It’s a simple, effective process that theoretically should work like a charm. The problem, as the government of Singapore found out in the midst of the rising pandemic, is that only about 12% of people actually used the app. Such a low adoption rate makes it almost worthless. Why so little interest in automatic contact tracing? For a great many, it probably just seemed like a hassle, but for more than a few, there are privacy concerns.
Senators Query Apple and Google About Privacy Concerns
Not long after the GAM project was announced, Google and Apple received a letter from a group of U.S. senators expressing reservations about how the data collected through the contact tracing app would be identified, collected, and stored (there is always a risk of losing it to hackers or phishing attacks), and whether it might ultimately be used for marketing purposes once COVID-19 is far in our rearview mirror.
The major questions:
- Will the data collected fall under the jurisdiction of HIPAA, the country’s major health data privacy law?
- Will the data collected be regulated by other major data privacy regulations like the GDPR and CCPA?
- Will any data collected be personally identifiable to specific individuals?
- Will the data collected ever be monetized?
Understandable questions, to be sure, though Apple feels they have all been addressed through statements on its COVID-19 tool landing page, which asserts that users’ answers to screening questions are not collected and that no personal data is retained that would allow the companies to connect the data to a person in the future. Also, no sign-in or password creation of any kind is required to use the tool.
At least on the surface, it appears the Google/Apple collaboration has no designs on collecting a massive dataset and then turning around and using it for marketing purposes.
MIT contributes to the app’s privacy: the MIT designers built the technology to store only a list of random numbers and the distances between them - no identifiable information tied to a particular phone, user, email, or name. That protection, however, only applies before the technology is incorporated into a particular agency’s app, such as one from the CDC.
Remember that any healthcare organization that wants to use the GAM technology can do so free of charge, but it must adapt the code for its own app before distribution. At that point, any guarantee of privacy is out of the hands of the GAM trio of companies. An agency could bolt account creation and password requirements onto the process at any time, reintroducing exactly the kind of identifying information the original design was built to avoid.
The Problem with Privacy
As security experts have been pointing out for years, it is difficult to truly anonymize or even protect data from seemingly everyday threats, regardless of a company’s good intentions. The question, as already alluded to, is whether mass public health threats like the current pandemic should take precedence over privacy concerns.
To date, most controlling regulations in this area arise in the context of a commercial environment, where companies collect data in order to target their marketing efforts more precisely. Should standards be relaxed when people are dying from a still poorly controlled disease? That is the crux of the matter, and a slippery slope if ever there was one.
The issue of relaxing privacy regulations during the pandemic would seem to be a natural task for governments but, other than the GDPR and CCPA, politicians and bureaucrats around the world have shown reluctance to get involved, perhaps out of fear that the genie is already out of the bottle and will refuse to go back in.
Regardless, security concerns such as those raised by the introduction of contact tracing apps into society aren’t going to disappear magically in a fairy-tale ending. These are questions that will require a resolution eventually. Not to decide is still to make a choice. For now, as the Romans used to say, “Caveat emptor”.