Beware of Governments Bearing Contact Tracing Apps

Providing hospital personnel and the general public with enough tests, masks and hand soap is hard. In-person contact tracing is expensive and risky for the contact tracer, and existing numbers of trained contact tracing personnel are far below what’s needed. So it’s no wonder that governments across the world have suddenly warmed right up to the idea of contact-tracing apps. They offer a vision of cheap, safe quarantine enforcement, transferring much of the effort of reining in the pandemic away from them and onto you and me.

Unfortunately, as things stand, the vision is largely a mirage.

Let’s talk about what would be needed for a contact tracing app to work. It requires mass testing first, coupled with cultural shifts, like mass public mask-wearing, and economic measures, like giving people cash to survive without leaving the house. The kind of testing needed to facilitate the rollout of a contact tracing app necessarily involves random testing of asymptomatic people, and that’s only happening in a few places in the US right now. If testing is treated only as a tool for making clinical decisions about how to treat symptomatic patients, rather than as a way of tracking and controlling the pandemic as a whole, then contact tracing apps won’t do much good at all.

Building a contact tracing system without accidentally creating new, permanent location-gathering vectors is going to be very difficult. Apple and Google have a joint, non-open-sourced effort afoot, which has some serious privacy and capacity flaws. Their work is based on the open-source DP-3T protocol, whose design would credibly prevent governments from co-opting and reusing the data for other purposes, but it’s not clear whether the proprietary variant succeeds in this as well. Another coalition of serious cryptographers is designing a “CEN” system, in which an anonymous number is generated to privately record interactions between compatible mobile devices without allowing the devices themselves to be tracked.
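To make the difference concrete, here is a minimal sketch of how a DP-3T-style decentralized scheme works. The key sizes, rotation schedule, and names below are our own simplifications, not the actual parameters of DP-3T, CEN, or the Apple/Google design. Each phone derives short-lived, random-looking identifiers from a secret day key, broadcasts them over Bluetooth, and keeps a local log of identifiers it hears; only a diagnosed user’s day keys are ever published, and matching happens on each phone rather than on a central server.

import hashlib
import hmac
import secrets

EPHIDS_PER_DAY = 96  # assume one rotating identifier per 15-minute window

def ephemeral_ids(day_key, count=EPHIDS_PER_DAY):
    """Expand a secret day key into short, unlinkable IDs to broadcast over Bluetooth."""
    prf_key = hmac.new(day_key, b"broadcast", hashlib.sha256).digest()
    return [hmac.new(prf_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(count)]

# Bob's phone only ever stores the anonymous IDs it has heard nearby.
heard_by_bob = set()
alice_day_key = secrets.token_bytes(32)
heard_by_bob.add(ephemeral_ids(alice_day_key)[7])

# If Alice tests positive, she publishes only her day keys (nothing about where
# she was or whom she met); Bob re-derives the IDs locally and checks for a match.
published_keys = [alice_day_key]
exposed = any(eph in heard_by_bob
              for key in published_keys
              for eph in ephemeral_ids(key))
print("possible exposure:", exposed)  # True

The privacy question about the proprietary variants is whether they preserve exactly this property: that the only thing ever uploaded is a diagnosed user’s own keys, never a central record of who met whom.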

If a contact-tracing app is voluntary and anonymous by design, that also brings with it its own set of problems. As Ross Anderson puts it, “the performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run [denial-of-service] attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.”

Even an otherwise well-designed app may suffer from disadvantages that depend on the method chosen for tracking location. If Bluetooth is used, as with Singapore’s already-deployed OpenTrace, the problem is range: class 2 Bluetooth receivers can pick up pings from up to 10 meters away, so the detection zone may cover more than 20 times the area generally carved out for social distancing. Bluetooth pings also don’t account for whether you’re in another vehicle or another room, or what kind of protective equipment you may be wearing.
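As a back-of-the-envelope check on that figure (assuming a 2-meter, roughly 6-foot, distancing radius, which is our assumption rather than anything specified by the protocols):

import math

bluetooth_range_m = 10.0     # class 2 Bluetooth range cited above
distancing_radius_m = 2.0    # assumed ~6-foot social-distancing radius

ratio = (math.pi * bluetooth_range_m ** 2) / (math.pi * distancing_radius_m ** 2)
print(ratio)  # 25.0 -- i.e., "more than 20 times" the distancing area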

The success of the app would likely depend on saturation takeup by the general public. First, it would have to be installed by an infected person, and used in an always-on manner. Then, a large number of other people would have to also be using it whenever they go out. This is a tall order for any app, even leaving aside the fact that only three-fourths of American households have smartphones at all. And mandating its takeup would create a new series of challenges, because many people would then treat the app as malware.
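A rough way to see why saturation matters (this is our illustrative model, not a figure from the article or from public-health research): a risky contact is only recorded when both people involved happen to be running the app, so if a fraction p of the population participates, only about p² of contacts get captured.

# Rough model (our assumption): a contact is logged only if BOTH people run the app.
smartphone_share = 0.75  # "three-fourths of American households" figure above, reused loosely
for adoption_among_owners in (0.2, 0.4, 0.6, 0.8):
    p = smartphone_share * adoption_among_owners  # share of the whole population covered
    print(f"{p:.0%} of population running the app -> ~{p * p:.0%} of contacts detected")

Even at 80% adoption among smartphone owners, well under half of contacts would be visible to the system, which is why “saturation takeup” is the operative phrase.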

If a decentralized data storage protocol is not used, all of this location data would be aggregated in a single data set. That is, timestamps, GPS coordinates, proximity hits, and other information would be processed on a massive scale. The computational power required would be far more than would normally be justifiable for a presumably free app. So, if the app is developed privately with an eye on profit, the next question is whether the cost would be covered, in whole or in part, by repurposing the data. Geolocation data has commercial uses, such as ad targeting, but it’s obvious that it’s governments who will be most interested in long-term mass tracking of population movements. A successful, centralized app, as Robert Chesney observes in his analysis of the legal issues around app development, “would be far more dangerous, from a civil liberties perspective, than the intensely controversial (and recently expired) telephone metadata program that the U.S. government established after 9/11 for counterterrorism purposes. Such a system would have extraordinary potential for abuse.”

Of course, after the pandemic, a mandated government app could be sweetened with consumer “benefits” in order to encourage its continued use, and the back-end processing could be used to encourage or discourage movement toward or away from preferred areas. Justified by public health concerns, its use could be required by law, or required in order to access public benefits.

Perhaps, as a people, we’re not yet ready for a US equivalent of the Xi Jinping Thought app; but within our government, we already know that there are many people who think that mass metadata surveillance programs are a good idea. So now is the time to approach such proposals with extreme care, to make sure that they don’t become an easy distraction from the hard task of minimizing the spread of this deadly disease.

This article is a collaborative effort among RT4 activists; particular thanks to riastradh and reloquent for technical advice.


6 replies on “Beware of Governments Bearing Contact Tracing Apps”

Some questions/comments:
Where does the Apple document say closed source? Google and Apple aren’t even building apps as far as I can tell, just an API.
The Marlinspike critique seems pretty speculative. For example, he says that you have to download the IDs of all new infections, but that would only be the case for a single global app. If each state or region has its own app, then you just have to download a few thousand IDs each day, even during a phase like the current one.
Anderson’s worry is not relevant if people can only declare themselves as infected if they’ve received a positive test from the public health authorities.

Thanks for these helpful comments! There’s now more clarity about Apple’s and Google’s approach than when we wrote the article; Moxie Marlinspike’s criticisms as of April 10 are not all applicable given the direction Apple and Google have been taking. This TechCrunch article seems to be a good overview of where the effort stood as of Friday.

https://techcrunch.com/2020/04/24/apple-and-google-update-joint-coronavirus-tracing-tech-to-improve-user-privacy-and-developer-flexibility/

If people “can only declare themselves to be infected if they’ve received a positive test from the authorities,” that would seem to accentuate the risks of deploying these systems without comprehensive testing regimens in place; Ross Anderson’s concerns probably apply better to that kind of world.

Contact tracing is 100% unconstitutional.
UNLESS YOU VOLUNTEER. Then you are screwed.

Opt out here:
bit.ly/NO-HR-6666

Are you basing the statement that “Contact tracing is 100% unconstitutional” on particular rulings, or is it a normative statement that you believe the courts ought to hold it unconstitutional?
