Cryptographer Bruce Schneier says the false positives and negatives ‘are too great’
Influential security technologist Bruce Schneier has dismissed Covid-19 contact-tracing apps as potentially “worse than having no system at all”, arguing that comprehensive diagnostic testing would combat the pandemic more effectively.
The UK is developing a contact-tracing app as a tool to identify people who may have come into contact with someone with Covid-19 and warn them, encouraging them to self-isolate and thereby slow the spread of the virus.
The NHS app under development is slated to use an API released by Google and Apple earlier this month, designed to enable decentralized contact tracing.
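The decentralized design works, in outline, like this: each phone derives short-lived random identifiers from a locally held secret key, broadcasts them over Bluetooth, and logs the identifiers it hears nearby; if a user later tests positive, only their daily keys are published, and every other phone checks locally for matches. The sketch below illustrates that general pattern only — it is not the actual Google/Apple API, which uses AES-based key derivation and encrypted metadata, and all function names here are invented for illustration:

```python
import hashlib
import os

def daily_key() -> bytes:
    """Fresh random secret generated on-device each day (simplified)."""
    return os.urandom(16)

def rolling_ids(day_key: bytes, intervals: int = 144):
    """Derive short-lived broadcast identifiers from the daily key.

    Real protocols use AES/HKDF; a hash suffices to show the idea:
    identifiers look random to bystanders but are reproducible
    by anyone who later learns the daily key.
    """
    return [hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Alice broadcasts her identifiers; Bob's phone logs what it hears nearby.
alice_key = daily_key()
alice_ids = rolling_ids(alice_key)
bobs_log = {alice_ids[42], os.urandom(16)}  # one real contact, one stranger

# Alice tests positive and uploads only her daily key. Bob's phone
# re-derives her identifiers locally and checks for overlap -- the server
# never learns who met whom.
exposed = bobs_log.intersection(rolling_ids(alice_key))
print(f"possible exposure events: {len(exposed)}")
```

The privacy property driving the decentralized design is that matching happens on the handset: the server only ever sees the keys of people who chose to report a positive test.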
The UK is far from alone in developing such technology, with Singapore’s TraceTogether and Israel’s Hamagen apps offering examples elsewhere in the world.
False alarm peril
Schneier, however, has argued that contact tracing isn’t effective because “the false positives are too great. And the false negatives are too great”.
“Contact tracing uses Bluetooth or GPS and then Bluetooth to try to get within six feet. We know that the error rates of that are surprisingly high,” Schneier said during a Black Hat webcast on Thursday (April 16).
“And there’ll be a lot of instances where the system will register a contact when there actually isn’t one.”
False positives could arise, for example, when someone was “within two feet of someone who has Covid-19 for eight hours”, but there was a wall between them because they were in adjacent hotel rooms.
False negatives can arise when a contact isn’t registered because people happen not to be carrying their smartphones.
That’s aside from the problem of getting people to install contact-tracing apps on a voluntary basis. Schneier suggested uptake might be only around 20%.
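Low uptake compounds the error problem, because a contact is only recorded if both parties are running the app: under the simplifying assumption that adoption is independent and uniform across the population, the share of contacts the system can even see scales roughly with the square of the adoption rate. A back-of-the-envelope calculation:

```python
# Both parties in a contact must be running the app for it to register,
# so detectable contacts scale roughly with uptake squared
# (assuming adoption is independent and uniform across the population).
for uptake in (0.20, 0.15, 0.60):
    coverage = uptake ** 2
    print(f"{uptake:.0%} uptake -> ~{coverage:.1%} of contacts detectable")
```

At Schneier’s suggested 20% uptake, only around 4% of contacts between two people could be registered at all.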
“We don’t know the percentage of contacts that result in infections, but that rate actually isn’t that high,” Schneier added. “So basically, I built a system that will do nothing but false alarm.”
“And when I have a system and it alarms, there’s nothing I can do. I can’t go get a test. So, we put the system in place. It’s going to false alarm. Everyone will say, ‘Don’t use it. It doesn’t tell you anything useful’.
“You know, it’s stupid and we’ll ignore it. But building a system with that kind of error rate is even worse than having no system at all.
“What we need is ubiquitous, cheap, fast, accurate testing. That’s what will make a difference,” Schneier said.
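Schneier’s false-alarm argument is a base-rate effect: when genuinely risky contacts are rare, even a modest false-positive rate means most alerts are wrong. A short sketch of the calculation via Bayes’ rule makes this concrete — the numbers below are purely illustrative, not epidemiological estimates:

```python
def alert_precision(prevalence, sensitivity, false_positive_rate):
    """Fraction of alerts corresponding to a genuinely risky contact
    (positive predictive value, computed via Bayes' rule)."""
    true_alerts = sensitivity * prevalence
    false_alerts = false_positive_rate * (1 - prevalence)
    return true_alerts / (true_alerts + false_alerts)

# Illustrative numbers only: suppose 1% of logged contacts are genuinely
# risky, the app flags 80% of those, and mis-flags 20% of the rest.
ppv = alert_precision(prevalence=0.01,
                      sensitivity=0.80,
                      false_positive_rate=0.20)
print(f"share of alerts that are real: {ppv:.1%}")
```

With these assumed rates, fewer than one alert in twenty corresponds to a real risky contact — the “system that will do nothing but false alarm” Schneier describes.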
Schneier is far from alone among technologists in harboring privacy and efficacy concerns about Covid-19 contact-tracing apps.
An essay on ‘Contact Tracing in the Real World’ by Cambridge University’s Ross Anderson lists a litany of stumbling blocks for the technology, including false positives and low voluntary uptake.
“I recognise the overwhelming force of the public-health arguments for a centralized system, but I also have 25 years’ experience of the NHS being incompetent at developing systems and repeatedly breaking their privacy promises when they do manage to collect some data of value to somebody else,” Anderson writes.
“The real killer is likely to be the interaction between privacy and economics. If the app’s voluntary, nobody has an incentive to use it, except tinkerers and people who religiously comply with whatever the government asks. If uptake remains at 10-15%, as in Singapore, it won’t be much use and we’ll need to hire more contact tracers instead.”
Anderson concludes: “Our effort should go into expanding testing, making ventilators, retraining everyone with a clinical background from vet nurses to physiotherapists to use them, and building field hospitals. We must call out bullshit when we see it, and must not give policymakers the false hope that techno-magic might let them avoid the hard decisions.”
Anderson mentions the false positive quandary in the context of a lecture theatre rather than adjacent hotel rooms – but the practical upshot is much the same.
“Bluetooth also goes through plasterboard. If undergraduates return to Cambridge in October, I assume there will still be small-group teaching, but with protocols for distancing, self-isolation and quarantine,” Anderson writes.
“A supervisor might sit in a teaching room with two or three students, all more than 2m apart and maybe wearing masks, and the window open. The bluetooth app will flag up not just the others in the room but people in the next room too.”
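The underlying difficulty is that Bluetooth apps infer distance from received signal strength (RSSI), which depends heavily on the environment, not just on distance. Under the standard log-distance path-loss model, a device a few metres away in open air can produce almost the same reading as one much closer behind an attenuating wall. A rough illustration — the constants below are typical textbook values, not measurements from any real app:

```python
import math

def rssi_dbm(distance_m, rssi_at_1m=-60.0, path_loss_exp=2.0, wall_db=0.0):
    """Log-distance path-loss model: received power falls off with the
    log of distance, and obstacles subtract a further fixed attenuation."""
    return rssi_at_1m - 10 * path_loss_exp * math.log10(distance_m) - wall_db

near_through_wall = rssi_dbm(2.0, wall_db=8.0)  # person in the next room
far_same_room = rssi_dbm(5.0)                   # person across the room

print(f"2m through a wall: {near_through_wall:.1f} dBm")
print(f"5m line of sight:  {far_same_room:.1f} dBm")
# The readings are nearly identical, so a simple RSSI threshold cannot
# distinguish a close contact through plasterboard from a distant one
# in open air.
```

This is the scenario Anderson describes: the app cannot tell a supervisor’s students from the people in the next room.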
Hacking in the public interest
Schneier’s Black Hat webcast, titled ‘Hacking in the Public Interest’, focused on the role security technologists could play in supporting social causes and protecting the public interest.
The idea is that, just as lawyers regularly take on pro bono work to defend civil liberties and help the vulnerable, the security mindset could be applied to comparably noble goals and to informing policymaking, which increasingly intersects with technology in areas ranging from the deployment of 5G networks to the regulation of social media companies.
During a question and answer session, Schneier warned that measures put in place by governments to deal with the Covid-19 pandemic could outstay their welcome.
“I worry that some of these surveillance measures that are being hastily put in place for Covid-19 are going to remain there,” Schneier said.
“The surveillance industrial complex, cyber weapons arms manufacturers, companies like Hacking Team and NSO Group that are selling spy tools to countries around the world, are very quickly taking their tools and turning them into Covid-19 contact tracing tools.”
Schneier’s remarks are a reference to recent high-profile marketing efforts by the normally secretive NSO Group.
“It’s dumb. It doesn’t make sense. It’s not suitable, but, you know, we have their marketing presentations and they’re getting traction. The worry is that countries are gonna buy this stuff in some belief it’ll help during the pandemic,” Schneier said.
“And then, surprise: they’ve now had a surveillance tool that can go after human rights workers and dissidents and the press and opposing political parties.
“I do worry about this kind of surveillance creep,” he added.
When there’s a crisis, it “makes sense that we do things that we wouldn’t do normally, but we have to make sure that they are effective, that they’re proportionate, that they have transparency, and that we can undo them when the crisis is over.
“I think we’re failing at all of those steps. I worry the most about not being able to undo them [increased surveillance measures],” Schneier concluded.