
Video Series: Are Coronavirus Contact Tracing Apps Safe?

Are contact tracing apps safe? Dr Carissa Véliz (Oxford), author of ‘Privacy is Power’, explains why we should think twice about using such apps. They pose a serious risk to our privacy, and this matters, even if you think you have nothing to hide!


4 Comments on this post

  1. Thanks for this Carissa (and Katrien).
    I have the impression that the message is that we should take fewer risks when it comes to potential privacy violations than we are already taking with other measures, most notably the risks of lockdown. You say any app would need to be ‘super-secure’. That is a very high standard indeed. But there really is no scenario that is risk-free or ‘super-secure’. Covid entails risks, and lockdown entails risks (actually, more than just risks, if you look at the costs we are already paying in terms of job losses, the neglect of other diseases, and ultimately public health and lives, educational gaps, and so on). The vaccine will inevitably entail some risks, needless to say. But given the situation, we need to take at least some risks.
    I don’t see why privacy should be different from, and more important than, the other values at stake. As I see it, privacy cannot matter in emergency times as much as it matters in normal times. We have already made a decision that our liberties do not matter in emergency times as much as they matter in normal times. And if there is a second wave, maybe we will have to decide that saving lives by preventing contagion does not matter in emergency times as much as it does in normal times, unfortunately, as society and the economy might otherwise collapse. All in all, the risks deriving from an app do not seem larger than the risks of lockdown or the risks of covid. People are dying and will die as a result of both, and the welfare of the population is significantly affected by both. People will not die if there are privacy infringements that reveal someone’s interactions or that expose information about one’s postcode or part of it (we can imagine scenarios where this might happen, but it takes some imagination), and it is hard to see how their welfare could be affected in a way that is comparable to the way in which lockdown or covid affect it. Ideally, of course, we want an app that is both ‘super-secure’ and super-effective, but assuming this is not feasible, as I see it we need to be prepared to trade security for effectiveness, *given the alternatives available*.
    Maybe we can draw an analogy between vaccines and apps: in the same way as vaccines need to go through careful testing before being approved, so should a potential app. But vaccines will inevitably entail some risks, however small (as all vaccines do). I don’t see why we should not accept some risks for the app too, if it works and helps to contain the spread of the virus while keeping society and the economy functioning.
    The point that the app needs to be effective to justify the risk of privacy breaches, which you make midway through the interview, is something I agree with. But then you illustrate the point by suggesting that the app might not be effective (e.g., as you say, you can be close to someone but with a wall in between, so there is a risk of false positives). This is true, but it is an empirical issue, and it is not about privacy.
    But maybe I will read your book (as everybody should of course) and then see if I change my mind on this :).

    1. Thanks, Alberto. I think we may be in broad agreement. I’m not arguing that any risk to privacy is unacceptable. And the priority is indeed to save lives. What are unacceptable are *unnecessary* privacy risks. There are two ways in which there can be unnecessary privacy risks: if some objective can be achieved in a privacy-respectful way and is instead pursued in a privacy-invasive way, or if privacy is risked for no benefit in return, or for such a slim chance of a benefit, that it isn’t worth it. So empirical issues (e.g. how effective is an app) are definitely relevant. There have been cases in which computer scientists warned the appropriate government that their app would simply not work for technical reasons, for example. If we have an app but we are not testing enough people, the app will be useless. So I think we are in broad agreement. There is space for a safe and effective app, even with some privacy risks, but other conditions have to be in place for it to be worth the risk. Otherwise it’ll be a wasted effort and wasted resources. While privacy is not the only important consideration, I do think that people tend to think privacy losses are not very dangerous, and I think that’s mistaken. As I argue in my book, privacy losses can kill too, and they have in the past.

  2. The coverage of the interview is about control, power, and security issues. The softer privacy issues are not covered in the early stages, where the examples provided are hacking and misuse. That the comments about the palpable feeling of giving up privacy come only later indicates that secondary facets are being covered rather than a core issue. Additionally, within those secondary facets, socially focused purpose restrictions are not mentioned beyond the intimation of misuse. There is more than a suspicion here that, because those restrictions have been so strongly attacked over the last forty years, they have lost any real credibility.
    To use the video interview itself to illustrate what I attempt to comment upon above: it is conducted within a privacy bubble, allowing the content to be controlled by the two participants. The communications mechanisms in use allow an uninterrupted interview because of the adequate security of that system and its hardware. If the privacy of the interview had been breached, it is possible that the security of the information technology systems would have supported the continuation of that private conversation, but that support is itself not privacy. Equally, the security surrounding any covert listener at the time of the interview would have assured their own privacy in a different way. Security and control are a great distraction: they do relate to privacy, but I contend they are not its core, because those aspects appear as conceptually based social reflections of more nuanced material. Hence privacy is not power, although it may come to be interpreted in that way, because personal material is very frequently used in ways that disadvantage the individuals concerned. The same goes for anonymity and other social, moral, or legal mechanisms that attempt to protect individuals or social groups, but just as frequently protect existing social structures; they become subsumed in a broadly construed thing called privacy.
    I have to accept that time considerably limits the content covered in the interview. I hope the book draws that out, and I look forward to reading it.

  3. The app must be 100% effective in detection, because many lives will be at risk, but honestly there is no way to prove that, since it was made by a human and mistakes are bound to happen.
