
Lying in the least untruthful manner: surveillance and trust

When I last blogged about the surveillance scandal in June, I argued that the core problem was the reasonable doubts we have about whether the oversight is functioning properly, and that the secrecy makes these doubts worse. Since then a long list of new revelations has arrived. To me, what matters is not so much whether foreign agencies get secretly paid to spy, whether internal procedures are being followed, or how deeply software can peer into human lives, but how these revelations give the lie to many earlier denials. In an essay well worth reading, Bruce Schneier points out that this pattern of deception severely undermines our trust in the authorities, and that this is an important social risk: democracies and market economies require us to trust politicians and companies to an appropriate extent.

Most people say politicians lie. What they normally mean seems to be closer to Frankfurt’s notion of bullshit: making statements to create a good impression or to manipulate people, without truly caring whether they are true or false. In contrast, a deceptive person does care about the truth, and tries to keep it from the audience. When a government commissions a report while making sure it will favor a policy it likes, it is essentially producing bullshit.

In the case of James Clapper’s statements and the many corporate denials that they were helping the authorities perform massive surveillance, the speakers may have been technically telling the truth while twisting their words rather severely. In many cases these semantic games were forced by strict secrecy rules that made revealing certain information unlawful. They may have cared about the truth and spoken the truth, but they also made sure to distort things in order to reach a desired manipulative end.

ANDREA MITCHELL: Senator Wyden made quite a lot out of your exchange with him last March during the hearings. Can you explain what you meant when you said that there was not data collection on millions of Americans?

JAMES CLAPPER: First– as I said, I have great respect for Senator Wyden. I thought, though in retrospect, I was asked– “When are you going to start– stop beating your wife” kind of question, which is meaning not– answerable necessarily by a simple yes or no. So I responded in what I thought was the most truthful, or least untruthful manner by saying no.
(NBC News)

In the case of XKeyscore, we get another interesting response:

The files shed light on one of Snowden’s most controversial statements, made in his first video interview published by the Guardian on June 10.

“I, sitting at my desk,” said Snowden, could “wiretap anyone, from you or your accountant, to a federal judge or even the president, if I had a personal email”.

US officials vehemently denied this specific claim. Mike Rogers, the Republican chairman of the House intelligence committee, said of Snowden’s assertion: “He’s lying. It’s impossible for him to do what he was saying he could do.”

Mike Rogers was himself either lying or did not know about the program. It would actually be more reassuring to assume he was lying, since that would indicate that at least some aspect of oversight works (even if it involves lying to the public). If he did not know, then this is evidence that congressional oversight does not work. He responded to the news by stating:

Further, the program referenced in the story is not used for indiscriminate monitoring of the internet, as many falsely believe.  Rather, the program is simply a tool used by our intelligence analysts to better understand foreign intelligence, including terrorist targets overseas.  Finally, the story also once again ignores the legal constraints, comprehensive training, and layers of oversight built into all NSA programs.  Every search on the program by an NSA analyst is fully auditable to ensure it is done within the law.

The problem by now is that the reader is primed to assume that he might be lying (saying something known to be untrue), being deceptive (saying something technically true but manipulative), or simply uninformed. Oh, he might actually be truthful too. The corrosive effects on trust of secrecy (the deliberate withholding of true information) and bullshit (the generation of positional noise) are pervasive. Even if one does trust Rep. Rogers, there is no way of confirming that this trust is appropriate, beyond induction from past trustworthy behavior… which might of course be doubted too. And conversely, maybe Snowden and The Guardian are being deceptive or lying. The corrosion just goes on and on.

Trust but verify

The antidote is verification: are there credible sources, and are there ways of independently checking the evidence? Do we see evidence of good results? When trust in authorities is in doubt, arguments from authority have even less power than usual.

The problem is that over-broad secrecy is the enemy of verification. It makes it impossible to watch the watchmen. The fundamental mistake that seems to have taken hold of the intelligence-political complex is the belief that secret activities cannot be discussed openly without revealing the secrets themselves. This is generally false: PRISM could have been a publicly known project and still gathered useful data using means that were not public (the fact that some bad actors would have responded by avoiding certain communications is rather minor: in the real world, if they were competent they, like most of the computer security community, would have assumed there was massive surveillance even without verification). A public PRISM project could have been overseen far more transparently, and it would have been easier to trust the various agents, simply because the deception-inducing effects of secrecy would not have applied to the whole structure, just to the information flowing through it. We do not need to keep the existence of health records secret just because nobody but patients and their doctors is supposed to see them.

The paradox of the intelligence community is that it demands total transparency to its inquiries from the rest of the world (since there are known and unknown dangers out there), yet demands total secrecy about what it is doing and learning. The motivation for this secrecy is that the information is dangerous, which is true. But having agencies hold such dangerous information is itself dangerous: insofar as one accepts the motivation as valid, it follows that the agencies need to be watched at least as intensely as the world, and their trustworthiness constantly monitored. At the very least, internal communications and computer use need to be logged, mapped and audited. At present the NSA admits it cannot even search its own emails.

One standard argument against this kind of agency monitoring is that it risks creating vulnerabilities for espionage and leaks. But the same argument applies to large-scale data gathering too: the NSA is itself a potential security vulnerability. The fact that Manning and Snowden could cause destructive leaks despite being part of a supposedly vetted and trained group of people under close oversight suggests that the institutional problems across the community might be deep and unavoidable. If the intelligence community misplaces trust so easily by its own standards, why should we expect more of its external professionalism?

I do not trust people and organisations that do not give me ways of accessing the data they base their claims on. Most of the time it is too much work for anybody to check, but availability does enforce honesty in an open society: somebody might be doing a check, and then it will be embarrassing to be found out as a liar, deceiver or bullshitter. The same is true for oversight inside organisations: not every case is audited, but if enough are (and the selection is random enough) there are disincentives for misbehaving. Accepting secrecy means accepting that there are things that should not be checked.
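As a rough illustration of why sparse but random auditing still disciplines behaviour, here is a minimal sketch (the 5% audit rate and the violation counts are purely hypothetical, not figures from any real oversight regime): even if any individual case is unlikely to be checked, the probability that a repeat offender is never caught falls off geometrically.

```python
# Minimal sketch with illustrative numbers only: random auditing deters
# misbehaviour even when most cases go unchecked, because with an audit
# probability p per case the chance of escaping detection shrinks
# geometrically with the number of violations k.

def detection_probability(p: float, k: int) -> float:
    """Probability that at least one of k independent violations is audited."""
    return 1.0 - (1.0 - p) ** k

if __name__ == "__main__":
    audit_rate = 0.05  # hypothetical: 5% of cases are randomly audited
    for violations in (1, 5, 20, 50):
        caught = detection_probability(audit_rate, violations)
        print(f"{violations:3d} violations -> {caught:.1%} chance of being caught")
```

With these assumed numbers, a single violation is caught only 5% of the time, but twenty violations are caught about 64% of the time and fifty about 92% of the time, which is the sense in which random checks create a disincentive without auditing everything.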

If bullshit is not caring about the truth of one’s own statements, accepting secrecy on blind trust is not caring about the truth of what one hears.

 


2 Comments on this post

  1. One possible way of slightly reducing the magnitude of the problem might be strengthening the credibility of the oversight. In the UK, the Intelligence and Security Committee which oversees GCHQ does not really appear to an observer to be a very trustworthy body. A lot of its members might qualify as members of ‘the Establishment’ with little incentive to rock the boat and substantial incentives to go along with all sorts of programmes the public might not think desirable. If they became more transparent about their oversight activities (or if their membership was adjusted – what about the idea of drawing lots from among MPs for at least some of the committee’s members?), that might increase public confidence. Another idea might be to release records of some of GCHQ’s activities, suitably anonymised and with sensitive information redacted, that could at least confirm that records are being kept for oversight purposes (and presumably looked at by the relevant oversight bodies). Now that Snowden’s revelations have come out, this could perhaps provide a way forward for greater transparency as a matter of normal routine. Surely somebody in the government should be able to think of some creative ways to demonstrate accountability and oversight to the public without compromising sensitive data.

Just as a side point, I wonder what The Guardian’s security systems are like? From the slides they have revealed so far, it seems they have redacted quite a lot of information that they don’t think it would be helpful to reveal to the public, but perhaps some cyber actors out there might have an interest in getting the full picture.

  2. An interesting piece, thanks.

    Dishonesty in this realm is particularly powerful, because our brains are wired to exaggerate the threat of terror that surveillance systems are supposed to protect us against. By overemphasizing that threat, we’re predisposed to giving in to whatever sacrifices of liberty the government is asking in exchange for its rather flimsy promise of security, and to discard the drive to verify what the government is telling us. To learn why (since links are not permitted in these comments), Google “This is Your Brain on Terrorism”
