
surveillance

Cross Post: Privacy is a Collective Concern: When We Tell Companies About Ourselves, We Give Away Details About Others, Too.

BY CARISSA VÉLIZ

This article was originally published in New Statesman America

Cross Post: Think Twice Before Sending Facebook Your Nude Photos: The Shadow Brokers’ Disclosures Prove Privacy and Security Are Not a Zero-Sum Game

 

Written by Dr Carissa Véliz

This article first appeared in El País

 

Time and again, we have been sold the story that we need to give up privacy in exchange for security. According to former NSA security consultant Ed Giorgio, ‘Privacy and security are a zero-sum game’—meaning that for every increase in one, there is a decrease in the other. The go-to argument to justify mass surveillance, then, is that sacrificing our privacy is necessary for government agencies to be able to protect us from the bad guys.

Hide your face?

A start-up claims it can identify whether a face belongs to a high-IQ person, a good poker player, a terrorist, or a pedophile. Faception uses machine learning to generate classifiers that signal whether or not a face belongs to a given category. Basically, facial appearance is used to predict personality traits, types, or behaviors. The company claims to have already sold its technology to a homeland security agency to help identify terrorists. This does not surprise me at all: governments are willing to buy remarkably bad snake oil. But even if the technology did work, it would be ethically problematic.
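To make the kind of system being described concrete, here is a minimal, hypothetical sketch of a binary face-“trait” classifier in Python with scikit-learn. The feature vectors, labels, and the trait itself are invented placeholders; nothing here reflects Faception’s actual pipeline. The sketch mainly shows how readily such a classifier produces confident-looking scores even when there is nothing real to learn.

```python
# Hypothetical sketch only: a binary "trait" classifier over face-image
# features. This is NOT Faception's pipeline; the data are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in data: 1,000 "faces" reduced to 128-dimensional feature vectors
# (in a real system these might come from a face-embedding network), with
# an arbitrary binary label for the trait supposedly being predicted.
X = rng.normal(size=(1000, 128))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
# With labels that carry no signal, held-out accuracy hovers around 0.5,
# yet the model still emits confident per-face scores via predict_proba;
# that is the "snake oil" worry raised above.
```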


Global surveillance is not about privacy

It has now been almost two years since Snowden. It’s time for us to admit that this has little to do with privacy. Global surveillance is not global only because it targets people all over the world; it is done for and against global interests. Privacy, by contrast, is an individual right. It’s simply the wrong level of description. This is not about your internet history or private phone calls, even if the media and Snowden wish it were.

Privacy is rarely seen as a fundamental right. Privacy is relevant insofar as its loss enables control, harming freedom, or leads to the violation of a fundamental right. But the capacity of intelligence agencies to carry out surveillance over their own citizens is far lower than their capacity to monitor foreigners. Any control this monitoring might entail will never be at the individual level; governments can’t exert direct control over individual citizens of foreign countries.


Framing this as an issue of individual privacy is a strategic move made against the interests of individuals.

Computer vision and emotional privacy

A study published last week (and summarized here and here) demonstrated that a computer could be trained to detect real versus faked facial expressions of pain significantly better than humans. Participants were shown video clips of the faces of people actually in pain (elicited by submerging their arms in icy water) and clips of people simulating pain (with their arms in warm water). The participants had to indicate for each clip whether the expression of pain was genuine or faked.

Whilst human observers could not discriminate real expressions of pain from faked expressions better than chance, a computer vision system that automatically measured facial movements and performed pattern recognition on those movements attained 85% accuracy. Even when the human participants practiced, their accuracy only increased to 55%.
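As a rough illustration of the general pipeline described above (per-clip facial-movement measurements fed into a pattern recogniser), here is a hedged Python sketch using scikit-learn. The feature matrix, labels, and the choice of a linear support-vector machine are assumptions made for illustration; the study’s actual feature extraction, classifier, and validation scheme are not reproduced here.

```python
# Hypothetical sketch of the kind of approach described: per-clip
# facial-movement features (e.g. action-unit intensities summarised over
# time) classified as genuine vs. faked pain. Placeholder data only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

n_clips, n_features = 50, 20                        # assumed sizes, not the study's
features = rng.normal(size=(n_clips, n_features))   # facial-movement summaries
labels = rng.integers(0, 2, size=n_clips)           # 1 = genuine pain, 0 = faked

# A linear SVM is one standard choice of pattern recogniser for fixed-length
# feature vectors like these.
clf = SVC(kernel="linear")
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
# On real facial-movement features the reported system reached about 85%,
# versus roughly chance-level performance (at best 55%) for human observers.
```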

The authors explain that the system could also be trained to recognize other potentially deceptive actions involving a facial component. They say:

In addition to detecting pain malingering, our computer vision approach may be used to detect other real-world deceptive actions in the realm of homeland security, psychopathology, job screening, medicine, and law. Like pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve dual control of the face. In addition, our computer vision system can be applied to detect states in which the human face may provide important clues about health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness and students’ expressions of attention and comprehension of lectures, or to track response to treatment of affective disorders.

The possibility of using this technology to detect whether someone’s emotional expressions are genuine or not raises interesting ethical questions. I will outline and give preliminary comments on a few of the issues.

How to get positive surveillance – a few ideas

I recently published an article on the possible upsides of mass surveillance (somewhat in the vein of David Brin’s “transparent society”). To nobody’s great astonishment, it has attracted criticism! Some critics accuse me of not knowing the negative aspects of surveillance. But that was not the article’s point; there is already a lot written on the negative aspects (Bruce Schneier and Cory Doctorow, for instance, have covered this extremely well). Others make the point that though these benefits may be conceivable in principle, I haven’t shown how they could be obtained in practice.

Again, that wasn’t the point of the article. But it’s a fair criticism – what can we do today to make better surveillance outcomes more likely? Since I didn’t have space to go through that in my article, here are a few suggestions.

Lying in the least untruthful manner: surveillance and trust

When I last blogged about the surveillance scandal in June, I argued that the core problem was the reasonable doubts we have about whether the oversight is functioning properly, and that the secrecy makes these doubts worse. Since then, a long list of new revelations has arrived. To me, what matters is not so much whether foreign agencies get secretly paid to spy, doubts about internal procedures, or how deeply software can peer into human lives, but how these revelations give the lie to many earlier denials. In an essay well worth reading, Bruce Schneier points out that this pattern of deception severely undermines our trust in the authorities, and that this is an important social risk: democracies and market economies require us to trust politicians and companies to an appropriate extent.


How to deal with double-edged technology

By Brian D. Earp

World’s smallest drone? Or how to deal with double-edged technology

BBC News reports that Harvard scientists have developed the world’s smallest flying robot. It’s about the size of a penny, and it moves faster than a human hand can swat. Of course, the inventors of this “diminutive flying vehicle” immediately lauded its potential for bringing good to the world:

1. “We could envision these robots being used for search-and-rescue operations to search for human survivors under collapsed buildings or [in] other hazardous environments.”

2. “They [could] be used for environmental monitoring, to be dispersed into a habitat to sense trace chemicals or other factors.”

3. They might even behave like many real insects and assist with the pollination of crops, “to function as the now-struggling honeybee populations do in supporting agriculture around the world.”

These all seem like pretty commendable uses of a new technology. Yet one can think of some “bad” uses too. The “search and rescue” version of this robot (for example) would presumably be fitted with a camera; and the prospect of a swarm of tiny, remote-controlled flying video recorders raises some obvious questions about spying and privacy. It also prompts one to wonder who will have access to these spy bugs (the U.S. Air Force has long been interested in building miniature espionage drones), and whether there will be effective regulatory strategies capable of tilting future usage more toward the search-and-rescue side of things, and away from the peep-and-record side.


On being private in public

We all know that we are under CCTV surveillance on many occasions each day, particularly when we are in public places. For the most part we accept that being – or potentially being – watched in public places is a reasonable price to pay for the security that 24-hour surveillance offers. However, we also have expectations about what is done with CCTV footage, when, and by whom. A recent discussion with a friend threw up some interesting questions about the nature of these expectations and their reasonableness.

My friend works in a bar where, unsurprisingly, there are several CCTV cameras. Everyone knows where these cameras are and that they are permanently in operation – there is not supposed to be any secrecy. Whilst the primary purpose of the cameras is to promote security, a member of the management team has begun to use them in a way that could be seen as ethically problematic: she logs on to view the footage in real time, remotely, from her home. In addition to watching the footage, the manager has also addressed points of staff discipline based on what she sees. Perhaps particularly troubling is that she has commented on the way a member of staff behaved when no one was around – when the member of staff thought that she was ‘alone’.

Enlightened surveillance?

New York City contemplates using aerial drones for surveillance purposes, while North Korea buys thousands of cameras to spy on its impoverished population. Britain has so many cameras that they have ceased to be newsworthy. The stories multiply – it is trivial to note that we are moving towards a surveillance society.

In an earlier post, I suggested surrendering on surveillance might be the least bad option – of all likely encroachments on civil liberties, this seemed the least damaging and the hardest to resist. But that’s an overly defensive way of phrasing it – if ubiquitous surveillance and lack of privacy are the trends of the future, we shouldn’t just begrudgingly accept them, but demand that society gets the most possible out of them. In this post, I’m not going to suggest how to achieve enlightened surveillance (360-degree surveillance would be a small start, for instance), but just outline some of the positive good we could get from it. We all know the negatives; but what good could come from corporations, governments and neighbours being able to peer continually into your bedroom (and efficiently process that data)? In the ideal case, how could we make it work for us?