Panopticon Problems: Purists rather than Privacy?
Would a transparent society where we could all watch each other be a better society? A recent paper has made me wonder whether the big problem might not be privacy, but purists.
The standard argument is that transparency (in its different forms) makes people behave well because they can be held accountable: either when they are doing something bad, or when others look back and find out what they did. So there is an incentive for doing things right.
People try to safeguard their reputations, and often behave more generously and cooperatively when observed by others. People also behave themselves when feeling watched, presumably because the internal machinery for reputation management has a long evolutionary background – we benefited from it long before we could think about social game theory, so it presumably runs on prerational hardware somewhere in our brains. This is why merely showing images of eyes is (sometimes) enough to affect behaviour. We rationally know they are not actually seeing us, but they affect the old systems anyway.
This might be an argument for having plenty of visible CCTV cameras – they induce pro-social behaviour both by deterring criminals (increasing the chance they will be held accountable) and by making law-abiding citizens behave slightly better!
We are likely well on the way towards a transparent society where everything and everybody are going to be potentially observable. A whole bunch of identity technologies (RFID, object recognition, person recognition, biometrics, …) are maturing. Sensors are becoming cheaper, smaller and embedded in ubiquitous devices like cellphones and computers. Storing and networking the information is becoming cheap and often the default condition. Technologies for mining such information are advancing rapidly. As David Brin pointed out, a key issue in a transparent society is whether it is two-way transparency or one-way transparency; however, as Wikileaks demonstrated, it might be hard to effectively maintain secrecy for any group.
The real problem with a transparent society might not be privacy but tolerance. Privacy norms appear to be fairly flexible social constructions, and could presumably adapt to a world where anybody can in principle see anybody else and what they did. We desire privacy and give each other privacy as part of our social context.
However, a new paper suggests that feeling watched also influences our moral judgements: people condemn bad behaviour more strongly when cued that they are being watched (original paper: Pierrick Bourrat, Nicolas Baumard and Ryan McKay, Surveillance cues enhance moral condemnation, Evolutionary Psychology 9(2): 193–199, 2011). The authors demonstrated that surveillance cues made students regard bad acts as less morally acceptable.
This is unsurprising, but it bodes ill for the tolerance of a transparent society. People would not only be able to observe more transgressions against their norms, but would publicly judge them as worse. They might of course exaggerate their denunciations – talk is cheap, after all – and the condemnation might not necessarily carry over into action or policy. But as the authors note:
Indeed, failure to express our support for prevailing moral norms may arouse suspicion in our conspecifics. Talk, however, is cheap, so we are wise to take such declarations with the proverbial grain of salt, discounting them as appropriate. In consequence, individuals may attempt to compensate for this discounting by ramping up their rhetoric. To the extent that these compensatory efforts are discounted in turn, they may be ultimately futile; what sustains them is the fact that failure to send the inflated signal immediately brands the deviant as morally suspect.
If this holds (and it seems likely), then the introduction of vastly greater surveillance powers would produce a lot more moral condemnation, and competition in condemnation. This is of course strengthened by the increased transparency: condemnation itself becomes more visible. There are also heightened penalties for hypocrisy: now you can actually check whether your neighbour does as he preaches. And he will of course check you.
In many cases this might be entirely desirable. It might help us behave better. But it is unlikely to make us saints: there are going to be enough transgressions for anybody to condemn us. Maybe this leads to a social equilibrium where people abstain from excessive moral rock-throwing because they know they may become targets themselves. But not all people have the same social capital or support: socially marginal people, like minorities or stigmatized groups, would be far more vulnerable – and far more likely to be chosen – as targets. What counts as a transgression also shifts: being gay or belonging to the ‘wrong’ religion is viewed by many as a moral matter. Purists well anchored inside big communities would be able to gain status and support by directing opprobrium against outsiders, getting many people to join in because doing so both displays their affiliation and reduces the risk of being targeted themselves. A transparent society could be a fairly stable intolerant society.
There is a risk that the transition from current societies – humanely opaque and local – into very transparent and global societies might occur far faster than most people expect. We might not want to give loud busybodies too much say over our lives (especially since they tend to have the most simplistic moral views), but avoiding that might be a tough moral, social and psychological problem.
Figuring out how to live in a more transparent and more globalized world (where numerous communities with wildly different mores are transparent to each other) is one of the key challenges right now, in the early days of the information revolution.