
Spying on people for fun and profit

A new company, Internet Eyes, promises to crowdsource the monitoring of surveillance cameras by having online users watch footage and report suspicious activity. They would be rewarded with 'up to £1,000' for pressing the alarm button to report something useful. Not unexpectedly, the anti-CCTV groups really dislike the idea. The Information Commissioner is somewhat sceptical but allowed a beta test to go ahead, as long as users had to pay to use it – this would allow their details to be checked and would reduce the risk of misuse. However, at least one subscriber "thought it was his civic duty to sign up". Civic duty or profit-making voyeurism?

The system works by showing camera footage from subscribing stores and facilities on the screens of users. The users cannot choose which cameras they watch, will not get footage from their own area, and the cameras change every 20 minutes. The cameras are already there and recording; the only change is that currently most footage is not monitored in real time.

Privacy advocates worry that people might want to record the stream and then spread it online. At first this seems pretty unlikely: after all, most of the footage will show convenience stores where little or nothing is happening. But if you are sitting there, hoping for a robbery and a big payout, wouldn't you like to have evidence of what you saw? And if you record something amusing (someone slipping on a banana peel), why not post it to YouTube? However, this is not much different from the existing problem of people being able to record each other in (semi)public. At any point I can whip out my iPhone and record, and wearable cameras that record all the time are coming on the market.

The commercial angle is disturbing, but I get the impression that the original proposal was free for users – if you have to pay to participate, you really need to witness something that gives you a payoff just to break even. The real profit to the company comes from the camera owners. A mature system must be interesting enough to draw and keep users. However, one could easily make a lottery solution: people pay, but often enough footage from a real past crime is randomly shown, and if the user presses the response button they get paid (which keeps them coming back) and their vigilance is checked (useful for the company). This could draw users (just consider slot machines), but it also has the worrying effect of giving people an inflated estimate of crime rates. Given that such inflated views then affect personal, social and political choices, this might be problematic.
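To make the mechanics concrete, here is a minimal sketch of how such a lottery-style vigilance check could work. It is purely illustrative: the clip names, the test-clip rate, the payout and the simulated viewer are my own assumptions, not anything Internet Eyes is known to implement.

```python
import random

# Illustrative sketch of a "lottery" vigilance check: mostly live clips,
# occasionally an archived crime clip planted to test (and reward) the viewer.

REWARD = 10.0          # hypothetical payout for correctly flagging a planted clip
TEST_CLIP_RATE = 0.02  # hypothetical fraction of served clips that are archived crimes

def serve_clip(live_clips, test_clips):
    """Usually serve live footage; occasionally slip in a known past-crime clip."""
    if random.random() < TEST_CLIP_RATE:
        return random.choice(test_clips), True    # (clip, is_planted_test)
    return random.choice(live_clips), False

def handle_press(is_test, account):
    """Reward correct flags on planted clips; forward other flags for human review."""
    if is_test:
        account["balance"] += REWARD
        account["hits"] += 1
    else:
        account["flags_for_review"] += 1

# Toy session: a moderately vigilant viewer who presses on 60% of planted clips
# and mistakenly on 1% of ordinary ones.
account = {"balance": 0.0, "hits": 0, "flags_for_review": 0}
live, tests = ["aisle_3_feed", "checkout_feed"], ["archived_theft_clip"]
for _ in range(1000):
    clip, is_test = serve_clip(live, tests)
    if random.random() < (0.6 if is_test else 0.01):
        handle_press(is_test, account)

print(account)  # the hit rate doubles as a vigilance measure for the operator
```

The same mechanism that keeps viewers engaged (intermittent payouts) is also what would inflate their sense of how common crime is, since planted crimes appear far more often than real ones.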

My real worry about the crowdsourced monitoring is bias. It seems very likely that most people will focus their attention based on intuitive judgements, and these are often biased: a dark-skinned person is more likely to be noticed by a little old lady. If either of them shoplifts, the dark-skinned person will be more likely to be reported. There might also be judgements here that are not so much about criminal behavior as about what the viewer considers inappropriate behavior. A button press can make the store manager aware that someone thinks something bad is occurring in front of the cereal shelf, making him walk over and intervene – even if the act that caused the annoyance was minor or even accepted by most people (like a gay couple kissing, viewed by a member of a conservative community).

The ICO states that "Our CCTV code of practice makes it clear that CCTV operators should use appropriately trained staff to monitor images." Trained people might be better at detecting crimes, but many untrained eyes can probably outdo a few trained ones. In an ideal world these operators would also be trained not to be biased and would have clear codes on what to report. In reality I doubt this is the case: it is not clear that they would be less biased than the public.

The problem seems to be that the motivations for users to participate are either profit, a desire to watch others, or misplaced community spirit (since you cannot watch your own community). While as a consequentialist I do not think motives matter much, they are clearly not driving the system towards being unbiased. For store owners it might indeed provide useful security or information, but it also makes them potentially complicit in biased and unaccountable social control.


4 Comments on this post

  1. It’s like someone read the Panopticon passages from Discipline and Punish as a how-to guide.

    Everything we’ve seen about the problems involved with these “public-private partnerships” (I take it this company will be reporting to the police authorities, ultimately) has taught us nothing.

  2. There’s also the possibility (and I believe the same point has been made in connection with CCTV crowdsourcing on the US/Mexican border) that naughty people will collaborate (as if…). Thus, a false alert will be raised in one place (the cereal aisle) so as to divert attention from something happening elsewhere (the booze counter)…

  3. Yes, diversion tactics likely work well. They have been used for a long time, and crowdsourced cameras are unable to help much against accomplices.

    Hmm, mining Michel Foucault for business plans. Now that is a creepy/intriguing idea. The care of the self is of course already big business, but I wonder what we can do with madness?

    There are likely potential principal-agent problems here. The company wants to make money, the customers want increased security, the viewers want entertainment/a chance to profit. As in much surveillance, perceived increased security is more important than real increased security: the system could do a bad job yet be regarded as good by the customers.
