Channel 4 was censured by Ofcom this week for cutting to a light-hearted sponsorship advert just after viewers had watched the particularly graphic and disturbing rape scene in the film The Girl with the Dragon Tattoo. The Phones 4 U sponsorship ad was thought to be especially inappropriate for that moment as it features a couple apparently having sex, during which the woman pauses and asks the camera: ‘I’m faking it – can I upgrade?’ Ofcom received 17 complaints about the timing of the advert and this week concluded that ‘the juxtaposition of a light-hearted sponsorship credit featuring a woman during sex with a disturbing and distressing rape scene in a film was clearly unsuitable… In Ofcom’s view this clearly had the potential to be offensive to viewers’.
The timing was clearly unfortunate, but to say that the juxtaposition was offensive is a stronger claim. Of course, the psychological effect of being immersed in a violent scene at one moment and then confronted with the same(ish) subject matter presented trivially will not do much for the viewer’s aesthetic experience. But the regulator’s suggestion seemed not only to be that the juxtaposition detracted from the viewer’s enjoyment, but also that it was in some way wrong.
Some researchers in the US recently conducted an ‘experiment in the law as algorithm’. (One of the researchers involved with the project was interviewed by Ars Technica, here.) At first glance, this seems like quite a simple undertaking for someone with knowledge of a particular law and mathematical proficiency: laws are clearly defined rules, which can be broken in clearly defined ways. This is most true for strict liability offences, which require no proof of a mental element of the offence (the mens rea). An individual can commit a strict liability offence even if she had no knowledge that her act was criminal and had no intention to commit the crime. All that is required under strict liability statutes is that the act itself (the actus reus) is voluntary. Essentially: if you did it, you’re liable – it doesn’t matter why or how. So, for strict liability offences such as speeding, it would seem straightforward enough to create an algorithm that could compare actual driving speed with the legal speed limit, and adjudicate liability accordingly.
This possibility of law as algorithm is what the US researchers aimed to test out with their experiment. They imagined the future possibility of automated law enforcement, especially for simple laws like those governing driving. To conduct their experiment, the researchers assigned a group of 52 programmers the task of automating the enforcement of driving speed limits. A late-model vehicle was equipped with a sensor that collected actual vehicle speed over an hour-long commute. The programmers (without collaboration) each wrote a program that computed the number of speed limit violations and issued mock traffic tickets.
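A minimal sketch of what one such program might look like. The input format (one timestamped speed sample per second, paired with the posted limit) and the rule that a continuous episode over the limit yields one ticket are illustrative assumptions, not the study's actual specification – and indeed the need to make exactly these kinds of choices is part of what the experiment probed:

```python
# Mock automated speed-limit enforcement: one "ticket" per continuous
# episode in which the recorded speed exceeds the posted limit.
# Sample format and field order are illustrative assumptions.

def issue_mock_tickets(samples):
    """samples: time-ordered list of (seconds, speed_kph, limit_kph).
    Returns a list of (start, end, peak_speed) mock tickets."""
    tickets = []
    episode_start = None  # start time of the current violation episode
    peak = 0.0            # highest speed seen during that episode
    last_over = None      # time of the most recent over-limit sample
    for t, speed, limit in samples:
        if speed > limit:
            if episode_start is None:      # a new violation episode begins
                episode_start, peak = t, speed
            else:
                peak = max(peak, speed)
            last_over = t
        elif episode_start is not None:    # episode just ended: one ticket
            tickets.append((episode_start, last_over, peak))
            episode_start = None
    if episode_start is not None:          # episode ran to the end of the log
        tickets.append((episode_start, last_over, peak))
    return tickets

# Example: a commute with two separate bursts over a 50 kph limit
log = [(0, 48, 50), (1, 55, 50), (2, 61, 50), (3, 49, 50),
       (4, 52, 50), (5, 50, 50)]
print(issue_mock_tickets(log))  # two episodes -> two mock tickets
```

Even this toy version forces a design decision the statute itself does not settle – whether a sustained burst over the limit counts as one violation or as one per sample – which is precisely the kind of discretion that makes 52 independent implementations unlikely to agree.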
Yesterday, Charles Foster discussed the recent study showing that Facebook ‘Likes’ can be plugged into an algorithm to predict things about people – things about their demographics, their habits and their personalities – that they didn’t explicitly disclose. Charles argued that, even though the individual ‘Likes’ were voluntarily published, to use an algorithm to generate further predictions would be unethical on the grounds that individuals have not consented to it and, consequently, that to go ahead and do it anyway is a violation of their privacy.
I wish to make three points contesting his strong conclusion, instead offering a more qualified position: simply running the algorithm on publicly available ‘Likes’ data is not unethical, even if no consent has been given. Doing particular things based on the output of the algorithm, however, might be.
We all know that we are under CCTV surveillance on many occasions each day, particularly when we are in public places. For the most part we accept that being – or potentially being – watched in public places is a reasonable price to pay for the security that 24-hour surveillance offers. However, we also have expectations about what is done with CCTV footage, when, and by whom. A recent discussion with a friend threw up some interesting questions about the nature of these expectations and their reasonableness.
My friend works in a bar where, unsurprisingly, there are several CCTV cameras. Everyone knows where these cameras are and that they are permanently in operation – there is not supposed to be any secrecy. Whilst the primary purpose of the cameras is to promote security, a member of the management team has begun to use them in a way that could be seen as ethically problematic: she logs on to view the footage in real-time, remotely, at her home. In addition to watching the footage, the manager has also addressed points of staff discipline based on what she sees. Perhaps particularly troubling is that she has commented on the way a member of staff behaved when no one was around – when the member of staff thought that she was ‘alone’.
When people do bad things – especially when they cause a lot of harm to others – we usually hope that they will experience something like remorse: that they will feel horror at the thought of what they did to the person harmed, that they will resolve to avoid causing similar harm in the future, and that they will be motivated to apologise and offer reparation, where possible. Penal systems in some jurisdictions deem remorse so important that it is considered a valid reason to mitigate the amount of punishment the offender receives. But, what happens to our expectations for emotion if the person cannot remember committing the offence; if he feels so detached from it that it is as if he did not commit it? An interesting case from Poland raises this question.
Maciej Zientarski was a celebrity driver on a TV programme similar to our Top Gear. On the 27th February 2008, accompanied by his motor journalist friend, he was given a Ferrari to test drive. The test drive didn’t end well. CCTV cameras captured footage of the car being driven at speeds of between 140 and 150kph along a 50kph road, serially overtaking, and eventually smashing head-on into the pillar of a bridge above. The motor journalist died at the scene but the driver, remarkably, just about survived.
This week there were reports of the amazing advances being made in brain-computer interface (BCI) technology. Following just weeks of training, a 52-year-old woman, paralysed from the neck down, was able to use her mind to control a robotic hand to pick up objects on a table, including cones, blocks and small balls, and put them down at another location. She was even able to use the hand to feed herself chocolate.
Having had two arrays of microelectrodes surgically implanted into her left motor cortex, Jan is wired up to a computer that has been programmed to interpret the signals her neurons emit. This computer then passes on the interpreted signals to the robotic arm, which moves in accordance with the signals in real time.
Aside from the awesomeness of the technology, the use of neuroprostheses such as this raises a whole host of interesting philosophical and ethical questions. Particularly as the technology gets more sophisticated and more integrated, the distinction between the machinery being used and the person using it will become increasingly blurred. In the video, Jan already describes how she went from having to ‘think’ the commands (‘clockwise, up, down, forward, back…’) to merely having to ‘look at the target’ to effect accurate movement of the arm. This phenomenon is sometimes labeled ‘transparency of use’, where a tool serves a person’s goals without itself being an object of effortful control.
Let me introduce you to Armstrong the Good Giraffe. Appearing in the news last week due to his goodness (and probably his giraffeness), Armstrong is a man in a costume who goes around voluntarily doing good deeds. Throwing himself into helpful tasks – such as providing free water and bananas to runners, picking up litter from beaches, and cleaning cages at cats and dogs homes – Armstrong clocks up an impressive number of non-trivial good deeds. Most impressively of all, he reportedly enjoys it.
He comments that doing these good deeds makes him feel ‘happy’ and ‘cheery’ and that this is why he does them. At first glance, this may make us think he is particularly remarkable: he not only goes about investing more time and energy into being helpful than most would reasonably expect of a person, but he also relishes it. But, I want to ask, are people like Armstrong really at the top of the moral ranks? Is there not something about effort – about having to try – that we value?
There has been outrage this week over a new sex education website aimed at young teenagers. Funded by an NHS West Midlands research fund, Respect Yourself has been developed by Warwickshire County Council in collaboration with NHS Warwickshire and Coventry University. The site hosts information about a whole range of topics relating to puberty, sex, bodies, relationships, STIs and contraception, presented in a ‘down-to-earth’ and sometimes humorous way. So why the outrage?
There has been discussion on a Polish news site about an extreme case of reckless driving. The discussion is not about the driver – his culpability and stupidity are in no doubt – rather, the discussion is about whether the passengers in the car should be punished in some way for the role they played: not only their failure to restrain the driver, but, most importantly, their active and enthusiastic encouragement of him and his driving.
The video of the drive, taken from within the car and uploaded to YouTube, shows five and a half minutes of speeding through red lights, overtaking despite oncoming traffic, using the curb as a ramp to ‘get air’ and, most disturbingly, only narrowly missing a pedestrian crossing the road. All this is accompanied by encouraging whoops and shouts and exclamations of “Karol, you are my God!” (Karol is the driver.) The passengers clearly want – and ask – Karol to take more and more risks.
When watching a news report on the recent tragedy in Colorado I was struck by the sight of people using mobile phones to film people leaving the cinema. The state of shock on the people’s faces and the freshness of the blood on their clothes signaled that the event was still unfolding. My first response was surprise that someone would think to start filming in the midst of such circumstances. My second was to wonder whether there were any grounds for objection. The particular video shown on the news was not very graphic, although the fear and confusion were tangible. There may have been more gruesome mobile videos from that day. So, I pose the following questions: is it OK to video horror as it unfolds? Might there even be good reasons to do so? What factors affect the answer to this question?