
Hannah Maslen

‘Losses disguised as wins’: Slot machines and deception

Last week, Canadian researchers published a study showing that some modern slot machines ‘trick’ players – by way of their physiology – into feeling like they are winning when in fact they are losing. The researchers describe the phenomenon of ‘losses disguised as wins’, in which net losses involving some winning lines are experienced in the same way as net wins, due to physiological responses to the accompanying sounds and lights. The obvious worry is that players who are tricked into thinking they’re winning will keep playing for longer and will be motivated to come back to try again.

The game set-up is as follows: players bet on 15 lines simultaneously, any of which they might win or lose. A player will accrue a net profit if the total amount collected from all winning lines is greater than the total amount wagered on all 15 lines. Such an outcome is accompanied by lights and sounds announcing the wins. However, lights and sounds will also be played if any of the lines win, even if the net amount collected is less than the total amount wagered on all 15 lines. If a player bets 5 credits per line (5 x 15 = 75) and wins 10 back on each of 3 lines (3 x 10 = 30), then the player has actually lost money, even though the lights and sounds indicate winning. The loss, the researchers claim, is thus disguised as a win.
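To make the arithmetic concrete, the outcome classification can be sketched as follows. This is a minimal illustration, not the researchers’ code; the function name and the decision to count a payout exactly equal to the wager as a win are my own choices:

```python
def classify_spin(bet_per_line, num_lines, total_payout):
    """Classify a multi-line slot machine spin.

    A 'loss disguised as a win' is a spin on which at least one line
    pays out -- so celebratory sounds and lights play -- but the total
    payout is still less than the total amount wagered.
    """
    total_wager = bet_per_line * num_lines
    if total_payout == 0:
        return "full loss"
    if total_payout >= total_wager:
        return "net win"
    return "loss disguised as a win"

# The example from the text: 5 credits on each of 15 lines (75 wagered),
# three lines paying 10 credits each (30 collected) -- a net loss of 45.
print(classify_spin(5, 15, 30))  # loss disguised as a win
```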

Skin switching, implicit racial bias and moral enhancement

A recent study has shown that a person’s implicit racial bias can be reduced if she spends some time experiencing her body as dark-skinned. Psychologists in Spain used an immersive virtual reality technique to allow participants to ‘see’ themselves with a different skin colour. They measured the participants’ implicit racial bias before and after the intervention, finding that the embodiment of light-skinned individuals in a dark-skinned virtual body at least temporarily reduced their implicit bias against people who are coded as ‘out-group’ on the basis of skin colour.

Implicit racial bias is an evolved, unconscious tendency to feel more positively towards members of one’s own race (one’s ‘in-group’) than towards members of a different race (members of an ‘out-group’). The bias can be (and was in this study) measured using a version of the implicit association test, which requires participants to quickly categorise faces (black or white) and words (positive or negative) into groups. Implicit bias is calculated from the differences in speed and accuracy between categorising (white faces, positive words) and (black faces, negative words) compared to (black faces, positive words) and (white faces, negative words). Crucially, implicit racial bias has been shown to be uncorrelated with explicit racial bias – self-reports of negative racial stereotypes. This means that even those who are not consciously averse to people from other racial groups often demonstrate a deep-seated bias against them as an evolutionary hangover. Hearteningly, the authors of the study started from the idea that encoding people by race may be a reversible by-product of human evolution used to detect coalitional alliances. What their study confirmed is that immersive virtual reality provides a powerful tool for placing people into a different race ‘coalition’ by changing their body representation and consequently reducing their implicit aversion to the racial characteristics represented there.
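For readers curious how the bias score is derived from those response times, here is a heavily simplified sketch in the spirit of Greenwald’s D measure. The real scoring algorithm also trims outlier trials and penalises errors; the function below, and its name, are illustrative only:

```python
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified D-style score for an implicit association test.

    congruent_ms: response times (ms) when 'white + positive' and
                  'black + negative' share a response key.
    incongruent_ms: response times when the pairings are reversed.

    The latency difference is scaled by the pooled standard deviation.
    A positive score means faster responses in the congruent block,
    i.e. an implicit bias in favour of the in-group.
    """
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# A participant who is markedly slower on the reversed pairings:
score = iat_d_score([500, 520, 510], [650, 640, 660])
```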

Phones 4 U, Ke$ha and becoming offensive

Channel 4 was censured by Ofcom this week for cutting to a light-hearted sponsorship advert just after viewers had watched the particularly graphic and disturbing rape scene in the film The Girl with the Dragon Tattoo. The Phones 4 U sponsorship ad was thought to be especially inappropriate for that moment as it features a couple apparently having sex, during which the woman pauses and asks to the camera, ‘I’m faking it, can I upgrade?’ Ofcom received 17 complaints about the timing of the advert and this week concluded that ‘the juxtaposition of a light-hearted sponsorship credit featuring a woman during sex with a disturbing and distressing rape scene in a film was clearly unsuitable… In Ofcom’s view this clearly had the potential to be offensive to viewers’.

The timing was clearly unfortunate, but to say that the juxtaposition was offensive is a stronger claim. Of course, the psychological effect of being immersed in a violent scene at one moment and then confronted with the same(ish) subject matter presented trivially will not do much for the viewer’s aesthetic experience. But the regulator’s suggestion seemed not only to be that the juxtaposition detracted from the viewer’s enjoyment, but also that it was in some way wrong.

Strict-ish liability? An experiment in the law as algorithm

Some researchers in the US recently conducted an ‘experiment in the law as algorithm’. (One of the researchers involved with the project was interviewed by Ars Technica, here.) At first glance, this seems like quite a simple undertaking for someone with knowledge of a particular law and mathematical proficiency: laws are clearly defined rules, which can be broken in clearly defined ways. This is most true for strict liability offences, which require no proof of a mental element of the offence (the mens rea). An individual can commit a strict liability offence even if she had no knowledge that her act was criminal and had no intention to commit the crime. All that is required under strict liability statutes is that the act itself (the actus reus) is voluntary. Essentially: if you did it, you’re liable – it doesn’t matter why or how. So, for strict liability offences such as speeding it would seem straightforward enough to create an algorithm that could compare actual driving speed with the legal speed limit, and adjudicate liability accordingly.

This possibility of law as algorithm is what the US researchers aimed to test out with their experiment. They imagined the future possibility of automated law enforcement, especially for simple laws like those governing driving. To conduct their experiment, the researchers assigned a group of 52 programmers the task of automating the enforcement of driving speed limits. A late-model vehicle was equipped with a sensor that collected actual vehicle speed over an hour-long commute. The programmers (without collaboration) each wrote a program that computed the number of speed limit violations and issued mock traffic tickets.
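One source of divergence between such programs is what counts as ‘one’ violation. A sketch of one defensible choice – issuing a mock ticket per continuous speeding episode rather than per sensor sample – might look like this. This is illustrative only, not the study’s code, and the tolerance parameter is my own addition:

```python
def count_violations(samples, limit_kph, tolerance=0.0):
    """Count distinct speeding episodes in a time series of speeds.

    samples: recorded speeds, one per sensor tick.
    A new violation starts each time the speed rises above the limit
    (plus any tolerance) after having been at or below it -- so one
    mock ticket per continuous episode, not per over-limit sample.
    """
    violations = 0
    speeding = False
    for speed in samples:
        over = speed > limit_kph + tolerance
        if over and not speeding:
            violations += 1
        speeding = over
    return violations

# Two separate excursions above a 50 km/h limit:
print(count_violations([45, 55, 60, 48, 52, 49], 50))  # 2
```

A program that instead ticketed every over-limit sample would report four violations for the same data, which is precisely the kind of interpretive gap the experiment was designed to expose.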

A reply to ‘Facebook: You are your “Likes”’

Yesterday, Charles Foster discussed the recent study showing that Facebook ‘Likes’ can be plugged into an algorithm to predict things about people – things about their demographics, their habits and their personalities – that they didn’t explicitly disclose. Charles argued that, even though the individual ‘Likes’ were voluntarily published, to use an algorithm to generate further predictions would be unethical on the grounds that individuals have not consented to it and, consequently, that to go ahead and do it anyway is a violation of their privacy.
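For illustration, the basic idea – fitting weights that map binary ‘Like’ patterns onto an undisclosed attribute – can be sketched with toy data. The actual study used dimensionality reduction followed by regression on data from tens of thousands of users; everything below, including the data, is invented for illustration:

```python
import numpy as np

# Hypothetical data: rows are users, columns are pages; 1 = 'Liked'.
likes = np.array([[1, 0, 1, 0],
                  [1, 1, 0, 0],
                  [0, 0, 1, 1],
                  [0, 1, 0, 1]], dtype=float)

# A binary attribute the first two users share but never disclosed.
trait = np.array([1.0, 1.0, 0.0, 0.0])

# Least-squares weights mapping Like patterns to the attribute...
weights, *_ = np.linalg.lstsq(likes, trait, rcond=None)

# ...which can then be applied to any user's public Likes.
predictions = likes @ weights
```

The ethical question is then not about this fitting step itself, but about what one goes on to do with `predictions`.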

I wish to make three points contesting his strong conclusion, instead offering a more qualified position: simply running the algorithm on publicly available ‘Likes’ data is not unethical, even if no consent has been given. Doing particular things based on the output of the algorithm, however, might be.

On being private in public

We all know that we are under CCTV surveillance on many occasions each day, particularly when we are in public places. For the most part we accept that being – or potentially being – watched in public places is a reasonable price to pay for the security that 24-hour surveillance offers. However, we also have expectations about what is done with CCTV footage, when, and by whom. A recent discussion with a friend threw up some interesting questions about the nature of these expectations and their reasonableness.

My friend works in a bar where, unsurprisingly, there are several CCTV cameras. Everyone knows where these cameras are and that they are permanently in operation – there is not supposed to be any secrecy. Whilst the primary purpose of the cameras is to promote security, a member of the management team has begun to use them in a way that could be seen as ethically problematic: she logs on to view the footage in real-time, remotely, at her home. In addition to watching the footage, the manager has also addressed points of staff discipline based on what she sees. Perhaps particularly troubling is that she has commented on the way a member of staff behaved when no one was around – when the member of staff thought that she was ‘alone’.

Amnesia and remorse: how much should we expect?

Photograph: Filip Klimaszewski / Agencja Gazeta

When people do bad things – especially when they cause a lot of harm to others – we usually hope that they will experience something like remorse: that they will feel horror at the thought of what they did to the person harmed, that they will resolve to avoid causing similar harm in the future, and that they will be motivated to apologise and offer reparation, where possible. Penal systems in some jurisdictions deem remorse so important that it is considered a valid reason to mitigate the amount of punishment the offender receives. But what happens to our expectations for emotion if the person cannot remember committing the offence; if he feels so detached from it that it is as if he did not commit it? An interesting case from Poland raises this question.

Maciej Zientarski was a celebrity driver on a TV programme similar to our Top Gear. On the 27th February 2008, accompanied by his motor journalist friend, he was given a Ferrari to test drive. The test drive didn’t end well. CCTV cameras captured footage of the car being driven at speeds of between 140 and 150 km/h along a 50 km/h road, serial overtaking, and the eventual head-on smash into the pillar of a bridge above. The motor journalist died at the scene but the driver, remarkably, just about survived.

Mind-controlled limbs and redefining the self

Image credit: University of Pittsburgh/UPMC

This week there were reports of the amazing advances being made in brain-computer interface (BCI) technology. Following just weeks of training, a 52-year-old woman, paralysed from the neck down, was able to use her mind to control a robotic hand to pick up objects on a table, including cones, blocks and small balls, and put them down at another location. She was even able to use the hand to feed herself chocolate.

Having had two arrays of microelectrodes surgically implanted into her left motor cortex, Jan is wired up to a computer that has been programmed to interpret the signals her neurons emit. This computer then passes on the interpreted signals to the robotic arm, which moves in accordance with the signals in real time.
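To give a rough sense of what ‘interpreting the signals’ involves, a minimal linear decoder might look like the sketch below. This is a simplification under assumptions of my own – real BCI systems fit the weights during training sessions like those described above, typically with regression or Kalman filtering, and the function and variable names here are mine:

```python
import numpy as np

def decode_velocity(firing_rates, weights, baseline):
    """Map a vector of neural firing rates to a commanded 3-D velocity.

    Each electrode channel contributes to the (x, y, z) velocity in
    proportion to how far its firing rate departs from its baseline.
    weights has shape (3, n_channels); the result is passed on to the
    robotic arm at each time step.
    """
    return weights @ (np.asarray(firing_rates) - np.asarray(baseline))

# Toy example: three channels, identity weights, baseline rate of 1.
velocity = decode_velocity([2.0, 1.0, 0.0], np.eye(3), [1.0, 1.0, 1.0])
# One channel above baseline drives +x, one below drives -z.
```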

Aside from the awesomeness of the technology, the use of neuroprostheses such as this raises a whole host of interesting philosophical and ethical questions. Particularly as the technology gets more sophisticated and more integrated, the distinction between the machinery being used and the person using it will become increasingly blurred. In the video, Jan already describes how she went from having to ‘think’ the commands (‘clockwise, up, down, forward, back…’) to merely having to ‘look at the target’ to effect accurate movement of the arm. This phenomenon is sometimes labelled ‘transparency of use’, where a tool serves a person’s goals without itself being an object of effortful control.

Armstrong the Good Giraffe and the Moral Value of Effort

Let me introduce you to Armstrong the Good Giraffe. Appearing in the news last week due to his goodness (and probably his giraffeness), Armstrong is a man in a costume who goes around voluntarily doing good deeds. Throwing himself into helpful tasks – such as providing free water and bananas to runners, picking up litter from beaches, and cleaning cages at cat and dog homes – Armstrong clocks up an impressive number of non-trivial good deeds. Most impressively of all, he reportedly enjoys it.

He comments that doing these good deeds makes him feel ‘happy’ and ‘cheery’ and that this is why he does them. At first glance, this may make us think he is particularly remarkable: he not only goes about investing more time and energy into being helpful than most would reasonably expect of a person, but he also relishes it. But, I want to ask, are people like Armstrong really at the top of the moral ranks? Is there not something about effort – about having to try – that we value?

Too much too young?

There has been outrage this week over a new sex education website aimed at young teenagers. Funded by an NHS West Midlands research fund, Respect Yourself has been developed by Warwickshire County Council in collaboration with NHS Warwickshire and Coventry University. The site hosts information about a whole range of topics relating to puberty, sex, bodies, relationships, STIs and contraception, presented in a ‘down-to-earth’ and sometimes humorous way. So why the outrage?