
algorithms

Oxford Uehiro Prize in Practical Ethics: What, if Anything, is Wrong About Algorithmic Administration?


This essay received an honourable mention in the undergraduate category.

Written by University of Oxford student, Angelo Ryu.

 

Introduction

 The scope of modern administration is vast. We expect the state to perform an ever-increasing number of tasks, including the provision of services and the regulation of economic activity. This requires the state to make a large number of decisions in a wide array of areas. Inevitably, the scale and complexity of such decisions stretch the capacity of good governance.

In response, policymakers have begun to implement systems capable of automated decision making. For example, certain jurisdictions within the United States use an automated system to advise on criminal sentences. Australia uses an automated system for parts of its welfare program.

Such systems, it is said, will help address the costs of modern administration. It is plausibly argued that automation will lead to quicker, more efficient, and more consistent decisions – that it will ward off a return to the days of Dickens’ Bleak House.

Cross Post: Biased Algorithms: Here’s a More Radical Approach to Creating Fairness


Written by Dr Tom Douglas


Our lives are increasingly affected by algorithms. People may be denied loans, jobs, insurance policies, or even parole on the basis of risk scores that they produce.

Yet algorithms are notoriously prone to biases. For example, algorithms used to assess the risk of criminal recidivism often have higher error rates for minority ethnic groups. As ProPublica found, the COMPAS algorithm – widely used to predict re-offending in the US criminal justice system – had a higher false positive rate for black people than for white people; black people were more likely to be wrongly predicted to re-offend.
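ProPublica’s finding is, at bottom, a comparison of error rates across groups: among people who did not go on to re-offend, how many were nonetheless flagged as high risk? As a rough illustration only – the field names and records below are hypothetical, not the COMPAS data – a minimal Python sketch of that check might look like this:

```python
# A minimal sketch of a group-wise false positive rate check.
# The record layout is hypothetical; this is not the COMPAS data.

from collections import defaultdict

def false_positive_rates(records):
    """For each group, compute FP / (FP + TN): the share of people who did
    not re-offend but were nevertheless predicted to be high risk."""
    fp = defaultdict(int)  # did not re-offend, but flagged as high risk
    tn = defaultdict(int)  # did not re-offend, and flagged as low risk
    for r in records:
        if not r["reoffended"]:  # only non-re-offenders enter the FPR
            if r["predicted_high_risk"]:
                fp[r["group"]] += 1
            else:
                tn[r["group"]] += 1
    return {g: fp[g] / (fp[g] + tn[g]) for g in set(fp) | set(tn)}

# Hypothetical usage with synthetic records:
# records = [
#     {"group": "A", "predicted_high_risk": True,  "reoffended": False},
#     {"group": "B", "predicted_high_risk": False, "reoffended": False},
# ]
# false_positive_rates(records)  # -> {"A": 1.0, "B": 0.0}
```

A gap between groups on this measure is what the ProPublica analysis pointed to: the errors the system makes are not spread evenly.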

Corrupt code.
Vintage Tone/Shutterstock


Scrabbling for Augmentation

By Stephen Rainey

 

Around a decade ago, Facebook users were widely playing a game called ‘Scrabulous’ with one another. It was, effectively, Scrabble in all but name, which led to a few legal issues.

Alongside Scrabulous, the popularity of Scrabble-assistance websites grew. Looking over the shoulders of work colleagues, you could often spy a Scrabulous window, as well as one for scrabblesolver.co.uk. The strange phenomenon of easy, online Scrabulous cheating seemed pervasive for a time.

The strangeness of this can hardly be overstated. Friends would routinely try to pretend to one another that they were superior wordsmiths, each deploying algorithmic anagram solvers. The ‘players’ themselves would do nothing but input data to the automatic solvers. As Charlie Brooker reported back in 2007,

“We’d rendered ourselves obsolete. It was 100% uncensored computer-on-computer action, with two meat puppets pulling the levers, fooling no one but themselves.”
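For what it’s worth, the ‘algorithmic anagram solvers’ in question do something quite simple: check which dictionary words can be built from the letters on a player’s rack. A minimal Python sketch, assuming a plain word-list file and a hypothetical rack, might look like this:

```python
# A minimal sketch of the kind of anagram solver those sites relied on.
# The word-list path and the rack below are hypothetical.

from collections import Counter

def playable_words(rack, wordlist):
    """Return dictionary words that can be spelled using only the rack's letters."""
    rack_counts = Counter(rack.lower())
    results = []
    for word in wordlist:
        w = word.strip().lower()
        # An empty Counter after subtraction means the rack covers every letter.
        if w and not (Counter(w) - rack_counts):
            results.append(w)
    # Longest words first, as a crude proxy for higher-scoring plays.
    return sorted(results, key=len, reverse=True)

# Hypothetical usage, assuming one word per line in the file:
# with open("words.txt") as f:
#     print(playable_words("aeinrst", f)[:10])
```

The point is how little the human contributes: type in the rack, copy out the answer.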

Back to the present, and online Scrabble appears to have lost its sheen (or lustre, patina, or polish). But in a possible near future, I wonder whether similar issues could arise.