
Open source censorship

The struggle against child porn goes on. An Australian judge has ruled that a cartoon showing a character from The Simpsons engaged in sexual activity is child pornography. Australia is also trying to implement Internet filtering for the whole population, although the project has run into serious opposition. Meanwhile, Wikipedia ended up 'censored' in the UK because of a page showing the controversial cover of an album.

The most interesting aspect of the Wikipedia debacle is that the decision that led to the censorship was not made by any government authority but by an industry-sponsored group, the Internet Watch Foundation. The IWF maintains a blacklist that certain ISPs subscribe to; users trying to reach a listed site are served a blank page. Is this censorship, and is it bad?
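The mechanism is easy to model in miniature. The sketch below is purely illustrative (the real IWF system, "Cleanfeed", is a more sophisticated two-stage proxy, and the list entries and helper names here are invented): an ISP-side filter simply checks each requested URL against the subscribed blacklist and serves an empty body for matches, with no error message or explanation.

```python
# Minimal sketch of URL-blacklist filtering: hypothetical entries,
# not the real IWF list; fetch() stands in for a real HTTP retrieval.

BLACKLIST = {"http://example.org/blocked-page"}  # invented entry

def fetch(url: str) -> str:
    """Stand-in for a real HTTP fetch."""
    return f"<html>content of {url}</html>"

def handle_request(url: str) -> str:
    """Return the body an ISP subscriber receives for a given URL."""
    if url in BLACKLIST:
        return ""          # blank page: no error code, no explanation
    return fetch(url)

print(repr(handle_request("http://example.org/blocked-page")))   # ''
print(repr(handle_request("http://example.org/innocent-page")))
```

Note what the blank page implies: from the user's side, a blocked page is indistinguishable from a broken one, which is part of why the Wikipedia block was only discovered indirectly.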

Censorship can be morally permissible, and possibly even an obligation, in cases such as dangerous or deeply personal information, where it prevents harm to society or individuals. Moral censorship is often more problematic, but any given society will likely have some taboos strong enough to convince the majority of people that censoring certain kinds of information is a good thing – in the West this is definitely true for child pornography. In the case of child pornography there are also strong reasons to prevent it from stimulating criminal sexual intent that would not otherwise exist, to protect the privacy of the victim, and to limit the abuse industry. These reasons may not be entirely watertight, especially with regard to the Simpsons cartoon (no victim, unlikely to stimulate criminal sexuality, and probably not produced for profit), but for the sake of argument we can assume that censorship of child pornography is relatively unproblematic.

However, much material exists that is deeply distasteful without being illegal or immoral. The album image in question has been around since the 1970s and appears on numerous sites (including Amazon.com) and in library books without any legal action. There are also legal images that are hard to distinguish from illegal ones. The IWF defends itself by arguing that the album cover is a "potentially illegal image": rather than wait until a court declares it unlawful, ISPs should block it to protect themselves from accusations of spreading it. But determining what is potentially illegal is a complex judgement that presupposes an understanding of both the law and the relevant social norms. Given that police forces appear uncertain about the content of the law, and the IWF itself appears inconsistent about the legal/illegal distinction, a blocking group would likely have to err widely on the side of caution.

Several countries have filtering lists maintained by the police. But other groups also act as online enforcers. Some maintain blacklists of domains that send spam or otherwise misbehave, and these lists are then used by some ISPs and networks to block email or access to the listed domains. The lists are controversial because of their sometimes arbitrary or biased contents, as well as the difficulty innocent parties face in getting delisted. The blacklist organisations defend themselves by arguing that they only provide the list, leaving the actual enforcement up to ISPs or institutions. However, UK ISPs are required by law to block content blacklisted by the IWF.
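The spam blacklists mentioned above are typically published as DNS-based blacklists (DNSBLs): a mail server checks whether a sending IP address is listed by reversing its octets and querying a name under the list operator's zone; any answer means "listed". The sketch below only constructs the query name (no live lookup), and the IP address is a reserved documentation address, not a real spammer:

```python
# How a DNSBL query name is formed (per the standard DNSBL convention):
# the IP's octets are reversed and prepended to the blacklist's DNS zone.
# An A-record answer to that name means the address is listed.

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNS name a mail server would look up for this IP."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

# 192.0.2.99 is a reserved documentation address (TEST-NET-1).
print(dnsbl_query_name("192.0.2.99"))  # -> 99.2.0.192.zen.spamhaus.org
```

The enforcement split the organisations point to is visible here: the list operator only answers DNS queries; whether a match causes mail to be rejected is a decision made by each subscribing server.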

The key problem with blacklists is that they tend to be blunt and unaccountable instruments. On one hand, illegal or undesirable content that has not been reported will not be stopped; on the other, they can easily block innocuous content. Blocking often hits every site on the same server as an objectionable page, and may thus do harm by implying that innocent sites are criminal. Software that semi-automatically or automatically adds pages to a blacklist can be built, but it makes mistakes, such as blocking access to recipes for turkey breast or to Amnesty International. More worryingly, there have been several instances of blocking sites critical of Internet filtering, or political sites. Nor is it only commercial companies that block their critics: the Finnish police are blocking access to an anti-censorship site.
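The "same server" collateral damage is easiest to see if one assumes blocking is done at the IP-address level rather than per URL, as cruder filters do. In the toy model below (all hostnames and addresses are invented, and the addresses are reserved documentation ranges), one objectionable site on a shared-hosting server takes every neighbouring site down with it:

```python
# Sketch of IP-level over-blocking: blocking one site's address
# blocks every site hosted at that address. All names are invented.

HOSTING = {                                  # shared server at one IP
    "objectionable.example":    "203.0.113.7",
    "parish-newsletter.example": "203.0.113.7",
    "turkey-recipes.example":    "203.0.113.7",
    "unrelated.example":         "198.51.100.9",  # different server
}

# The filter targets one site but records only its IP address.
BLOCKED_IPS = {HOSTING["objectionable.example"]}

def reachable(host: str) -> bool:
    """True if the host's server address is not on the blocklist."""
    return HOSTING[host] not in BLOCKED_IPS

for host in sorted(HOSTING):
    print(host, "reachable" if reachable(host) else "BLOCKED")
```

Running this shows the newsletter and the recipe site blocked alongside the intended target, while only the site on a separate server survives.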

The blacklists lack transparency and accountability: outsiders are not allowed to see them (ostensibly because they contain links to illegal material), the criteria for inclusion are usually secret, and the process by which the innocently blocked can correct an error is often cumbersome or arbitrary.

So even if one grants that child pornography should be censored, the kind of proactive censoring done using blacklists is overly broad and has bad side effects: innocent sites are censored, the method lends itself to abuse by various actors, and the lack of transparency and accountability prevents that abuse from being addressed. The strong moral case for the initial censorship tends to deflect criticism of the methods, enabling mission creep. Since anything that might be objectionable to any blacklisting party also runs the risk of being censored, there is a chilling effect on freedom of expression.

At the same time, blacklisting pages largely misses what ought to be the goal: to reduce harm and to punish wrongdoers. The majority of child pornography online is apparently spread through P2P networks these days, not the web (one of the many arguments against the Australian filtering scheme). Taking down illegal pages and punishing the parties responsible reduces harm and serves justice more effectively than merely blocking access. Blocking may protect the privacy of the child and perhaps avoid stimulating the wrong urges in the viewer, but it should only ever be a first step towards either permanent removal or a speedy, transparent acquittal as disgusting but legal.

In recent years various private or industry groups have acquired de facto law-enforcement powers in certain domains. Organisations such as the IWF, ECPAT and the music, movie and software industry associations have become intimately allied with law enforcement agencies. They are not just contributing expertise but also setting agendas, triggering investigations and supporting prosecutions. The conflict of interest of the content industry is obvious. The IWF is funded by the Internet industry, and might be biased towards actions that shield the industry from legal problems (e.g. accusations that it is soft on pornography) rather than towards serving the interests of the public.

Censorship is a dangerous power that must be handled carefully in any open society. Putting it under the control of an industry group with little transparency or accountability is a serious mistake. Either the IWF should only be allowed to blacklist content that has been found illegal, or it should be made part of government or a public body with appropriate oversight. If anything must be open to intense scrutiny, it is censorship.


1 Comment on this post

  1. The problem with ‘punishing the parties’ is that a website may not fall under the jurisdiction of a particular country. Australia can’t prosecute a guy putting up child porno on a server in Russia. That’s why blocking is used. But I agree that blocking is often arbitrary and can claim innocent sites as victims.
