Thursday, December 11, 2008

Was it right to censor a Wikipedia page?
By Struan Robertson, legal director, international law firm Pinsent Masons

Published: December 11 2008 16:37 | Last updated: December 11 2008 16:37

At the moment, most people cannot edit the page on Wikipedia that describes the Internet Watch Foundation. The “free encyclopaedia that anyone can edit” is wisely restricting who can edit that page following an outbreak of what it calls “vandalism”. Right now, the IWF is very unpopular.

The IWF exists to minimise the availability of indecent images of children on the internet. But its existence is being challenged over its handling of a recent complaint. It was about an image that appeared on another Wikipedia page, concerning an album by German rock band Scorpions. The image featured a naked young girl. It was the original sleeve design for the band’s 1976 album “Virgin Killer”.

The IWF deemed the image potentially illegal and added the page on which it appeared to a blacklist that is enforced by all of the UK’s major internet service providers (ISPs). That action prevented most UK internet users from accessing the Wikipedia page and, because of the way the block was implemented and the way Wikipedia polices abuse, it had the unintended side-effect of stopping those users from editing any of the millions of Wikipedia pages. The ISPs diverted all their Wikipedia traffic through a small number of filtering proxies, so millions of UK readers appeared to Wikipedia to share a handful of IP addresses; Wikipedia’s anti-vandalism controls, which work partly on IP addresses, then blocked edits from those addresses.
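To make the mechanics concrete, here is a minimal sketch, in Python and using entirely hypothetical names and URLs, of how URL-level filtering of this kind is commonly implemented and why it produces that side-effect: only traffic for domains that carry a blacklisted URL is diverted through a shared proxy, and the destination site then sees the proxy’s address rather than the customer’s.

```python
# Minimal sketch (hypothetical names and URLs throughout) of URL-level filtering
# at an ISP. Requests for a domain that has any blacklisted URL are diverted
# through a shared proxy; only exact blacklisted URLs are blocked. Because the
# proxy's own address is what the destination site sees, every filtered customer
# appears to come from a handful of IPs -- the side-effect that tripped
# Wikipedia's IP-based anti-vandalism tools.

from urllib.parse import urlsplit

BLACKLISTED_URLS = {
    "http://example.org/wiki/Blocked_Page",   # placeholder entry, not the real list
}
# Domains with at least one blacklisted URL; all their traffic goes via the proxy.
WATCHED_DOMAINS = {urlsplit(u).hostname for u in BLACKLISTED_URLS}


def route_request(url: str) -> str:
    """Decide how the ISP handles a single request."""
    host = urlsplit(url).hostname
    if host not in WATCHED_DOMAINS:
        return "direct"      # untouched; destination sees the customer's own IP
    if url in BLACKLISTED_URLS:
        return "blocked"     # served a generic error page
    return "via-proxy"       # allowed, but destination sees the proxy's IP


if __name__ == "__main__":
    for u in ("http://example.org/wiki/Blocked_Page",
              "http://example.org/wiki/Any_Other_Page",
              "http://unrelated.example.com/"):
        print(u, "->", route_request(u))
```

The sketch is illustrative only; the exact filtering technology varies between ISPs, but the shared-proxy effect on the destination site is the same.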

The web community erupted in fury. Comments on blogs were overwhelmingly anti-IWF. Fairly quickly, the IWF gave in to some of the criticism. It removed the image from its blacklist, though, after consulting senior police officers, it reiterated its view that the image is potentially illegal.

In my view, the IWF was right to put the image on its blacklist. It was wrong to remove it from the list.

Many people were alarmed to learn this week that there are restrictions on their freedom to surf the internet. Wikimedia, the non-profit operator of Wikipedia, said that this was the first time its site had been censored in the UK. It noted that it had been censored at various times in China, Syria and Iran.

Like it or not, censorship exists in the UK. The right to freedom of expression in the country is a qualified right. The European Convention on Human Rights provides that it can be restricted by laws that protect morals or the reputation or rights of others, or that prevent the disclosure of information received in confidence. Laws on child protection, defamation, intellectual property and confidence all contain a right to suppress online material – which falls within Wikipedia’s own definition of censorship.

Other critics dislike the fact that it is the IWF that decides what to censor. They are right that it should not hold that power if it performs the duty badly. But do not judge it on the basis of a single decision: it assessed 35,000 complaints in the course of last year alone (and in two-thirds of cases the image was deemed lawful).

The blacklist is kept secret for obvious reasons, but internet users who try to visit the pages on it will be oblivious to the censorship – and I think this is a mistake. ISPs present an error message that does not disclose the censorship. While the IWF can’t control that message, it could encourage transparency.

There is another problem with the IWF’s model: it bans pages, not the images themselves. It says this approach is simpler and more effective, though I confess that I don’t understand why. Still, if that policy is disproportionate it is only slightly so: it did not blacklist an entire site.

The IWF was set up by the UK’s internet industry. It began as a “notice and takedown” body for images of child abuse that are hosted in the UK, giving the public a hotline for reporting illegal images. Web hosts do not know what images are on their customers’ sites and do not need to, provided they react quickly when alerted to potentially illegal content. Only a court can officially declare an image illegal but hosts cannot afford to await that declaration – otherwise it may come at their own trial on charges that carry a maximum 10-year sentence. And they don’t want to give their own staff the job of receiving complaints with images of child abuse attached, so they outsourced the bulk of the work of receiving and assessing complaints – and the IWF was born.

When child abuse images are hosted in the UK, the IWF can identify and notify the host, and the host removes them. Its work has dramatically cut the volume of illegal images hosted in the UK.

The IWF’s more controversial operation is the maintenance of its blacklist. When images are found to be hosted outside the UK, the IWF cannot ensure they are taken down. Instead it reports the image to equivalent bodies and law enforcement in the hosting country and adds the URL to its blacklist, a list it updates twice a day. That list is followed by most ISPs in the UK, mainly to prevent their customers stumbling upon images that are illegal even to view.
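Operationally, consuming such a list is straightforward. The sketch below is hypothetical in every detail – the real IWF list, its format and its distribution mechanism are not public – but it shows the shape of an ISP-side job that refreshes its block set on the twice-daily schedule described above.

```python
# Hypothetical sketch of an ISP-side job that periodically refreshes a URL
# blacklist. The real IWF list, its format and its endpoint are not public;
# the names and URL below are illustrative only.

import time
import urllib.request

BLACKLIST_FEED = "https://filtering.example.net/iwf-style-list.txt"  # placeholder
REFRESH_SECONDS = 12 * 60 * 60  # the list is described as updated twice a day


def fetch_blacklist(feed_url: str) -> set[str]:
    """Download the feed and return one URL per non-empty line."""
    with urllib.request.urlopen(feed_url, timeout=30) as resp:
        text = resp.read().decode("utf-8")
    return {line.strip() for line in text.splitlines() if line.strip()}


def run_refresh_loop() -> None:
    current: set[str] = set()
    while True:
        try:
            current = fetch_blacklist(BLACKLIST_FEED)
            print(f"loaded {len(current)} blacklisted URLs")
            # In a real deployment the filtering proxies would be reconfigured here.
        except OSError as err:
            print(f"refresh failed, keeping previous list: {err}")
        time.sleep(REFRESH_SECONDS)
```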

Many people don’t like ISPs using this blacklist to censor their surfing. But ISPs do it in the knowledge that if they don’t, the government will intervene. Vernon Coaker, Minister of State for policing, security and crime, said so. He set a target for the end of 2007 for all ISPs to put in place technical measures “that prevent their customers accessing websites containing illegal images of child abuse identified by the IWF”.

The IWF does not lobby for legal reform – it just interprets the legislation and court rulings that exist. It has a small team of analysts who train with the police and are experts in assessing content in line with those laws. The government trusts it to do this job.

Other industries have their own self-regulatory bodies staffed by experts in the field. The Advertising Standards Authority can ban adverts from TV without a court ruling. Spamhaus blacklists spammers to protect e-mail inboxes. Such bodies are accountable to their industries and the IWF is no different. If it fails in its duty, ISPs can kill it – and either replace it themselves or let the government replace it for them.

Some people have argued that the Scorpions’ image is harmless. My advice: treat it as illegal, because the IWF’s opinion is likely to be, in legal terms, better informed – and therefore more influential in the mind of a judge. So don’t go looking for it. It was blacklisted because it fails a test of the Protection of Children Act 1978 (a law that did not exist at the time of the album’s release). That test says that an erotically posed photograph of an under-18 will constitute an indecent, and therefore illegal, image. It will not censor Michelangelo’s David or a cartoon, because it is limited to photographs and pseudo-photographs. It is unlikely to censor the album cover of Nirvana’s Nevermind, on which a naked baby swims towards a dollar bill, because courts have never interpreted such images as erotically posed.

The IWF deemed the Scorpions’ image sexually provocative. After Wikimedia complained, the IWF invoked its appeals procedure and upheld its original ruling. But it then decided to remove the page from its blacklist “in light of the length of time the image has existed and its wide availability”. Further reports of the image being hosted abroad will not be added to its blacklist, though if the image is found to be hosted in the UK, it “will be assessed in line with IWF procedures”.

“IWF’s overriding objective is to minimise the availability of indecent images of children on the internet, however, on this occasion our efforts have had the opposite effect,” it said.

I think the IWF was right to blacklist the image after deeming it potentially illegal. But its decision to remove the image from the blacklist sends a dangerous message. If an image is illegal, should its age or wide dissemination excuse repeated publication? The image cannot be erased from every corner of the internet, and prosecutors will not take action against everyone who owns a copy on an album sleeve or a hard drive. But the IWF’s job is to minimise exposure to illegal images – not to eliminate it. Under enormous pressure, I think it lost sight of that distinction.

Censorship takes place in the UK every day, for legal, moral and commercial reasons. When Wikipedia blocks those who vandalise its pages or deletes their hateful comments, it too engages in censorship. Internet companies engage in censorship because they have to – and they outsource part of that burden to the IWF. This incident has focused attention on more than a 1970s album cover: clearly some people dislike our laws, our industry’s preference for self-regulation and/or the operation of the IWF, though they have failed to offer a credible alternative that the industry and government would support.

Balancing our freedom of expression with the protection of children is difficult and important. It is a healthy issue to debate. But like any Wikipedia article, that debate needs some balance. In this case, that balance was missing.

Copyright The Financial Times Limited 2008
