The Home Office launched a new Directgov site last week, which "provides members of the public with information about what they can do if they come across violent extremist, terrorist and hate content online" (press release). The site takes reports and forwards them to a specialist unit within the Association of Chief Police Officers (ACPO), which will take action if the material is illegal. Unsurprisingly there has been a good deal of media coverage (e.g. The Register | The Inquirer | BBC News). So far, though, there doesn't seem to have been any assessment of how this fits into the broader matrix of internet regulation in the UK. This post asks what effect it might have.
Reducing the role of the IWF?
One of the more significant aspects of this story is that it appears to be the first time that the UK government has set up a specific site to which internet content can be reported. Until now, the government has effectively devolved that function to the Internet Watch Foundation (IWF). Although this is a private body, official policy has been to designate the IWF as the first port of call for reports of illegal online content. The Surrey Police website is typical:
If you come across offensive or illegal material, please DO NOT contact Surrey Police directly.
Instead, you can make a report on the Internet Watch Foundation (IWF) web site.
If they decide any action is needed, they will contact the ISP or the police, who can take appropriate action. (It's worth remembering that evidence of illegal or offensive material can be detected even after it has been deleted from a computer.)
The Internet Watch Foundation are qualified to judge the illegality of material and will report matters to the relevant police force. They are the only authorised organisation in the UK that provides an Internet hotline for the public to report their exposure to illegal content online.

Despite this, however, the IWF has never had a remit to receive complaints in relation to all illegal material online. For example, while there have been proposals from the Home Office that the IWF's remit should be extended to cover extremist websites, these have never come to fruition. Similarly, when the Terrorism Act 2006 created a system of notifying ISPs to take down terrorist material, that system bypassed the IWF entirely and required that notices be given via the police.
Consequently, the setting up of this site may be significant - does it indicate a trend away from government reliance on the IWF and towards the use of separate (and public) reporting mechanisms?
Content control as a means of protecting vulnerable people?
The rhetoric used in announcing the site is also interesting. According to Lord West:
We want to protect people who may be vulnerable to violent extremist content and will seek to remove any unlawful material.

If this sounds familiar, that's because it echoes the justifications for introducing the Cleanfeed child abuse image blocking system and later for criminalising extreme pornography - in each case, a central component was the argument that the material harms the viewer, whether by the mere act of viewing it or by predisposing the viewer to commit crimes. Is this approach - focusing on harm to the viewer - becoming more common in controlling content in the UK?
Using consumer pressure as a regulatory tool?
Quite apart from illegal content, the site also sets out to encourage users to challenge content which is legal. According to Lord West:
This is also about empowering individuals to tell them how they can make a civic challenge against material that they find offensive, even if it is not illegal.

Consequently, the site provides information on how to make complaints:
The internet is not a lawless forum and should reflect the legal and accepted boundaries of society.
What you can do about online hate or violence that is not illegal

Most hateful or violent website content is not illegal. While you may come across a lot of things on the internet that offend you, very little of it is actually illegal.

UK laws are written to make sure that people can speak, and write, freely without being sent to prison for their views.

To be illegal, the content must match the descriptions at the top of this page.

Still, even if what you’ve seen does not seem to be illegal, you can take the steps below to have it removed if it upsets, scares or offends you.

Report it to the website administrator

Most websites have rules known as ‘acceptable use policies’ that set out what cannot be put on their website. Most do not allow comments, videos and photos that offend or hurt people...

If what you’ve seen is on a site with a good complaints system, you should report it to the website’s owners. Look out for their ‘contact us’ page, which should be clearly linked...

Report it to the hosting company

If the website itself is hateful or supports violence or terrorism, let the website’s hosting company know. Hosting companies provide a place where the website sits, and often have rules about what they are willing to host.

Let the hosting company know they are hosting a website that breaks their rules, and ask them to stop.

You can find out which company hosts a website by entering their web address on the ‘Who is hosting this?’ website.

This approach - encouraging community pressure to force ISPs to change their behaviour - matches policy in relation to blocking, where the Home Office has abandoned plans to legislate and has instead stated its intention to rely on public pressure:

For the first time the IWF will publish the list of ISPs who are certified as having implemented its blacklist. "Hopefully consumer and public pressure will encourage the ISPs who aren't on the list to comply," said Carr. A Home Office spokesman said: "We will continue to urge ISPs to implement blocking, and ask consumers to check with their suppliers that they have done so."

Does this mark the start of a trend towards greater use of consumer pressure by the UK government as a means of regulating what ISPs do?
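As a technical footnote: the "report it to the hosting company" advice quoted above turns on being able to identify who hosts a site, which is essentially a WHOIS lookup. A minimal sketch in Python of how such a lookup works under the hood, querying a WHOIS server directly over TCP port 43 as standardised in RFC 3912 (the helper names and the choice of whois.iana.org as a starting server are illustrative assumptions on my part, not anything specified by the Directgov site):

```python
import socket
from urllib.parse import urlparse


def domain_from_url(url: str) -> str:
    """Extract the bare domain name from a web address."""
    # urlparse needs a scheme (or at least '//') to recognise the host part.
    parsed = urlparse(url if "//" in url else "//" + url)
    return parsed.hostname or ""


def whois_lookup(domain: str, server: str = "whois.iana.org",
                 timeout: float = 10.0) -> str:
    """Query a WHOIS server over TCP port 43 (RFC 3912).

    whois.iana.org is used here as a generic starting point; a real
    client would follow the 'refer:' field in its response to the
    registry responsible for the domain's TLD.
    """
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(domain.encode("idna") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

The response typically includes the registrar and name-server details from which the hosting provider can be worked out - which is all that services like the 'Who is hosting this?' site quoted above are doing on the user's behalf.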