Facebook Accused Of Publishing Child Pornography
Facebook is at risk of a criminal prosecution in Britain for refusing to remove potentially illegal terrorist and child pornography content despite being told it was on the site.
The social media company failed to take down dozens of images and videos that were “flagged” to its moderators, including one showing an Islamic State beheading, several violent paedophilic cartoons, a video of an apparent sexual assault on a child and propaganda posters glorifying recent terrorist attacks in London and Egypt. Instead of removing the content, moderators said that the posts did not breach the site’s “community standards”.
Facebook’s algorithms even promoted some of the offensive material by suggesting that users join groups and profiles that had published it.
A leading QC who reviewed the content said that, in his view, much of it was illegal under British law. Facebook was at risk of committing a criminal offence because it had been made aware of the illegal images and had failed to take them down, he said.
The world’s biggest social network and publisher made a profit of $10 billion last year by selling advertising targeted at its almost two billion monthly users. Its technology encourages members to expand their friendship networks while offering them a “bespoke” experience based on their interests.
The company has been criticised for allowing jihadists, criminals and paedophiles to thrive on the site, in part encouraged by software that permits them to discover “friends” and groups with similar proclivities.
Recently The Times created a fake profile on Facebook to investigate extremist content. It did not take long to come across dozens of objectionable images posted by a mix of jihadists and those with a sexual interest in children.
“In my view, many of the images and videos identified by The Times are illegal,” Julian Knowles, QC, said. “One video appears to depict a sexual assault on a child. That would undoubtedly breach UK indecency laws. The video showing a beheading is very likely to be a publication that encourages terrorism.
“I would argue that the actions of people employed by Facebook to keep up or remove reported posts should be regarded as the actions of Facebook as a corporate entity. If someone reports an illegal image to Facebook and a senior moderator signs off on keeping it up, Facebook is at risk of committing a criminal offence because the company might be regarded as assisting or encouraging its publication and distribution.”
Posing as an IT professional in his thirties, a Times reporter befriended more than 100 supporters of Isis while also joining groups promoting lewd or pornographic images of children.
Although Facebook removed some of the images, moderators kept online pro-jihadist posts including one praising Isis attacks “from London to Chechnya to Russia and now Bangladesh in less than 48 hours” and promising to bring war “in the heart of your homes”. They also refused to remove an official news bulletin posted by Isis praising the slaughter of 91 “Christian warriors” in the recent terrorist attacks against two churches in Egypt.
Only after being contacted by The Times did Facebook remove a number of the offensive cartoons. The moderators, who are based in Dublin, California, Texas and India, previously kept up a video showing the gruesome beheading of Isis hostages.
Facebook said that the video did not contravene its rules against graphic violence, despite it showing a British jihadist with his face covered, holding a knife and standing over a severed head. “The spark has been lit here in Iraq,” the jihadist said. “Here we are burying the first American crusader.”
Facebook also failed to remove dozens of pornographic cartoons depicting child abuse. Several of the cartoons are likely to be illegal under a 2009 law, yet were freely available on the site. Intermingled with the cartoons, posted on forums with titles such as Raep Me, are pictures of real children, including several likely to be illegal.
One video that was kept up by Facebook appears to show a young child being violently abused. The Times has informed the Metropolitan Police, which co-ordinates counterterrorism investigations, and the National Crime Agency (NCA) about its findings. It will hand over evidence to the NCA this week. A spokesman for the agency said that it would assess any material passed to it relating to child sexual abuse.
A Met spokesman did not say whether Facebook would itself be investigated. Yvette Cooper, chairwoman of the home affairs select committee, said: “Social media companies need to get their act together fast; this has been going on for too long. It’s time the government looked seriously at the German proposal to invoke fines if illegal and dangerous content isn’t swiftly removed.”
Last month Robert Buckland, the solicitor-general, said that social media companies might be breaking British law if they were “reckless” in allowing terrorist material to remain online. Under the Terrorism Act 2006 it is an offence to disseminate terrorist material either intentionally or recklessly.
In the days before The Times contacted Facebook for comment, a number of jihadist videos that had been approved by the site’s moderators were no longer available to view. It is not known why this was the case. The majority of pornographic cartoons remained live until Facebook removed them after the newspaper’s approach.
Justin Osofsky, Facebook’s vice-president of global operations, said: “We are grateful to The Times for bringing this content to our attention. We have removed all of these images, which violate our policies and have no place on Facebook. We are sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.”