British Government Is Planning Internet Regulation
The UK government is preparing to establish a new internet regulator that would make tech firms liable for content published on their platforms and have the power to sanction companies that fail to take down illegal material and hate speech within hours.
Under legislation being drafted by the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) due to be announced this winter, a new regulatory framework for online “social harms” would be created.
BuzzFeed News has obtained details of the proposals, which would see the establishment of an internet regulator similar to Ofcom, which regulates broadcasters, telecoms, and postal communications. Home secretary Sajid Javid and culture secretary Jeremy Wright are considering the introduction of a mandatory code of practice for social media platforms and strict new rules such as "takedown times" forcing websites to remove illegal hate speech within a set timeframe or face penalties.
Ministers are also looking at implementing age verification for users of Facebook, Twitter, and Instagram. A promise to regulate the Internet was buried at the back of the Conservative manifesto for last year's general election. "Some people say that it is not for government to regulate when it comes to technology and the Internet," the manifesto stated. "We disagree."
The new proposals are still in the development stage and are due to be put out for consultation later this year. A spokesperson for the government confirmed it is "considering all options", including a regulator.
The planned regulator would have powers to impose punitive sanctions on social media platforms that fail to remove terrorist content, child abuse images, or hate speech, and to enforce new regulations on non-illegal content and behaviour online.
The rules for what constitutes non-illegal content will be the subject of what is likely to be a hotly debated consultation. The regulator would be ultimately accountable to parliament.
Ministers are also thought to be looking at creating a second new regulator for online advertising. Its powers would include a crackdown on online advertisements for food and soft drink products that are high in salt, fat, or sugar.
Currently, online advertisements are regulated by the Advertising Standards Authority. Government sources have indicated that frustration at the tech industry's failure to take voluntary action on online safety has led ministers to pursue a mandatory approach.
When the previous culture secretary, Matt Hancock, invited 14 tech companies in for talks on online safety earlier this year, only four firms turned up. Ministers have concluded that the voluntary approach only achieved progress with a few tech giants over specific issues such as terrorist content, and that new laws are required to force smaller and medium-sized social media platforms to take action against a wider range of content.
This will involve producing a single legal framework for internet safety and increasing the legal liability for sites that provide a platform for illegal content.
Social media companies will be forced to sign up to a code of practice and new requirements to assist the police in investigating criminal activity online.
The government is looking at legislation passed in Germany last year requiring social media platforms to remove illegal hate speech within 24 hours or face fines of up to 50 million euros.
The German law was vociferously opposed by human rights groups and industry representatives who warned it would lead to censorship and an unmanageable burden on smaller websites. It encountered problems in its first few months when a satirical magazine and a political street artist had their content blocked.
The government is set to introduce age verification for social media platforms, after ministers raised concerns that children are currently only required to tick a box saying they are over the age of 13. DCMS has previously indicated that it would seek to impose mandatory transparency reports on social media platforms and implement the recommendations of the Law Commission review into online communications. It has also introduced legislation to block pornography sites that refuse to use age verification controls.
The proposal for a new regulator of online social harms will raise significant questions about the government’s ability to make tech firms based outside the UK liable for content that users post on their platforms, how hate speech will be sanctioned, and how to determine what non-illegal content merits state-backed oversight. Concerns have been raised in Whitehall that the regulation of non-illegal content will spark opposition from free speech campaigners and MPs.
There are also fears internally that some of the measures being considered, including blocking websites that do not adhere to the new regulations, are so draconian that they will generate considerable opposition.
Recently, the head of Ofcom, Sharon White, called for tech companies to be regulated in the same way as the mobile phone and broadband industries.
"The boundaries between broadcasting and the online world are being redrawn. This has implications for the public’s understanding of what protections apply online, versus traditional media," White said, arguing that "certain principles from broadcasting regulation could be relevant as policymakers consider issues around online protection".
But doubts have long been raised inside the tech industry as to how state-backed regulation of social media would work in practice, the resources it would require to succeed, and whether it is even possible to impose a legal framework on firms headquartered in the US and elsewhere.
A government spokesperson confirmed that the plans would be unveiled later this year. "This winter we will publish a White Paper, setting out new laws to tackle the full range of online harms and set clear responsibilities for tech companies to keep UK citizens safe," they said. "We are considering all options, including what legislation will be necessary and whether a regulator is needed."