Zuckerberg Has Failed
Uploaded on 2019-02-25
Facebook needs far stricter regulation, with tough and urgent action necessary to end the spread of disinformation on its platform, British MPs have said.
A Commons committee has concluded that the firm's founder Mark Zuckerberg failed to show "leadership or personal responsibility" over fake news. Untrue stories from foreign powers were risking the UK's democracy, they said.
Facebook welcomed the digital select committee's report and said it would be open to "meaningful regulation".
MPs said that what was needed to deal with the proliferation of disinformation online and the misuse of personal data was a "radical shift in the balance of power between social media platforms and the people".
The inquiry into fake news, which lasted more than a year, was conducted by the Digital, Culture, Media and Sport Committee, with much of the evidence focusing on the business practices of Facebook before and after the Cambridge Analytica scandal.
Cambridge Analytica was a political advertising firm that had access to the data of millions of users, some of which was allegedly used to psychologically profile US voters. The data was acquired via a personality quiz. How Facebook shared such data, particularly for political campaigning, was at the heart of the inquiry, alongside the effects of fake news.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day," concluded the report.
"The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights."
The report called for:
- a compulsory code of ethics for tech companies, overseen by an independent regulator
- the regulator to be given powers to launch legal action if companies breach the code
- the government to reform current electoral laws and rules on overseas involvement in UK elections
- social media companies to be forced to take down known sources of harmful content, including proven sources of disinformation
- tech companies operating in the UK to be taxed to help fund the work of the Information Commissioner's Office and of any new regulator set up to oversee them.
In response, Facebook said: "We share the committee's concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.
"We are open to meaningful regulation and support the committee's recommendation for electoral law reform. But we're not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do."
MPs made no secret of the fact that they found it difficult dealing with Facebook during the inquiry and chair Damian Collins had strong words for the firm and its leader, Mr Zuckerberg.
"We believe that in its evidence to the committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at time misleading answers to our questions," he said.
"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move fast and break things' culture seems to be that it is better to apologise than ask permission."
MPs were particularly angry that Mr Zuckerberg did not come to the UK to answer questions in person.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to billions of Facebook users across the world," said Mr Collins.
"Evidence uncovered by my committee shows he still has questions to answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information."
He also accused Facebook of "bullying" smaller tech firms and developers who rely on its platform to reach users.
The committee did not list specific examples of fake news. But it pointed to the government response to its interim report, which found at least 38 false narratives online after the nerve agent attack in Salisbury in March 2018.
The report also noted that disinformation was not spread only on Facebook but also on platforms such as Twitter. It also found that, in the month following the publication of its interim report, 63% of visits to the government's online response came from foreign internet protocol (IP) addresses, more than half of which were from Russia, which is highly unusual for a UK-based political inquiry.
MPs said current electoral regulations were "hopelessly out of date for the internet age" and needed urgent reform, so that the same principles of transparency of political communications that operate in the real world were applied online too.
The committee called on the government to reveal how many investigations are currently being carried out into Russian interference in UK politics, particularly into the 2016 EU referendum, and asked it to launch an independent investigation into that interference.
In order to better regulate social media firms, the MPs suggested creating a new category of tech firm - one that was neither a platform nor a publisher but something in-between, which would tighten the legal liability for content identified as harmful.
Fact-Checkers
Pressure is mounting on the tech giants to get to grips with the issue of fake news, and the report will add to wider calls for regulation of harmful content. In her recent review, Dame Frances Cairncross said such sites should help users identify fake news and "nudge people towards news of high quality".
Facebook has repeatedly said it is committed to fighting fake news and works with more than 30 fact-checking organisations around the world.
Two of those agencies, Associated Press and Snopes, recently quit working with the social network. The ease with which fake news can be created was illustrated recently by a team of researchers at OpenAI, who showed a machine-learning system producing coherent but untrue articles after being trained on text gathered from links shared on Reddit.
BBC