Leaked Facebook Documents Reveal Discriminatory Practices
Facebook is facing a series of accusations, first published by the Wall Street Journal (WSJ), about its internal workings. Some of these revelations come from Facebook's own internal documents, suggesting the company now has whistle-blowers in its ranks. This week Frances Haugen, a data scientist and former Facebook product manager, heavily criticised her ex-employer at a Congressional hearing in Washington.
At the same time, new information has emerged about Facebook's 'cross-check' system, known internally as XCheck, which the company uses to review content decisions relating to certain high-profile users. According to the WSJ, many celebrities, politicians and other prominent users were subject to different rules governing what content they could post.
Facebook has admitted that criticism of the way it implemented its cross-check system was "fair", but said the system was designed to create "an additional step" when posted content required more understanding. "This could include activists raising awareness of instances of violence or journalists reporting from conflict zones," it said. It also said that many of the documents referred to by the Wall Street Journal contained "outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross-check and has been working to address them".
Facebook's own Oversight Board, which it appointed to make decisions on tricky content moderation, has demanded more transparency.
Since it began its work looking into how Facebook moderates content, the Facebook-funded Oversight Board has made 70 recommendations on how the company should improve its policies, and it has now set up a team to assess how the social network implements them. The documents reported by the WSJ also suggest that Facebook employees regularly flagged information about drug cartels and human traffickers on the platform, but that the company's response was "weak".
In November 2019, the BBC highlighted the issue of domestic workers being offered for sale on Instagram in Arabic-speaking countries. According to internal documents, Facebook was already well aware of the problem, yet the WSJ reported that it took only limited action until Apple threatened to remove Facebook's apps from its App Store. In its defence, Facebook said it had a "comprehensive strategy" to keep people safe, including "global teams with native speakers covering over 50 languages, educational resources and partnerships with local experts and third-party fact-checkers".
Critics warn that Facebook does not have the means to moderate all the content on its platform and protect its 2.8 billion users.
Another significant revelation from Frances Haugen's testimony was that the company had conducted detailed research into how Instagram affects teenagers, but did not share its findings even when they suggested that the platform was a "toxic" place for many youngsters. According to the WSJ, 32% of teenage girls surveyed said that when they felt bad about their bodies, Instagram made them feel worse.
The fact that Facebook has failed to share its own detailed studies on the harm its platforms cause will surely give US politicians plenty of information to consider as they deal with calls to regulate Facebook and other social media platforms.
Sources: Facebook Oversight Board | BBC | Wall Street Journal | Wall Street Journal | YouTube