Facebook Failed - Can Meta Help It Recover?
Facebook is rebranding itself with the creation of a holding company named Meta. The company says that this will allow it to better "encompass" everything it does, as it broadens its reach beyond social media into Virtual Reality (VR). The change does not apply to its individual platforms, such as Facebook, Instagram and WhatsApp, only to the parent company that owns them.
The move follows a series of negative stories about Facebook, based on documents leaked by an ex-employee, Frances Haugen, who told both the US Congress and the British Parliament that Facebook's platforms "harm children, stoke division and harm our democracy".
Internal documents released by Haugen show that the company has been routinely placing public-relations, profit, and regulatory concerns over user welfare. The documents were part of legally required disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form.
The extent of Facebook's problems began to become evident in 2019, when the BBC revealed a broad network that illegally trafficked domestic workers, facilitated by internet platforms. Some of the trade was carried out on Facebook-owned Instagram, where posts were promoted via algorithm-boosted hashtags and sales were negotiated via private messages.
Facebook launched a major effort to combat the use of its platforms for human trafficking. Working around the clock, its employees searched Facebook and its subsidiary Instagram for keywords and hashtags that promoted domestic slavery in the Middle East and elsewhere. The problem was not new to the company: a group of Facebook researchers focused on the Middle East and North Africa had found numerous Instagram profiles being used as advertisements for trafficked domestic servants as early as March 2018.
Despite this effort, Apple threatened to remove Facebook and Instagram from its App Store in October 2019 because of the BBC report. Motivated by what employees described in an internal document as “potentially severe consequences to the business” that would result from an App Store ban, Facebook moved fast, taking down 129,191 pieces of content, disabling more than 1,000 accounts and tightening its policies to detect this kind of behaviour.
The documents highlighted by Haugen make it clear that the decision to act was not the result of new information: “Was this issue known to Facebook before BBC inquiry and Apple escalation? Yes.” Another former Facebook employee has told US authorities the company's efforts to remove child abuse material from the platform were "inadequate" and "under-resourced".
Facebook had taken action, but the damage was done: the full extent of its problems had begun to come into focus.
Over the course of 2020 and into 2021, more and more information came to light in the US about how Facebook spread misinformation, hate speech and political polarisation. Key events included the algorithmic propagation of conspiracy theories about coronavirus vaccines and Facebook's instrumental role in enabling an extremist mob to attack the US Capitol.
According to Haugen's documents, Facebook was aware that its products were being used to facilitate hate speech in the Middle East, violent cartels in Mexico, ethnic cleansing in Ethiopia, extremist anti-Muslim rhetoric in India, and sex trafficking in Dubai. It was also aware that its efforts to combat these problems were insufficient. An internal report dated March 2021 reads, “We frequently observe highly coordinated, intentional activity … by problematic actors” that is “particularly prevalent, and problematic, in At-Risk Countries and Contexts”; the report later acknowledges, “Current mitigation strategies are not enough.”
In some cases, employees have successfully taken steps to address these problems, but in many others the company's response has been slow and incomplete. As recently as late 2020, an internal Facebook report found that only 6 percent of Arabic-language hate content on Instagram was detected by Facebook’s systems. Another report that circulated last winter found that, of material posted in Afghanistan that was classified as hate speech within a 30-day range, only 0.23 percent was taken down automatically by Facebook’s tools. In both instances, employees blamed company leadership for insufficient investment.
In many of the world’s most fragile nations, a company worth hundreds of billions of dollars hasn’t invested enough in the language- and dialect-specific artificial intelligence and staffing it needs to address these problems. Indeed, last year, according to the documents, only 13 percent of Facebook’s misinformation-moderation staff hours were devoted to the non-US countries in which it operates, whose populations comprise more than 90 percent of Facebook’s users.
Although Facebook users post in at least 160 languages, the company has built robust AI detection in only a fraction of them, mainly the languages spoken in large, high-profile markets such as the US and Europe. That choice, the documents reveal, means problematic content is seldom detected.
In February, just over a year after Facebook’s sweep for Middle Eastern and North African domestic-servant trafficking, another internal report identified a web of similar activity, in which women were being trafficked from the Philippines to the Persian Gulf, where they were locked in their homes, denied pay, starved, and abused. This report found that content “should have been detected” for violating Facebook’s policies but had not been, because the mechanism that would have detected much of it had recently been made inactive. The title of the memo is “Domestic Servitude: This Shouldn’t Happen on FB and How We Can Fix It.”
What happened in the Philippines, and in Honduras, and Azerbaijan, and India, and Bolivia, wasn’t just that a very large company lacked a handle on the content posted to its platform. It was that, in many cases, a very large company knew what was happening and failed to meaningfully intervene.
That Facebook has repeatedly prioritised solving problems for Facebook over solving problems for users should not be surprising. Facebook is doing what large public companies do, acting in its own self-interest to preserve shareholder value. The corporate rebranding is clearly a response to these events and the growing threat of regulation, but a new focus on VR will not fix the problems that Meta has with its social media platforms.
Sources: Reuters, DefenseOne, The Atlantic, BBC