FILE PHOTO: Former Facebook employee and whistleblower Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ on Capitol Hill, in Washington, U.S., October 5, 2021. Matt McClain/Pool via REUTERS/File Photo
October 25, 2021
LONDON (Reuters) – Facebook will fuel more episodes of violent unrest around the world because of the way its algorithms are designed to promote divisive content, whistleblower Frances Haugen told the British parliament on Monday.
Haugen, a former product manager on Facebook’s civic misinformation team, was appearing before a parliamentary select committee in Britain that is examining plans to regulate social media companies.
She said the social network treated safety as a cost centre and lionised a start-up culture in which cutting corners was seen as a good thing, adding that it was “unquestionably” making hate worse.
“The events we’re seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content, and two, it concentrates it,” she said.
She said the algorithms pushed users towards extremes. “So someone centre left, they’ll be pushed to radical left; someone centre right will be pushed to radical right,” she said.
Facebook CEO Mark Zuckerberg has hit back against Haugen’s accusations, saying earlier this month: “The argument that we deliberately push content that makes people angry for profit is deeply illogical.”
Reuters, along with other news organisations, viewed documents released to the U.S. Securities and Exchange Commission and Congress by Haugen.
They showed Facebook knew it had not hired enough workers with both the language skills and the knowledge of local events needed to identify objectionable posts from users in a number of developing countries.
(Reporting by Kate Holton and Paul Sandle; Editing by Keith Weir)