
Big Tech companies reveal trust and safety cuts in disclosures to Senate Judiciary Committee



In new disclosures to the Senate Judiciary Committee, Big Tech companies detailed deep cuts made to trust and safety departments across the industry in recent years.

The CEOs of X, Snap and Discord all revealed cuts to their trust and safety teams — units at tech companies tasked with monitoring and moderating safety risks on the platforms — according to written responses from tech CEOs published by the Senate Judiciary Committee on Monday. 

Meta and TikTok did not provide historical information to the committee about their trust and safety staffing despite requests to do so, but previous reporting indicates that Meta has also made cuts to those teams’ staff during the same time period. TikTok also reportedly conducted layoffs this year.

The numbers, some of which have not previously been disclosed, come on the heels of the January hearing where senators grilled social media executives about child safety issues. Despite the CEOs’ reassurances that the companies were laser-focused on platform safety issues, the numbers show that the majority of the platforms have chosen to scale back their trust and safety operations — even under the dangling sword of potential regulation and ahead of the 2024 presidential election. 

In response to a question from Sen. Cory Booker, D-N.J., asking for the number of trust and safety personnel employed by the company over the past five years, X provided data for three years. The numbers showed a consistent decline in trust and safety personnel at the company over the last two years.

“X had 3317 Trust and Safety employees and contractors in May 2022, and 2849 in May 2023,” X replied. “Today, we have approximately 2300 people working on Trust and Safety matters and are building a Trust and Safety Center of Excellence in Austin, Texas, in an effort to bring more agent capacity in-house and rely less on outside contractors.” 

According to Australia’s eSafety Commissioner, the country’s independent regulator for online safety, X had previously disclosed that the company had 4,062 trust and safety contractors the day before Elon Musk acquired it. Based on the answer to Booker, the company has cut 43% of its trust and safety roles under Musk. The Australian commissioner noted that X said the number of full-time content moderators fell from 107 to 51.

X did not disclose to senators how much money it was putting toward its trust and safety efforts, but said it has set a goal of hiring 100 employees for that team.

X has been criticized for loosening certain types of moderation under Musk's ownership. Musk has said he believes most content should be allowed on the platform as long as it doesn't violate the law.

Snap was more detailed and forthcoming in its disclosures.

The company provided employee data between 2019 and 2023. In 2019, the company had 763 workers, including full-time employees and contract workers, doing safety and moderation tasks. By 2021, that number had increased to 3,051. But in 2022 it shrank to 2,592, and in 2023 it shrank again to 2,226. The reduction represented a 27% cut in the company's trust and safety personnel from its peak.

Snap disclosed that it had increased its trust and safety budget to “approximately $164 million” by 2022, but had slashed spending on trust and safety issues to $135 million in 2023. 

The company said in 2023 its global revenue from minors was approximately $437 million. 

Snap in particular has been criticized over continued use of the platform by individuals accused of selling fentanyl to minors, some of whom died from taking the drug. Parents of deceased children have been particularly vocal in advocating for tech regulation.

In a statement to NBC News, Snap spokesperson Rachel Racusen said: “We continue to invest in growing our safety and content moderation teams. Since 2020, our safety team more than doubled, and our moderation team more than tripled.” 

Discord said in the disclosures that it had increased its number of trust and safety employees in recent years until it made cuts in 2024. Discord said it had 22 full-time employees devoted to trust and safety in 2019. By 2023, Discord said that team had grown to 90 employees. In 2024, after the company went through a round of layoffs, the team shrank to a pre-2021 size of 74 full-time employees.

Discord said that in addition to its full-time workforce, it has consistently grown its contract workforce, which it says now includes “400 additional contract agents including external, virtual Special Operations Center, other support functions.”

Meta did not respond to written questions about trust and safety staffing, saying that it would provide more answers on a rolling basis. In CEO Mark Zuckerberg’s testimony at the January hearing, he said Meta had approximately 40,000 people “working on safety and security” — a figure the company also cited in February 2023 to NBC News in response to reports about layoffs at one of Meta’s main content moderation contractors in Africa. 

In a statement, a Senate Judiciary Committee representative said: “The Committee granted Mark Zuckerberg multiple extensions to respond to Questions for the Record. Yet, six weeks after receiving the QFRs — and four weeks after the initial deadline to respond — Mr. Zuckerberg answered only a small fraction of members’ questions.  QFRs are a critical information-gathering tool for Congressional hearings, which Mr. Zuckerberg knows well after multiple appearances before Congress. His lack of urgency in responding to members’ questions proves yet again that neither he nor his company is committed to protecting children online.  It is more important than ever for the Senate to pass our kids’ online safety bills and finally hold Big Tech accountable.” 

The representative did not immediately respond to a request for additional information about the committee’s requests to Zuckerberg.

Meta spokesperson Andy Stone told NBC News, “We are working diligently to answer the over 500 questions for the record we received after the Senate Judiciary Committee hearing.”

TikTok did not provide historical data, but told senators that it employed 40,000 people working on trust and safety issues. TikTok has not published how many people in total currently work at the company.


