X to Open a New Content Moderation Center in Austin
X (formerly Twitter) plans to open a new content moderation center in Austin, Texas. The “Trust and Safety center of excellence” will focus primarily on combating child sexual exploitation (CSE) material on the platform.
Beyond CSE material, the new safety center will also address issues such as hate speech and posts promoting violence.
The company’s head of business operations, Joe Benarroch, revealed X’s plans to hire 100 full-time employees at the new location but did not say when the center will begin operating.
Speaking to Bloomberg, Benarroch said, “X does not have a line of business focused on children, but it’s important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content.”
The announcement comes just days ahead of X CEO Linda Yaccarino’s appearance before the Senate Judiciary Committee, along with CEOs from other major tech companies including Meta, TikTok, and Snapchat.
X has drawn widespread criticism over its safety efforts under Elon Musk’s leadership. The business mogul took over the company in 2022 and went on to lay off roughly 80% of its employees, including the majority of Twitter’s content moderators.
Since acquiring Twitter, Musk, a self-described “free speech absolutist,” has taken steps to turn the platform into a “free speech bastion.” He reinstated previously suspended accounts of controversial figures, announced that users would no longer be able to block other users, and rolled back certain policies related to misinformation.
X also came under fire during the Israel-Hamas conflict over an antisemitic tweet from Musk and accusations that the platform had become fertile ground for misinformation and hate speech related to the war. Around that time, the EU launched a formal investigation into the company’s content moderation practices.
Previously, the Center for Countering Digital Hate (CCDH), an online hate speech watchdog, reported that the platform failed to take action against 99 of 100 hate speech posts the organization had flagged. Musk responded with legal action against the nonprofit, accusing it of fabricating its findings.
While establishing a new 100-person content moderation team seems like a move in the right direction for X, it’s worth noting that before Musk’s takeover, the platform had around 1,500 employees tracking abuse and enforcing misinformation policies.