X’s First Transparency Report Since Elon Musk’s Takeover Is Finally Here

Today, X released its first transparency report since Elon Musk bought the company, formerly Twitter, in 2022.

Before Musk’s takeover, Twitter would release transparency reports every six months. These largely covered the same ground as the new X report, giving specific numbers for takedowns, government requests for information, and content removals, as well as data about which content was reported and, in some cases, removed for violating policies. The last transparency report available from Twitter covered the second half of 2021 and was 50 pages long. (X’s is a shorter 15 pages, but requests from governments are also listed elsewhere on the company’s website and have been consistently updated to remain in compliance with various government orders.)

Comparing the 2021 report to the current X transparency report is a bit difficult, as the way the company measures different things has changed. For instance, in 2021, 11.6 million accounts were reported. Of those 11.6 million, 4.3 million were “actioned” and 1.3 million were suspended. According to the new X report, there were more than 224 million reports, covering both accounts and individual pieces of content, but the result was 5.2 million accounts being suspended.

While some numbers remain seemingly consistent across the reports—reports of abuse and harassment are, somewhat predictably, high—in other areas, there’s a stark difference. For instance, in the 2021 report, accounts reported for hateful content accounted for nearly half of all reports, and 1 million of the 4.3 million accounts actioned. (The reports used to be interactive on the website; the current PDF no longer allows users to flip through the data for more granular breakdowns.) In the new X report, the company says it has taken action on only 2,361 accounts for posting hateful content.

But this may be due to changes in X’s policies since it was Twitter, which Theodora Skeadas, a former member of Twitter’s public policy team who helped put together its Moderation Research Consortium, says could alter how the numbers look in a transparency report. For instance, last year the company changed its policies on hate speech, which previously covered misgendering and deadnaming, and in November 2022 it rolled back its rules around Covid-19 misinformation.

“As certain policies have been modified, some content is no longer violative. So if you’re looking at changes in the quality of experience, that might be hard to capture in a transparency report,” she says.

X has also lost users since Musk’s takeover, further complicating what the new reality of the platform might look like. “If you account for changing usage, is it a lower number?” she asks.

After taking over the company in October 2022, Musk fired the majority of its trust and safety staff as well as its policy staff, the people who make the platform’s rules and ensure they’re enforced. Under Musk, the company also began charging for its API, making it harder for researchers and nonprofits to access X data and see what was really happening on the platform. This may also account for changes between the two reports.

“They might have enforced a certain amount of content. But if a capacity has changed, the numbers might be understating the severity of impact because of reduced capacity for manual review,” says Skeadas. And while the report indicates that many of the takedowns are algorithmic, she notes that with fewer staff, “automated systems might not be audited as regularly as they should.” This, says Skeadas, is particularly important for “human rights defenders, journalists, women, protected demographics, race, ethnic, religious, minority groups. Those are the cases that generally receive special attention by public policy teams and teams that were ensuring safe spaces on the platform.”

Part of Musk’s philosophy for the company has been “free speech absolutism,” which has included reinstating accounts that had been banned for spreading misinformation and hate speech. Over the summer, Musk refused to remove accounts that the Brazilian government said had spread misinformation about the security of the country’s elections, causing the platform to be suspended in Brazil for several weeks. Interestingly, Turkey was the country with the most content removal requests, with nearly 10,000 for the first part of 2024, 60 percent of which X complied with. Brazil did not make the top five list of countries with removal requests.

Earlier this month, with two months to go until the US elections, X posted a handful of trust and safety job openings, but it’s likely that the team is still much smaller than it was at Twitter.

“Transparency is at the core of what we do at X,” says company spokesperson Michael Abboud. “As an entirely new company, we took time to rethink how best to transparently share data related to the enforcement of the policies that keep our community safe. Now, on the heels of the immense progress we have made, we are excited to share the work we do each and every day.”

When asked if the transparency reports are part of complying with the European Union’s Digital Services Act, Abboud tells WIRED that it is not. “This is,” he says, “part of our desire to be extremely transparent.”
