
On Musk's Twitter, users looking to sell and trade child sex abuse material are still easily found

Elon Musk has said he wants to make cracking down on child exploitation his number one priority at Twitter. A review of tweets suggests there's a lot of work to do.


Twitter accounts that offer to trade or sell child sexual abuse material under thinly veiled terms and hashtags have remained online for months, even after CEO Elon Musk said he would combat child exploitation on the platform.

“Priority #1,” Musk called it in a Nov. 20 tweet. He has also criticized Twitter’s former leadership, claiming it did little to address child sexual exploitation, and said he intended to change that.

But since that declaration, dozens of accounts, at minimum, have continued to post hundreds of tweets in aggregate using terms, abbreviations and hashtags that indicate the sale of what Twitter calls child sexual exploitation material, according to a count of just a single day’s tweets. The signs and signals are well known among experts and the law enforcement agencies that work to stop the spread of such material.


The tweets reviewed by NBC News offer to sell or trade content that is commonly known as child pornography or child sexual abuse material (CSAM). The tweets do not show CSAM, and NBC News did not view any CSAM in the course of reporting this article.

Some tweets and accounts have been up for months and predate Musk’s takeover. They remained live on the platform as of Friday morning.

Many more tweets reviewed by NBC News over a period of weeks were published during Musk’s tenure. Some users tweeting CSAM offers appeared to delete the tweets shortly after posting them, seemingly to avoid detection, and later posted similar offers from the same accounts. Some accounts offering CSAM said that their older accounts had been shut down by Twitter, but that they were able to create new ones.

According to Twitter’s rules published in October 2020, “Twitter has zero tolerance towards any material that features or promotes child sexual exploitation, one of the most serious violations of the Twitter Rules. This may include media, text, illustrated, or computer-generated images.”

In an email to NBC News after this article was published, Ella Irwin, Twitter’s vice president of product overseeing trust and safety, said, “We definitely know we still have work to do in the space, and certainly believe we have been improving rapidly and detecting far more than Twitter has detected in a long time but we are deploying a number of things to continue to improve.” Irwin asked that NBC News provide the findings of its investigation to the company so that it could “follow up and get the content down.”

It’s unclear just how many people remain at Twitter to address CSAM after Musk enacted several rounds of layoffs and issued an ultimatum that led to a wave of resignations. Musk has engaged some outside help, and the company said in December that its suspensions of accounts for child sexual exploitation had risen sharply. But a representative for the National Center for Missing & Exploited Children, the U.S. child exploitation watchdog, said the number of CSAM reports detected and flagged by the company has remained unchanged since Musk’s takeover.

Twitter also disbanded its Trust and Safety Council, which included nonprofits focused on addressing CSAM.

Twitter’s annual report to the Securities and Exchange Commission said the company employed more than 7,500 people at the end of 2021. According to internal records obtained by NBC News, Twitter’s overall headcount had dwindled to around 1,340 active employees as of early January, with around 20 people working in the company’s Trust and Safety organization. That is less than half of the previous Trust and Safety workforce.

One former employee who worked on child safety issues, a specialization that fell under a larger Trust and Safety group, said that many product managers and engineers who were on the team that enforced anti-CSAM rules and related violations before Musk’s purchase had left the company. The employee asked to remain anonymous because they had signed a nondisclosure agreement. It’s not known precisely how many people Musk has assigned to those tasks now.

Since Musk took over the platform, Twitter has cut its number of engineers in half, according to internal records and people familiar with the situation.

Irwin said in her email that “many employees who were on the child safety team last year are no longer part of the company but that primarily happened between January and August of last year due to rapid attrition Twitter was experiencing across the company.” Additionally, she said the company has “roughly 25% more staffing on this issue/problem space now than the company had at its peak last January.”

CSAM has been a perpetual problem for social media platforms. And while some technology has been developed to automate the detection and removal of CSAM and related content, the problem remains one that needs human intervention as it develops and changes, according to Victoria Baines, an expert on child exploitation crimes who has worked with the U.K.’s National Crime Agency, Europol, the European Cybercrime Centre and Facebook.

“If you lay off most of the trust and safety staff, the humans that understand this stuff, and you trust entirely to algorithms and automated detection and reporting means, you’re only going to be scratching the surface of the CSAM phenomenon on Twitter,” Baines said. “We really, really need those humans to pick up the signals of what doesn’t look and sound quite right.”

The accounts NBC News saw promoting the sale of CSAM follow a known pattern. NBC News found tweets promoting the trade of CSAM, some posted as far back as October, that were still live and seemingly undetected by Twitter, as well as hashtags that have become rallying points where users share information on how to connect on other internet platforms to trade, buy and sell the exploitative material.

In the tweets seen by NBC News, users claiming to sell CSAM were able to avoid moderation with thinly veiled terms, hashtags and codes that can easily be deciphered.

Some of the tweets were brazen, their intent clearly identifiable. (NBC News is not publishing details about those tweets and hashtags so as not to further amplify their reach.) The common abbreviation “CP,” widely used online as shorthand for “child porn,” is unsearchable on Twitter. But one user who had posted 20 tweets promoting their material used another, searchable hashtag and wrote “Selling all CP collection” in a tweet published Dec. 28. The tweet remained up for a week, until the account appeared to be suspended following NBC News’ outreach to Twitter. A search Friday found similar tweets still on the platform.

Other users employed keywords associated with children, replacing certain letters with punctuation marks such as asterisks, and instructed readers to direct message their accounts. Some accounts even listed prices in their bios and tweets.

None of the accounts reviewed by NBC News posted explicit or nude photos or videos of abuse to Twitter, but some posted clothed or semi-clothed images of young people alongside messages offering to sell “leaked” or “baited” images.