News

10 Million Views Of Child Sexual Exploitation Material Allegedly Watched On ‘Old Twitter’: Report

   DailyWire.com
David Paul Morris/Bloomberg via Getty Images

An independent cybersecurity data analyst working alongside top officials at Twitter found that accounts posting content which allegedly sexually exploited children garnered more than ten million views on the platform under its previous ownership.

Andrea Stroppa, founder of cybersecurity group Ghost Data, personally funded research last summer that resulted in the eradication of more than 500 accounts soliciting Child Sexual Abuse Material (CSAM), findings that led approximately 30 major advertisers to pull or pause their advertising on the platform.

A new report released Friday by the data analyst found that more than 95% of the active accounts trafficking in CSAM, which allegedly included videos of children and teens involved in sexual activities, had “acted with impunity for years.”

“The more we work on child sexual abuse material on Twitter, the more we find the HELL left by the old Twitter management,” Stroppa wrote in a Twitter thread.

Stroppa said those accounts have since been taken down.

Twitter’s alleged CSAM problem has been documented for more than a decade.

Last summer, Twitter claimed that the company aggressively fights online child sexual abuse; however, a Twitter Red Team report from an Adult Content Monetization project found that although the platform had invested in technologies to detect and manage the issue, that investment had not kept pace with the problem’s growth.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” the Red Team concluded in April 2022, according to The Verge.

Last year, 86,666 CSAM reports on Twitter were made to the National Center for Missing and Exploited Children, which plays a key role after Internet sites — like Twitter — remove the material. When CSAM is detected and removed, the report goes to the center, which contacts law enforcement for further action.

Twitter also faces two significant lawsuits involving minor survivors allegedly exploited on the platform. In one case, a video garnered more than 160,000 views and over 2,000 retweets. Yet Twitter allegedly refused to remove the content, even though the minor survivor, then on the brink of suicide, requested its removal.

“I found that old Twitter, in many cases related to child sexual abuse material published on the platform by malicious users, instead of suspending the whole account, just removed the tweet,” Stroppa said. “Now Twitter, under Elon Musk’s leadership, directly suspends the accounts.”

For context, one previous report showed the platform removed as many as 57,000 profiles in a single month before Musk’s acquisition, compared with 44,000 removed in a single day under the new management.

Twitter itself reported last week that the company had suspended 57% more accounts that create, distribute, or engage with child sexual exploitation material, a figure the platform said was “significantly more than any other month YTD.”

Musk and former CEO Jack Dorsey publicly sparred on the platform late last week over the company’s reported failure to remove such material from the site.

Dorsey responded to Musk, claiming the allegations were false.

Musk’s new head of Trust & Safety, Ella Irwin, responded to Dorsey, “I wish this was false but my experience this year supports [what Musk said].” She then claimed that she had tried to get funding to replace employees who had left earlier this year “and was told no.”

Dorsey said he did not know “what happened in [the] past year.”

“But to say we didn’t take action for years isn’t true,” he said.

Eliza Bleu, an advocate for survivors of human trafficking, has been speaking out against CSAM on Twitter since shortly after joining the platform in 2019.

On Friday, Bleu co-hosted a Twitter Space with crypto trader Tara Bull featuring a group interview with Musk, Irwin, and Stroppa.

Irwin said she “was shocked to find that there were such gaping holes in some of these really critical areas like, for example, child safety.”

Irwin, who joined the company in June, said she could only speak to the data set for this year.

“I can’t speak to the last five years at Twitter,” Irwin said. “So certainly, multiple statements could be true here. But what I would say is that when I joined Twitter, I was actually fairly shocked at just how much attrition had happened and without any replacement.”

Legacy media reports, however, say otherwise, claiming Twitter slashed its child safety team after Musk took over.

“The reality is when you refuse to staff things that are this important, and you’re still staffing other things — like a product launch that may not be as critical — and meanwhile you’ve got all this going on, or some other program that is nowhere near as critical as some of these issues, the reality is you’ve lost sight of the priorities,” Irwin said. “The users, user protection, keeping illegal content off the platform should always be number one.”

Musk said during the Space that learning about the issue upon purchasing the platform “shocked” him.

“This is literally the top priority in the whole company that Twitter cannot be used for child exploitation,” Musk said, adding the issue remains the platform’s “number one priority.”

“Hopefully, lives are saved, and lives are also improved,” he added.

In the Twitter Space, Bleu expressed gratitude for the changes happening within Twitter after years of sounding the alarm about the exploitation of children on the platform.

“I understand the stakes,” Bleu said. “This is life or death for minor survivors around the globe — I do not take this issue lightly.”
