Twitter watchdogs are calling on Elon Musk to ban pornography on the platform until it can implement an age and consent verification system for those depicted in the images and videos amid alarming concerns over the internet’s child sexual exploitation epidemic.
Officials from the National Center on Sexual Exploitation (NCOSE) told The Daily Wire that since Musk laid off most of Twitter’s Trust and Safety team and disbanded its advisory council, the skeleton staff reportedly left behind could make the platform even less equipped to confront its child sexual abuse material (CSAM) problem.
“Twitter allows pornography, one of the few social media platforms to do so, which enables child sexual abuse material, sex trafficking, [and] image-based sexual abuse to thrive,” Lina Nealon, director of corporate and strategic initiatives for the center, told The Daily Wire in an email.
Rampant internet pornography has been a long-standing problem. In many cases, victims of all ages are depicted in such content without their consent, causing irreparable harm to their lives.
The New York Times published an investigation in 2019 showing that over the course of a decade, the proliferation of child sexual abuse material on the internet surged from 600,000 images reported to the National Center for Missing & Exploited Children (NCMEC) in 2008 to 60 million in 2018 — some depicting children just 3 or 4 years old who were sexually abused and, in some cases, tortured.
Nealon added, “because Twitter requires no meaningful age or consent verification of those posting or selling pornography on its platform, it’s impossible to know if the sexually graphic images on the platform are consensual or of adults, not children.” In other words, the platform cannot reliably determine whether those depicted are consenting adults rather than minors.
The center added Twitter to its 2022 Dirty Dozen List of mainstream contributors to sexual exploitation, which has prompted it to partner with The Haba Law Firm to represent two survivors of child sexual abuse who were trafficked and exploited on Twitter in a lawsuit against the company.
Last week, CNN reported the outlet obtained an email sent to Twitter’s Trust and Safety Council, which included about 100 organizations that advised the platform on issues including online safety, human and digital rights, suicide prevention, mental health, child sexual exploitation, and dehumanization.
The email reportedly said the company decided to dissolve the advisory council as it reevaluates “how best to bring external insights into our product and policy development work,” adding that the council was “not the best structure to do this.”
The Associated Press further reported the email, which was signed by “Twitter,” said the company would be “moving faster and more aggressively than ever before” to “make Twitter a safe, informative place.”
Twitter staff also said the former council members could still contribute ideas in the future to achieve their goal.
A spokesperson from the National Center for Missing & Exploited Children (NCMEC) — the organization tasked by the U.S. government with tracking reports of child sexual abuse material online, and a former member of Twitter’s advisory council — told NBC News that, judging by the center’s own reporting system, little has changed under Musk’s leadership.
“Despite the rhetoric and some of what we’ve seen people posting online, their CyberTipline numbers are almost identical to what they were prior to Musk coming on board,” Gavin Portnoy, an NCMEC representative, told the outlet.
Last year, 86,666 CSAM reports on Twitter alone were made to NCMEC, which plays a key role after internet sites such as Twitter remove the material: once child sexual exploitation material is detected and removed, a report goes to the center, which refers it to law enforcement for further action.
In 2021, NCMEC’s CyberTipline received 29.3 million reports of suspected child sexual exploitation material, an increase of roughly 35% from 2020.
Musk, who has made sweeping changes since his $44 billion purchase of the company, recently said he would make addressing child sexual exploitation the platform’s number one priority.
Ella Irwin, the platform’s new head of Trust and Safety, said on Twitter earlier this month that when she joined the company last June, she saw at one point “zero engineers and very few employees” working on child sexual abuse material.
“I was shocked to find that there were such gaping holes in some of these really critical areas like, for example, child safety,” Irwin said during a December 10 Twitter Space.
Irwin claimed the company had slashed its child safety team “a long time ago” and said her team would add staff under the new leadership to address the problem.
“We’ve already started, actually, but we’re going to be adding people to the team and increasing the number of people,” Irwin said. “We will have more people working on child safety than we’ve ever had before — at least this year that I can speak to that I’ve seen.”
Andrea Stroppa, founder of the independent cybersecurity group Ghost Data, has been working alongside top officials at Twitter and found that, under previous ownership, some accounts posting content that allegedly sexually exploited children garnered more than ten million views on the platform. Other reports from Stroppa led approximately 30 major advertisers to pull or pause their advertising on Twitter after more than 500 accounts were found soliciting child sexual exploitation material near their brands.
Stroppa recently said former Twitter officials would remove only the offending tweet from accounts that published child sexual abuse material rather than banning the user outright. The company has since changed that policy.
Nealon of NCOSE said that while the center is encouraged by Musk’s concern about child sexual exploitation on Twitter, his plans for ensuring child safety and mitigating other harms on the platform remain unclear.
“Musk needs to build up and dedicate substantive resources to the Trust & Safety Team, rebuild the Trust and Safety Advisory Council, cease plans for Paywalled Video, remove the Pornhub account from Twitter, and prohibit pornography if and until the platform can institute substantive age and consent verification for all those depicted in content, otherwise Musk will risk the high likelihood that CSAM, sex trafficking, and Image-Based Sexual Abuse will continue to thrive – and even expand – on Twitter,” said Nealon.