
Twitter Tries To Dismiss Another Child Porn Case Citing Section 230: Even If Allegations ‘True,’ We’re Not ‘Liable’

   DailyWire.com
NEW DELHI, INDIA - NOVEMBER 12: Twitter CEO and Co-Founder Jack Dorsey addresses students at the Indian Institute of Technology (IIT), on November 12, 2018 in New Delhi, India.
Amal KS/Hindustan Times via Getty Images

Twitter is trying to dismiss a second child pornography lawsuit filed against it by an underage victim, citing protections under Section 230 of the Communications Decency Act, a controversial clause that has shielded Big Tech from liability over viewpoint discrimination and, Twitter now claims, over child pornography being hosted on its platform.

In the court filing, Twitter argued that even if “all” of the minors’ “allegations” are accepted “as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts.”

Twitter was hit with a lawsuit in January alleging that a young boy, known as John Doe #1, who was solicited and recruited for sex trafficking, had to endure his own sexual abuse material being promoted on Twitter, even after attempts were made to remove the content.

A second alleged victim, known as John Doe #2, later joined the federal lawsuit. “Both plaintiffs were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified by John Doe #1 and his parents,” read a press release from the National Center on Sexual Exploitation (NCOSE).

“To encourage platforms to moderate and remove offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act (‘CDA § 230’), granting online platforms like Twitter broad immunity from legal claims arising out of the failure to remove content,” a motion to dismiss argued. “Given that Twitter’s alleged liability here rests on its failure to remove content quickly enough from its platform, dismissal of the FAC with prejudice is warranted on this ground alone.”

Section 230 of the U.S. Code states:

No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.

Twitter noted that the company eventually “did remove the videos and suspend the accounts that had posted them,” adding, “That the offending content was not taken down immediately does not make Twitter liable under any applicable law.”

“Mistakes or delays, however, do not make Twitter a knowing participant in a sex trafficking venture, as Plaintiffs here have alleged,” the court filing said. “Plaintiffs do not (and cannot) allege, as they must, that Twitter ever had any actual connection to the Perpetrators, took any part in their crimes, or benefitted from them. Thus, even accepting all of Plaintiffs’ allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts.”

On the heels of the suits, advocates are pushing a petition on Change.org to create an easier two-step reporting process on Twitter for victims of child sexual abuse material.

“The reporting process for victims of child sexual abuse material and Twitter users should be a two step process. Clear, direct, and easy for children to report,” reads the petition, which has racked up more than 13,000 signatures.

“Twitter is currently being sued by two minor survivors of child sexual exploitation. John Doe #1 and John Doe #2 were both 13 years of age in the video shared on Twitter,” the petition adds. “Their abuse was watched: 167,000 views and 2,223 retweets. John Doe #1 was a minor when he started reporting the video. He provided government ID to the platform showing that he was a minor.”

Related: Twitter Moves To Dismiss Child Porn Lawsuit Citing Section 230 Immunity

Related: Second Minor Sues Twitter Over Sex Trafficking On Social Media Site

Related: ‘How Do You Sleep At Night’: Sex Trafficking Survivor Challenges Jack Dorsey On Heels Of Bombshell Lawsuit
