
After Allegedly Refusing to Remove Child Porn, Twitter Files to Dismiss Lawsuit, Citing Section 230 Immunity

Citing Section 230 of the Communications Decency Act of 1996, Twitter filed a motion to dismiss a pending lawsuit claiming the company refused to remove child pornography from its platform.

Filed in January by a 17-year-old victim of sex trafficking identified only as John Doe, the suit alleges that the social media company was repeatedly asked to remove pictures and videos of Doe involved in sexually explicit activities.

He was 13 years old at the time the material was recorded.

Twitter’s motion to dismiss the suit, filed on Wednesday, argues that Section 230 of the Communications Decency Act should offer the organization immunity against lawsuits pertaining to explicit content the platform failed to remove.

“Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform’s alleged ‘knowledge’ of offensive content if it chose to try to screen out that material but was unable to root out all of it,” the motion reads.

“Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act (‘CDA § 230’), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content,” Twitter continues.

“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone.”

Doe’s lawsuit, however, claims that Twitter — or at least some Twitter employees — knew about the posts and refused to take them down.

Both Doe and his mother allegedly reported the posts multiple times to no avail, with Twitter replying that it “didn’t find a violation” of the company’s policies upon reviewing the videos.

It wasn’t until an agent from the Department of Homeland Security intervened that the posts were taken down on or around Jan. 30, 2020, according to the suit.

“Section 230 of the Communications Act of 1934 (47 U.S.C.230; commonly known as the ‘Communications Decency Act of 1996’) was never intended to provide legal protection to websites that unlawfully promote and facilitate prostitution and websites that facilitate traffickers in advertising the sale of unlawful sex acts with sex trafficking victims,” the suit reads.

“Websites that promote and facilitate prostitution have been reckless in allowing the sale of sex trafficking victims and have done nothing to prevent the trafficking of children and victims of force, fraud, and coercion.”

“Clarification of such section is warranted to ensure that such section does not provide such protection to such websites.”

In total, the lawsuit accuses Twitter of benefiting from sex trafficking, failing to report known child sexual abuse material (CSAM), knowingly distributing child pornography, and gross negligence, among other claims.

The Western Journal reached out to the social media giant regarding John Doe's lawsuit in January and received the following response.

“Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy,” the company said.

“Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline.”

This article appeared originally on The Western Journal.
