X Faces Negligence Claim Over Failure to Report Child Abuse Images

A federal appeals court has ruled that X, formerly known as Twitter, must contend with a negligence claim regarding its failure to promptly report explicit images of two underage boys to the National Center for Missing and Exploited Children (NCMEC). The lawsuit, initiated by plaintiffs John Doe 1 and John Doe 2, accuses the platform of negligence after it took nine days to remove the content and alert authorities.

The events that led to the lawsuit began when John Doe 1, then just 13 years old, and John Doe 2 were coerced into sending nude images of themselves over Snapchat. Authorities later determined that the man who solicited the images was a trafficker of child sexual abuse material. He then used those images against the boys, blackmailing them into producing more explicit content. Before any enforcement action was taken against the uploaded video, it had received more than 167,000 views.

X’s lawyers did not immediately respond to a request for comment. In December 2023, a trial judge dismissed the case, but last week a federal appeals court revived one prong of the lawsuit in a surprise ruling, holding that Section 230 of the federal Communications Decency Act does not grant X immunity from this negligence claim.

Lawyers for the National Center on Sexual Exploitation, which is bringing the case on behalf of the plaintiffs, said they were hopeful about taking the case further.

“We look forward to discovery and ultimately trial against X to get justice and accountability,” said Dani Pinter.

Strikingly, the events at issue all occurred before Elon Musk acquired Twitter in 2022, and Musk is not a named defendant in the action.

As the case moves forward, it underscores the responsibility social media platforms bear to proactively monitor for illegal content and report it to authorities. The outcome could set a precedent for how negligence claims against such companies are handled in future cases involving child exploitation.
