In the lawsuit filed in early 2021, the Plaintiffs, John Doe #1 and John Doe #2, allege that they were solicited and recruited for sex trafficking at the age of 13, and that child sexual abuse material depicting them was later disseminated on Twitter while they were still minors. Both plaintiffs were harmed by Twitter’s distribution of the material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified by John Doe #1 and his parents.
When Twitter was first alerted to the harmful and illegal material and the ages of the children, Twitter refused to remove it and instead continued to promote and profit from the sexual abuse of the children. Twitter even reported back to one survivor that the video in question did not in fact violate any of its policies and would not be taken down. This refusal resulted in the child sexual abuse material accumulating over 167,000 views before direct involvement from a federal law enforcement officer finally induced Twitter to remove it.
This lawsuit seeks a measure of justice for survivors and to hold Twitter legally accountable to the reality that it can and must make a remarkable difference and take significant steps forward in the fight against sexual abuse and exploitation online.
THE BIG PICTURE
With over 330 million users, Twitter is one of the largest social media companies in the world. It is also one of the most prolific distributors of sexual exploitation material, including material soliciting, advancing, and advertising the sale of people for sex acts through prostitution and sex trafficking, and material depicting the sexual abuse of children.
Twitter is not a passive, inactive intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in its knowing promotion and distribution. Twitter's own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material.
Spread the Word
on how Twitter is complicit in the distribution of child sexual abuse material (CSAM)