Twitter sued by sex trafficking victim after refusing to take down child porn because ‘it did not violate company policy’

Twitter refuses to remove child sex content, acts only after authorities intervene | Image Source: Outlook

Social media giant Twitter, which has been arbitrarily censoring social media users for expressing their views, is now caught up in another controversy after it allegedly refused to take down widely shared pornographic images and videos of a child sex trafficking victim. Twitter claimed that the images and videos of the child sex trafficking victim did not violate its company policies.

According to a report in the New York Post, a now 17-year-old teen, ‘John Doe’, currently living in Florida, filed a lawsuit accusing Twitter of allowing the video of his abuse, filmed when he was 13 years old, to accumulate hundreds of thousands of views, and of profiting from it.

The federal lawsuit, filed on Wednesday by the victim and his mother in the Northern District of California, alleged that Twitter made money by circulating porn clips which showed a 13-year-old engaged in sex acts, in effect, child pornography.

The teen alleged that a few years back, sex traffickers posing as a 16-year-old female classmate had lured him into a chat on Snapchat. He said that he had exchanged nude photos with the sex traffickers before the conversation turned to blackmail. The traffickers allegedly threatened to send the images to his “parents, coach, pastor” and others if he did not share more sexually graphic photos and videos.

Acting under duress, the teen initially complied and sent videos of himself performing sex acts. The traffickers had also forced the boy to include another child in his videos, which he did, the suit said. 

Eventually, the boy blocked the traffickers, and they stopped harassing him, but at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege. 

Child sexual abuse videos circulated on Twitter, officials did not remove content

Months later, as alleged in the lawsuit, the videos began to circulate on social media platforms. The videos and photos were ‘reported’ to Twitter at least three times, first on December 25, 2019. However, the tech giant failed to do anything about it until a federal law enforcement officer got involved, the suit states.

Reportedly, Doe learned about the videos in January 2020, after classmates subjected him to teasing, harassment, and vicious bullying that left him “suicidal”.

The boy’s parents contacted the school, made police reports, and filed a complaint with Twitter, saying there were two tweets depicting child pornography and demanding that they be removed because they were illegal, harmful, and violated the site’s policies.

However, the micro-blogging site did not respond for a week, the family said. Around the same time, the victim’s mother also filed two complaints to Twitter, and she also received no response, the suit states. 

On January 28, Twitter finally replied to the victim and said it would not be taking down the material, which had already been watched by more than 167,000 users and retweeted over 2,223 times, the suit states. The micro-blogging site said that it did not find a violation of the company’s ‘policies’ after its investigation, the lawsuit alleges.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” Twitter responded to the victim.

Twitter further wrote, “If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

Twitter removes content after intervention from Homeland security officials

The careless response from Twitter officials shocked the minor victim, who then sent another letter to the social media giant asking it to take down the appalling videos.

“What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL, and they need to be taken down,” the teen wrote back to Twitter. 

The victim even cited his case with a local law enforcement agency, but the tech giant still ignored him and did not initiate any action against the illegal child sexual abuse material, even as it continued to be shared on its platform.

The victim’s family later filed a complaint with an agent from the Department of Homeland Security, who then forced the micro-blogging site to remove the content, the suit states. 

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” stated the suit, filed by the National Center on Sexual Exploitation on behalf of the victim.

The suit said that Twitter’s response stands in direct contrast to what its automated reply message and User Agreement claim it will do to protect children.

The disturbing lawsuit exposes how Twitter knowingly hosts sexual predators on its platform and allows illegal child porn networks to flourish, letting them use the platform to exchange child porn material while it profits.

Meanwhile, Twitter, which has been at the receiving end of massive criticism for censoring social media users for expressing their views, replied to a query sent by the New York Post saying, “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy”.

“Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline,” it claimed.

OpIndia Staff: Staff reporter at OpIndia