Jane (not her real name) was 15 years old and lived in a small town in Indiana. Jane loved her parents and younger siblings. She was a solid student, not at the top of her class but a good one. She was well-liked, well-rounded, and popular. One evening, she was walking home at dinnertime from a neighborhood friend’s house. Three older teenage boys stopped their car to talk with her. At knifepoint, they abducted her. When she resisted, they stabbed her. They took her to a nearby empty house. They spent the entire night taking turns raping her and beating her. Early the next morning, they took her to the edge of town. They threw her out of their car like a dead cat. A good Samaritan found her and took her to the hospital, where she was treated. All the appropriate rape tests were done. The police were called in. The older boys were caught and prosecuted. Two were minors. One was not. It took months before she was healthy enough to return to school. Physically, emotionally, and spiritually, she remained fragile.
Then her living nightmare got much worse. She noticed at school that the boys were looking at their mobile phones, acting strangely, and then looking at her. What she didn’t know then was that one of the older boys who raped her had made a video of the rape. He had uploaded the video to a website called Pornhub, where it was being viewed by thousands of visitors.
We at the National Center on Sexual Exploitation have filed a class action lawsuit against Pornhub in federal court on behalf of the thousands of minors, like Jane and Sarah, whose images have been uploaded to Pornhub and monetized.
The numbers are staggering. A New York Times investigation called the proliferation of child sexual abuse material an “almost unfathomable” increase in criminality: from 600,000 images reported to the National Center for Missing and Exploited Children in 2008 to 60 million in 2018. Further, reports increased by 35 percent between 2020 and 2021.
How can these horrible things occur, and why aren’t law enforcement agencies prosecuting Pornhub? Here is the truly distressing part. Three years ago, 7 million videos were uploaded to Pornhub alone. That’s roughly 19,000 videos per day. There are hundreds, perhaps thousands, of online platforms like Pornhub on the internet. Anyone, regardless of intent or motive, can upload hard-core pornographic videos to these platforms, including rape videos and child sexual abuse material, more commonly known as child pornography. These platforms operate like criminal enterprises. They do not require that the participants shown in the videos (like Jane and Sarah) be 18 or older. They don’t seem to care. They do not require the consent of the person shown in the video. Consent isn’t even an afterthought: the platforms often refuse to delete videos from their sites when asked to do so by people who never consented to the upload in the first place or who were minors when the images were made. Though they make enormous sums of money from viewers and advertisers in the United States, most of these sex trafficking websites are registered in foreign countries and are hard to sue.
But it’s not just the hard-core pornography websites that are profiting from child sexual abuse and the trafficking of its victims. Platforms such as Twitter, Reddit, TikTok, Snapchat, and others also have child sexual abuse material on their sites and monetize it. Indeed, all of these named online platforms are defendants in lawsuits alleging their culpability in violating sex trafficking and child sexual abuse laws.
We represent two boys who were groomed online by an apparent pedophile when they were both 13. Their child sexual abuse images were uploaded to Twitter, where they were viewed, tweeted, and retweeted thousands of times and downloaded hundreds of times. The boys and their families contacted Twitter and demanded that the images be removed. Twitter responded by stating, in writing, that it had reviewed the images, that they did not violate Twitter’s policies, and that they would not be removed. Only after an official from the Department of Homeland Security contacted Twitter directly did Twitter delete the sexually explicit images of the boys from its site. But the damage had been done.
Why do these platforms think they can get away with this? Well, there is a federal law that went on the books in 1996, ironically named the Communications Decency Act (CDA). Rather than promoting “decency,” it did the opposite. Section 230 of that law purports to give online platforms absolute immunity from liability for third-party content uploaded to their sites. The platforms even claim it immunizes them when they monetize uploaded rape videos and child pornography. Since its enactment, these platforms have used CDA 230 as a shield protecting them from the consequences of their bad behavior.
In response, Congress passed two laws in 2018: FOSTA (the Fight Online Sex Trafficking Act) and SESTA (the Stop Enabling Sex Traffickers Act). The National Center on Sexual Exploitation was instrumental in passing these laws. FOSTA-SESTA essentially removes CDA 230 immunity when online platforms such as Pornhub and Twitter engage in a venture with sex traffickers or benefit from sex trafficking.
Our class action case against Pornhub is moving forward. As in our Twitter case, the federal district court judge rejected Pornhub’s claim of immunity under CDA 230. In a first-in-the-nation ruling, the court held that there can be no immunity for disseminating child sexual abuse material, on the internet or anywhere else, as Pornhub has done.
One wonders why companies like Twitter and Pornhub allow their platforms to be used to disseminate child sexual abuse material. There is big money in it through advertising and paywalls. And, at least for Twitter, as long as the public generally remains ignorant of its business practices, there is no reputational risk.
Immediately after the publication of “The Children of Pornhub,” a December 2020 New York Times exposé on the immense volume of child sexual abuse material on its site, Pornhub deleted over 11 million videos. The fact that Pornhub knew exactly which 11 million videos to delete is telling. Prior to the article, countless complaints to the site about child sexual abuse material had been met with no action.
Twitter likes to promote its “no tolerance policy” for child sexual abuse material. The reality is that thousands of child sexual abuse images are bought, sold, and traded on Twitter every single day. Its “no tolerance policy” is a complete sham. Indeed, in its federal court pleadings, Twitter asserts that it can do business with sex traffickers without losing its federal immunity. We will soon see what the Ninth Circuit Court of Appeals thinks of that argument.
Ultimately, it will take a conclusive ruling from the United States Supreme Court that CDA 230 provides no immunity for platforms like Twitter and Pornhub, both of which allow sex trafficking and child sexual abuse material on their sites. In the meantime, children like Jane and Sarah will continue to be victimized, and Pornhub and Twitter will continue to rake in profits from their images.
Until then, we will continue to pursue justice for survivors and work to hold online pornography companies to account.