PornHub gets busted for profiting off child trafficking, coincidentally announces new policies

On December 4, 2020, the New York Times published an exposé about the dangerous consequences of PornHub's user-generated content:

Pornhub is like YouTube in that it allows members of the public to post their own videos. A great majority of the 6.8 million new videos posted on the site each year probably involve consenting adults, but many depict child abuse and nonconsensual violence. Because it's impossible to be sure whether a youth in a video is 14 or 18, neither Pornhub nor anyone else has a clear idea of how much content is illegal.

Unlike YouTube, Pornhub allows these videos to be downloaded directly from its website. So even if a rape video is removed at the request of the authorities, it may already be too late: The video lives on as it is shared with others or uploaded again and again.

"Pornhub became my trafficker," a woman named Cali told me. She says she was adopted in the United States from China and then trafficked by her adoptive family and forced to appear in pornographic videos beginning when she was 9. Some videos of her being abused ended up on Pornhub and regularly reappear there, she said.

"I'm still getting sold, even though I'm five years out of that life," Cali said. Now 23, she is studying in a university and hoping to become a lawyer — but those old videos hang over her.

The problem here isn't the porn itself; it's the videos shared without consent, often depicting non-consensual acts, that other people profit from. And it's not a small problem, either. Some studies have ranked MindGeek, PornHub's parent company, as the third most influential technology company in the world, behind Facebook and Google but ahead of Apple, Amazon, and Microsoft. In the article, the Times cites statistics from the National Center for Missing and Exploited Children that illustrate the scale of the problem. Facebook removed 12.4 million posts dealing with the sexual exploitation of children earlier this year, and Twitter closed 264,000 accounts last year for similar reasons. But those numbers pale in comparison to the growth of such content: the Center received 6.5 million complaints about videos or files in 2015; by 2019, it was receiving more than ten times as many.

PornHub helped to enable that growth. And, as an unfortunate byproduct of Section 230 of the Communications Decency Act (which is largely good, actually), there's not really any consequence for it, because, well, it's not PornHub's fault that someone used its technology to do something shitty. Or, in this case, that a lot of someones used its technology to repeatedly violate other people for profit. (As far as I'm aware, PornHub has generally complied with the law when it comes to removing illegal content; and certainly, those violating videos would still exist even if they couldn't be so easily distributed.)

Four days after the Times exposé, PornHub announced a new "Commitment to Trust and Safety" — the end result of an internal auditing process that they swear began earlier this year, back when we were all still giggling at their offers of free premium service for people in coronavirus quarantine (oh, how quaint the world was back then). Here's a summary:

In April 2020, we retained outside experts to independently review our compliance program and make recommendations that focus on eliminating all illegal content and achieving a "best-in-class" program that sets the standard for the technology industry.

Today, we are taking major steps to further protect our community. Going forward, we will only allow properly identified users to upload content. We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report. Full details on our expanded policies can be found below.

The link includes more specific details on these changes. I'm not sure how they compare to content moderation practices on other platforms, which are not without their flaws either. At first glance, it seems like a largely positive move, though it could also lead to the censorship of legitimate sexual content, much as Facebook took down anarchist pages alongside white supremacist ones.

For now, I look forward to seeing PornHub's first transparency report, and the results of its partnership with the National Center for Missing and Exploited Children, when they arrive next year.

The Children of Pornhub [Nicholas Kristof / The New York Times]

Our Commitment to Trust and Safety [PornHub]

Image: Public Domain via Pexels