PornHub war on porn

bosco

Administrator
Staff member
NEW YORK — Pornhub has released two statements about yesterday’s announcement by Visa and Mastercard that their cards would no longer be accepted on the platform, following an editorial by Nicholas Kristof published by the New York Times, which makes a number of allegations against the company.

Regarding Visa and Mastercard, Pornhub’s statement follows:

These actions are exceptionally disappointing, as they come just two days after Pornhub instituted the most far-reaching safeguards in user-generated platform history. Unverified users are now banned from uploading content — a policy no other platform has put in place, including Facebook, which reported 84 million instances of child sexual abuse material over the last three years. In comparison, the Internet Watch Foundation reported 118 incidents on Pornhub over the last three years.

This news is crushing for the hundreds of thousands of models who rely on our platform for their livelihoods.

Regarding the allegations by Kristof published by the New York Times, Pornhub’s statement follows:

Eliminating illegal content and ridding the internet of child sexual abuse material is one of the most crucial issues facing online platforms today, and it requires the unwavering commitment and collective action of all parties.

Due to the nature of our industry, people's preconceived notions of Pornhub's values and processes often differ from reality — but it is counterproductive to ignore the facts regarding a subject as serious as CSAM. Any assertion that we allow CSAM is irresponsible and flagrantly untrue. We have zero tolerance for CSAM. Pornhub is unequivocally committed to combating CSAM, and has instituted an industry-leading trust and safety policy to identify and eradicate illegal material from our community.

According to leading non-profits, advocates and third-party analyses, Pornhub’s safeguards and technologies have proven effective: while platforms intended to be family friendly like Facebook reported that it removed 84,100,000 incidents of CSAM over two and a half years, Instagram reported that it removed 4,452,000 incidents of CSAM over one and a half years, and Twitter reported that it suspended 1,466,398 unique accounts for CSAM over two years, the Internet Watch Foundation, the leading independent authority on CSAM, reported 118 incidents of CSAM on Pornhub in a three year period.

Pornhub has actively worked to employ extensive measures to protect the platform from such content. These measures include a vast team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and a variety of automated detection technologies. These technologies include:

CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online
Content Safety API, Google's artificial intelligence tool that helps detect illegal imagery
PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation
Vobile, a fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against banned videos being re-uploaded to the platform

Pornhub also attached the figures provided by the Internet Watch Foundation comparing CSAM found on other platforms:

Facebook: 84,100,000

Instagram: 4,452,000

Twitter: 1,466,398 (summed from the totals in the last two years of Twitter's transparency reports)

Pornhub: 118

Note the quote from the IWF in this piece: the Internet Watch Foundation (IWF), which identifies and removes child sexual abuse imagery online, said it found 118 cases of child abuse material on Pornhub from 2017 to 2019, but that this number was low and Pornhub quickly removed the content.

“Everyday sites that you and I might use as social networks or other communications tools, they pose more of an issue of child sexual abuse material than Pornhub does,” said IWF spokeswoman Emma Hardy.

Gustavo Turner for Xbiz
 

skylarmaexo

Jr. Member
Personally, I have a different take on this issue. Many adult industry companies are run by foolish people who don't want to see the truth. Arrogance and bragging always cost companies their payment processors. I saw this back in 2006, when I was an affiliate for rape porn sites and governments around the world shut them down, because they wanted to be too "extreme" with their content and aggressive with their marketing on MySpace. I knew they would get shut down, so I drained my affiliate accounts just in time.

Many top adult industry officials cozied up to extreme right-wing or left-wing groups. A lot of them bragged about it on forums such as GFY.com, only to be surprised when those people stabbed them in the back and made things harder for all of us. Also, Visa and Mastercard believe in approving "safe adult content" only. Pornhub knows this but wanted to tempt fate anyway.

Pornhub did this to themselves after years of allowing revenge porn and sex trafficking victims' videos on their site. They knew the anti-porn lawmakers were gunning for them, but their foolishness cost them their payment processor. They should have followed the same rules the payment processors gave me. Sad but true.
 