Pornhub’s Instagram Account Taken Down After Concerns About Site Content—Here’s What We Know


The topline

Instagram has removed the official account of Pornhub—one of the world’s most popular adult sites—from its platform, the latest escalation as tech giants crack down on purveyors of adult content amid mounting public pressure and allegations of widespread abuse and illegal content online.

The Key Facts

Pornhub’s Instagram account had some 13.1 million followers and more than 6,200 posts when it was removed, according to Variety, which first reported the news.

The account did not post pornographic content to Instagram—which would clearly violate the platform’s no-nudity rule—and it is not immediately clear why Pornhub’s account was removed.

Screenshots shared on Twitter by an anti-Pornhub campaigner indicate the account was removed for violating Instagram’s community guidelines following a report filed on June 1.

The report has not been independently verified, and it does not specify which policies were broken, when the violations occurred or what content prompted the platform to take action.

Meta, Instagram’s parent company, and Pornhub did not immediately respond to SME’s request for confirmation or comment.

What To Watch For

Whether another tech company takes action against Pornhub. Pornhub maintains highly popular Twitter and YouTube accounts, with 3.4 million followers and 882,000 subscribers, respectively. SME has contacted Twitter and Google (which owns YouTube) for comment.

The Key Background

Big tech platforms are under increasing pressure to address the flood of illegal online content, including material such as child sexual abuse imagery and revenge porn, which has proliferated in recent years. Companies have taken steps to clean up their platforms, banning users and refusing to host offending material, but the problem remains far from solved. Pornhub has been the target of numerous complaints about illegal content, and its top executives resigned from the company. OnlyFans, the biggest adult creator website, has reportedly struggled to prevent underage users from making or selling explicit videos. The Verge recently found that Twitter executives know the company would need to make huge investments to eliminate illegal content, but have not taken the necessary steps. Visa—which, alongside Mastercard, has suspended payments for ad purchases on Pornhub—has been accused of knowingly facilitating the spread of child pornography on Pornhub as part of a lawsuit against the site’s parent, MindGeek (the payment firm strongly disputes the allegation).


TikTok’s content moderators are trained on uncensored, sexually explicit images of children, former contractors told SME. Former employees of companies contracted by the platform to assess its content said the material—alongside other content deemed to violate the platform’s policies, such as images of children being abused—was freely available to hundreds of people.

Additional Reading

TikTok Moderators Are Being Trained Using Graphic Images Of Child Sexual Abuse (SME)

How Twitter’s Child Porn Problem Ruined Its Plans For An Onlyfans Competitor (The Verge)

Shadowbanning Is Big Tech’s Big Problem (The Atlantic)

Inside The Dark World Of Trading Nudes (BBC)
