PornHub Moderators Were Required To Assess 700 Videos Daily

Pornhub moderators were required to assess 700 videos daily. A former moderator for the adult website Pornhub has revealed that they were expected to review up to 700 videos a day, and sometimes even more.

Maxwell Canvas
Mar 17, 2023
The anonymous moderator, who worked for the company for two years, spoke out in the documentary 'Money Shot: The Pornhub Story', which explores the company's rise to prominence and the controversies that have surrounded it.

What Is The Role Of A Pornhub Moderator?

Pornhub is a popular adult website that was launched in 2007. It was founded by a group of Canadian developers who saw an opportunity to create a platform for adult content that was free and easily accessible.
Today, Pornhub is one of the largest adult websites in the world, with over 120 million daily visitors and millions of videos available for streaming.

The Moderation Process On The Platform

The moderation process on Pornhub is designed to ensure that all content on the site is legal, ethical, and meets community standards. This is done by a team of moderators who review every video that is uploaded to the platform.
When a user uploads a video to Pornhub, it is automatically scanned for potential violations such as underage content, non-consensual content, and content that violates copyright laws. If the video passes the initial scan, it is then sent to the moderation team for further review.

The Role Of Moderators In Assessing Videos

Moderators play a crucial role in assessing videos on Pornhub. Their primary responsibility is to ensure that all content on the platform meets community standards and is legal. This involves watching every video that is uploaded to the platform and checking it for potential violations.
The moderators are trained to identify and flag any content that violates the community standards or is illegal. This includes content that features minors, non-consensual acts, violence, and other forms of abuse.
If a video is found to violate community standards or is illegal, it is immediately removed from the platform.
In addition to assessing videos for violations, moderators also respond to user reports of potential violations. They investigate these reports and take appropriate action, which may include removing the video and/or banning the user who uploaded it.
Overall, the role of moderators on Pornhub is critical to ensuring that the platform remains a safe and ethical place for users to enjoy adult content.

Pornhub Moderators Were Required To Assess 700 Videos Daily

Pornhub moderators were required to assess 700 videos daily. This means that each moderator had to review, categorize, and flag at least 700 videos in a single day.
The requirement of assessing 700 videos daily was put in place to keep up with the enormous amount of content being uploaded to the platform every day.
Pornhub receives over 6 million uploads per year, and with such a large volume of content, the moderation team had to work quickly to ensure that all videos were checked for potential violations and removed if necessary.
The daily assessment requirement of 700 videos had a significant impact on the moderators who were responsible for reviewing the content.
The job was mentally and emotionally challenging, with moderators being exposed to graphic and potentially traumatic content on a daily basis.
Moderators reported feeling overwhelmed, stressed, and traumatized by the constant exposure to violent, non-consensual, and illegal content. They also reported a lack of support from their employers, inadequate training, and high turnover rates.
This requirement also made it difficult for moderators to take the time needed to properly assess each video.
With such a large number of videos to review each day, there was little room for error, and moderators were forced to work quickly, potentially leading to missed violations.

Pornhub Allegations Of Non-Consensual Content And Sex Trafficking Amid Moderation Challenges

The documentary explains that after launching in 2007, Pornhub became a major platform for erotic content creators, and the company made billions of dollars in revenue. However, it has also been accused of hosting non-consensual material and contributing to sex trafficking.

Money Shot: The Pornhub Story | Official Trailer | Netflix

The former moderator shared that every moderator had to review hundreds of videos per day, and sometimes more. They said, "Even if we thought that we were being diligent with our work, we would still miss a few videos every now and then."
According to Dani Pinter, senior legal counsel at the non-profit National Center on Sexual Exploitation, each moderator was tasked with viewing 800 to 1,000 videos per eight-hour shift.
Pinter said that it was impossible to review that many videos thoroughly, and many moderators were forced to fast-forward through videos or skip through them entirely.
The whistleblower also revealed that there was a backlog of videos that needed to be taken down, and some videos that violated the site's terms of service remained online for months. They said, "We don’t really go through them in time. Many videos that should have been taken down stayed up for months."
A spokesperson for MindGeek, the parent company of Pornhub, shared a statement defending the company's policies and claiming that they have "zero tolerance for illegal material." They also claimed that every user who uploads content to their platforms must provide a government-issued ID that passes third-party verification.
However, critics argue that the sheer volume of content on Pornhub makes it difficult to adequately moderate the site. Facebook, which is not primarily centered around sexually explicit content, has 15,000 moderators, while Pornhub reportedly had just 30 moderators at the time of the former moderator's employment.

People Also Ask

How Does Pornhub Ensure That Videos Featuring Underage Individuals Are Not Uploaded To The Site?

Pornhub uses a combination of automated scanning and manual moderation to ensure that videos featuring underage individuals are not uploaded to the site. Any videos that are suspected of containing underage content are immediately removed from the platform.

Are Moderators Of Pornhub Provided With Support And Resources To Help Them Cope With The Potentially Traumatic Content They Are Exposed To?

Reports suggest that moderators of Pornhub have not been provided with adequate support and resources to help them cope with the potentially traumatic content they are exposed to. This has led to concerns over the mental health of moderators.

How Does Pornhub Handle Revenge Porn And Non-Consensual Content?

Pornhub has a policy of removing all videos that feature revenge porn and non-consensual content. Users can also report such videos to the moderation team, who will investigate and remove the content if necessary.

Has Pornhub Faced Legal Action Related To Its Moderation Practices?

Yes, Pornhub has faced legal action related to its moderation practices. In February 2021, a class-action lawsuit was filed against the company, alleging that it had hosted and profited from non-consensual content.

How Long Does It Take For Pornhub To Remove Requested Content?

According to a whistleblower, there was a backlog of "six to eight months" of videos that were requested to be taken down, suggesting a lengthy process for content removal.

Has Pornhub Made Any Changes To Their Moderation Policies In Response To Criticism?

Yes, Pornhub has made changes to its moderation policies in response to criticism. In December 2020, the company announced that it would be banning unverified users from uploading content and removing all unverified content from the site.

Conclusion

As the documentary highlights, the fight against illegal content on the internet is a complex and ongoing issue. While companies like Pornhub may claim to have strict policies in place, the sheer scale of their operations means that monitoring content is an enormous challenge. As Pinter notes, "The rules constantly changed."
The fact that Pornhub moderators were required to assess 700 videos daily highlights the scale of the challenge in monitoring content on such platforms.
In addition to the complexity of monitoring content, the issue of illegal content on the internet is further complicated by the lack of consistent and enforceable legislation across different jurisdictions.
As a result, companies like Pornhub have faced criticism for not doing enough to prevent the spread of illegal content on their platforms.
Ultimately, it is up to companies like Pornhub and governing bodies to work together to create more effective and ethical moderation policies that prioritize the protection of vulnerable individuals and communities.