
Content Moderators Must Be Protected And Work In A Safe Environment


Content moderators at tech leaders Facebook, Inc. (NASDAQ:FB) and Cognizant Technology Solutions Corp (NASDAQ:CTSH) are breaking their NDAs to expose shocking working conditions, while one investigative reporter portrayed content moderators as being traumatized at “sweatshops in America.”

One content moderation company, WebPurify, with offices in Irvine, California, and Hyderabad, India, says it is starting to see a shift as companies more carefully consider the working conditions of their moderation partners.



WebPurify provides content moderation expertise for many of the world’s most respected organizations and Fortune 500 companies, including top e-commerce platforms, children’s sites, dating sites, social sharing platforms, and gaming apps.

Joshua Buxbaum, co-founder of WebPurify, says that content moderators provide an essential service to the online community and deserve better.

“A safe and supportive work environment has always been a core value of our business, so we are pleased to see companies beginning to focus on work conditions vs. solely looking at cost,” he says.

WebPurify’s content moderation team takes pride in its work and recognizes that it is on the front lines protecting the public.

A company’s image also hangs in the balance.

Buxbaum says his team often sees terrible things and bears the brunt of it to protect strangers on the other side of the screen.

However, moderators are not too proud to take advantage of the various mental health programs WebPurify has in place.

He says moderators are often rotated to less severe projects to take a break from seeing the “bad stuff” every day.

“We realize this is a unique and hazardous job and at WebPurify we’ve built the safest environment we could to protect our moderation team best,” he says.

Buxbaum points out that the cost of moderation is a difficult pill for some businesses to swallow because the expense doesn’t translate directly to profits.

AI alone isn’t the answer to sparing moderators the risk of psychological trauma, because the technology, used without live moderation, often comes up short when reviewing user-generated content to keep brands protected.
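For readers wondering what pairing AI with live moderation actually looks like, the sketch below is a minimal, purely illustrative example, not WebPurify’s system; the function names and thresholds are hypothetical. An automated classifier clears obviously safe or obviously violating items, and the ambiguous middle band, where AI alone comes up short, is routed to human moderators.

```python
# Hypothetical hybrid-moderation routing sketch (illustrative only).
from dataclasses import dataclass


@dataclass
class Submission:
    item_id: str
    text: str


def ai_risk_score(item: Submission) -> float:
    """Placeholder for an automated classifier returning a 0-1 risk score.
    A real system would call a trained model or moderation API here."""
    return 0.5


def route(item: Submission, approve_below: float = 0.2, reject_above: float = 0.9) -> str:
    """Auto-approve clearly safe items, auto-reject clear violations,
    and send everything in between to a human review queue."""
    score = ai_risk_score(item)
    if score < approve_below:
        return "auto-approve"
    if score > reject_above:
        return "auto-reject"
    return "human-review"  # the cases AI alone cannot settle


if __name__ == "__main__":
    print(route(Submission("123", "example user comment")))
```

In a setup like this, the human queue never disappears; tightening the automatic thresholds only shifts more borderline content onto moderators, which is why their working conditions remain central.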

The result is that some companies seek the most inexpensive solution while disregarding the quality of the reviews and the working conditions of their moderators.

Buxbaum says he’s pleased with the shift in the past year, and as the media focuses on this issue, companies are now asking potential moderation partners the critical questions.

“We’re happy to field questions about the details of our mental health program for our moderators, requests for a tour of our facilities, or requests for information about our quality control measures,” he says.

He adds that it’s ironic: the purpose of a moderation team is to protect a company’s reputation, yet by disregarding those moderators’ working conditions, some companies see their reputations suffer tremendously.

Article by WebPurify
