Google Enlists Humans To Assist Its Algorithm With Offensive Content

As computers get smarter and smarter, big tech names like Google are starting to realize that there are some things only humans can do, like identify a fake news story. The search giant is now turning back to human beings to tag content that people may find offensive because, after all, computer algorithms don't have feelings and can't understand what might upset people.

Google hires independent contractors

As massive as Google is, it still uses independent contractors. The search giant works with about 10,000 independent contractors who hold the title of "quality rater," according to Search Engine Land. These contractors are given searches to conduct based on actual queries and are then tasked with scoring the results against the company's provided guidelines.

On Tuesday, a new category was added to the list of things contractors are asked to flag: "upsetting" or "offensive" content, such as hate speech or violence against a specific group of people. Other types of content included in this category are racial slurs, offensive terms, graphic images of child abuse or cruelty to animals, and explicit details about activities like human trafficking.

The point of all this is to surface helpful information from sites that can be trusted, especially for searches like "Did the Holocaust happen?" Clearly, some websites covering a topic like this are run by operators who would use hate speech or spread lies about it.

Google also concerned about fake news

This seems like a different version of the realization Facebook came to recently, when its algorithms were picking up fake news and circulating it just like any other content. However, Google is avoiding the term "fake news," which a Google engineer told Search Engine Land the company considers "too vague." Instead, it is targeting "demonstrably inaccurate information" and trying to keep it from crowding factual information out of the top search results.

The contracted "quality raters" can't actually make direct changes to the search results. They just rate the pages that show up toward the top of the results in terms of quality. When one of them marks a result as low quality, the webpage doesn't automatically plummet in the search rankings, according to Search Engine Land. Instead, the data they produce is used to improve the search algorithms.

Michelle Jones
Editor
