Google to Revise Search Rankings to Downplay Fake News, Hate Speech


The announcement in a blog post Tuesday reflects Google's confidence in a new screening system created to reduce the chances that its influential search engine will highlight untrue stories about people and events, a phenomenon commonly referred to as "fake news".

Ben Gomes, Google's vice president of engineering for search, says the phenomenon of fake news, which spreads misleading, low-quality, offensive or outright false information, is different from issues the company has faced in the past, but that Google's aim is the same: "to provide people with access to relevant information from the most reliable sources". The company has not only changed its algorithms but also introduced a new feedback system that collects users' opinions about search results.

Google's move to fight back against "low-quality" content will also reportedly give users the ability to flag and report offensive autocomplete search suggestions.

Danny Sullivan, founder of the Search Engine Land news site, said the changes made sense and should not be taken to suggest that Google's algorithms were failing to correctly index what they found online. While Google's human quality raters don't affect search results in real time, they do provide feedback on whether the changes to the algorithms are working, Gomes wrote.

In addition, Google is adding new public feedback tools for the Autocomplete feature in its search bar. In recent months, a steady stream of examples had begun to cast doubt on the quality of Google's search results.


Google also rewrote its 140-page book of rating guidelines that help the quality-control evaluators make their assessments. Fortune reported that a query for "did the Holocaust happen" returned an unexpected first result: a page titled "Top 10 reasons why the Holocaust didn't happen". Results like these surface during a search session, where Google tries to offer the most authoritative answer while it is in the process of finding your answer.

Google is using human evaluators to assess the quality of search results and to identify areas where it needs to improve.

While Project Owl is one of several steps Google has taken to address the problem of fake news and offensive content, it is too early to tell whether it has succeeded or will succeed. "These new feedback mechanisms include various clearly labeled categories so you can inform us directly if you find critical or unhelpful content," Gomes wrote.

Over the last few months, Google, along with Facebook and other digital platforms, has struggled to keep hoaxes and fake news stories from appearing prominently in search. The feedback gathered through the new tools will be used to improve Google's algorithms.
