The new guidelines specifically call out the kinds of misleading information, such as "unexpected offensive results, hoaxes and unsupported conspiracy theories", that have drawn criticism and anger when returned for relatively innocuous queries by Google Search users.
In addition, Google has made changes to its search signals to surface more accurate content in its results and demote low-quality content.
Starting today, a user who spots an offensive autocomplete result will be able to flag it for Google's engineers to review. The American company has also updated the guidance it gives to the employees who evaluate the quality of its search results. Google executives claimed the number of web pages falling into this bucket is relatively small, which is one reason the search giant hadn't addressed the issue before.
The company's main search product has similarly been accused of spreading extremism: in late 2016, a search for "did the Holocaust happen" gave, as its first result, a link to a Holocaust-denial page on the racist website Stormfront.
In particular, Gomes said, many groups and organisations were using "fake news" to help spread "blatantly misleading, low quality, offensive or downright false information". Google has also lost lawsuits in Japan and Germany over its search suggestions.
Since the United States election, several companies, including Google and Facebook, have taken steps to deal with legitimate-looking false information being passed through their products. The flagging system has limits by design: if Google guaranteed that flagging content would remove a search result, unethical users could wield a "banhammer" to block content they didn't like or to favour their own. And while the quality evaluators don't affect search results in real time, they do provide feedback on whether the changes to the algorithms are working, Gomes wrote.
Ben Gomes, vice president of engineering at Google Search, admits that people have been trying to "game" the system, working against the spirit of its algorithms to push poor-quality content and fake news higher up search results.
Only about 0.25 percent of Google's search results were being polluted with falsehoods, Gomes said.
Questions are bound to be raised about whether this panel, which Google says is representative of its users, is impartial and objective.
Google will also enlist the help of users, allowing them to flag faulty Featured Snippets and Autocomplete predictions.
Content problems with Google's featured snippets may be even more serious.