Google takes steps to block ‘fake news,’ omit derogatory ‘autocomplete’ suggestions
The Google logo is seen at the company’s offices in Brussels on March 23, 2010. (Virginia Mayo / Associated Press)
Google has sprinkled some new ingredients into its search engine in an effort to prevent bogus information and offensive suggestions from souring its results.
The changes have been in the works for four months, but Google hadn’t publicly discussed most of them until now. The announcement in a blog post Tuesday reflects Google’s confidence in a new screening system designed to reduce the chances that its influential search engine will highlight untrue stories about people and events, a phenomenon commonly referred to as “fake news.”
“It’s not a problem that is going to go all the way to zero, but we now think we can stay a step ahead of things,” said Ben Gomes, Google’s vice president of engineering for search.
Besides taking steps to block fake news from appearing in its search results, Google also has reprogrammed a popular feature that automatically tries to predict what a person is looking for as a search request is being typed. The tool, called “autocomplete,” has been overhauled to omit derogatory suggestions, such as “are women evil,” as well as recommendations that promote violence.
Google also is adding a feedback option that will enable users to complain about objectionable autocomplete suggestions so a human can review the wording.
Facebook, whose social network has been a conduit for widely circulated fake news stories and other hoaxes, also has been trying to stem the tide of misleading information by working with the Associated Press and other news organizations to review suspect stories and set the record straight when warranted. Facebook also has given its nearly 2 billion users ways to flag posts believed to contain false information, something that Google is now allowing users of its search engine to do for some of the news snippets featured in its results.
Google began attacking fake news in late December after several embarrassing examples of misleading information appeared near the top of its search engine. Among other things, Google’s search engine pointed to a website that incorrectly reported that then-President-elect Donald Trump had won the popular vote in the U.S. election, that President Obama was planning a coup and that the Holocaust never occurred during World War II.
Only about 0.25% of Google’s search results were being polluted with falsehoods, Gomes said. But that was still enough to threaten the integrity of a search engine that processes billions of search requests per day largely because it is widely regarded as the Internet’s most authoritative source of information.
“They have a lot riding on this, reputation-wise,” said Lucy Dalglish, who has been tracking the flow of false information as dean of the University of Maryland’s journalism department. “If your whole business model is based on turning up the best search results, but those results turn up stuff that is total crap, where does that get you?”
To address the problem, Google began revising the closely guarded algorithms that generate its search results, with the help of 10,000 people who rate the quality and reliability of the recommendations during tests. Google also rewrote its 140-page book of rating guidelines that help the quality-control evaluators make their assessments.
Fighting fake news can be tricky because in some cases what is viewed as being blatantly misleading by one person might be interpreted as being mostly true by another. If Google, Facebook or other companies trying to block false information err in their judgment calls, they risk being accused of censorship or playing favorites.
But doing nothing to combat fake news would probably have caused even bigger headaches.
If too much misleading information appears in Google’s search results, the damage could go beyond harm to its reputation for reliability. It could also spook risk-averse advertisers, who don’t want their brands tied to content that can’t be trusted, said Larry Chiagouris, a marketing professor at Pace University in New York.
“Fake news is careening out of control in some people’s eyes, so advertisers are getting very skittish about it,” Chiagouris said. “Anything Google can do to show it is trying to put a lid on it and prevent it from getting out of hand, it will be seen as a good thing.”
Although it also sells ads on its other services and independently owned websites, Google still makes most of its money from the marketing links posted alongside its search results. Google says its new approach isn’t meant to placate advertisers.