“Somewhere within Google there exists a master list of ‘bad words’ and evil concepts that Google Instant is programmed to not act upon, lest someone see something offensive in the instant results ... even if that's exactly what they typed into the search bar. We call it Google Blacklist,” reads the introduction to the list, which was compiled by hacker magazine 2600.
Google Instant blocks certain words: the search engine stops autocompleting them and won’t predict or display results until the user hits Enter.
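To make that behavior concrete, here is a minimal sketch in Python of how a suggestion endpoint could suppress predictions once the typed text matches a blocked entry. The `BLOCKED_TERMS` set and the `suggest` function are hypothetical stand-ins for illustration, not Google’s actual code.

```python
# Hypothetical Instant-style suppression: once the typed prefix contains
# a blocked term, return no predictions at all. Results then appear only
# after the user submits the full query by hitting Enter.

BLOCKED_TERMS = {"example_blocked_word"}  # placeholder, not the real list

def suggest(prefix: str, completions: list[str]) -> list[str]:
    """Return autocomplete predictions, or nothing if the prefix is blocked."""
    normalized = prefix.strip().lower()
    # Suppress all predictions if the typed text contains a blocked term.
    if any(term in normalized for term in BLOCKED_TERMS):
        return []
    return [c for c in completions if c.lower().startswith(normalized)]

print(suggest("example_blocked_word", ["example_blocked_word pics"]))  # []
print(suggest("exam", ["example query"]))  # ['example query']
```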
According to reports, even mild words like “lesbian” or “butt” fail to appear; Google responded that such terms are blocked to keep possibly pornographic or offensive results from being shown to children.
"There are a number of reasons you may not be seeing search queries for a particular topic,” a Google representative said in a statement. “Among other things, we apply a narrow set of removal policies for pornography, violence, and hate speech. It's important to note that removing queries from Autocomplete is a hard problem, and not as simple as blacklisting particular terms and phrases.
In search, we get more than one billion searches each day. Because of this, we take an algorithmic approach to removals, and just like our search algorithms, these are imperfect. We will continue to work to improve our approach to removals in Autocomplete, and are listening carefully to feedback from our users.”
The representative added that the removal algorithms evaluate compound queries built from individual search terms, and that they operate across all languages.
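A rough illustration of why that is harder than matching single entries: in the toy check below (a hypothetical sketch, not anything Google has published), two individually harmless tokens can combine into a query that warrants removal, so the filter has to evaluate combinations rather than look up one word at a time.

```python
# Toy compound-query check: individually harmless tokens can form a
# blocked combination. Both word lists are hypothetical placeholders.

BLOCKED_SINGLE = {"blocked_word"}
BLOCKED_PAIRS = {("harmless_word", "other_word")}  # blocked only together

def is_removed(query: str) -> bool:
    """Return True if the query should be removed from Autocomplete."""
    tokens = query.lower().split()
    if any(t in BLOCKED_SINGLE for t in tokens):
        return True
    # Check every adjacent pair of tokens against blocked compounds.
    return any((a, b) in BLOCKED_PAIRS for a, b in zip(tokens, tokens[1:]))

print(is_removed("harmless_word alone"))       # False
print(is_removed("harmless_word other_word"))  # True, compound match
```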
For those concerned about what searches return, Google offers SafeSearch, which filters out potentially offensive results after the user hits Enter, so that only innocuous results appear on Google’s first page.
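The distinction from Instant’s behavior is timing: SafeSearch filters results after the query runs, rather than suppressing predictions before it. A minimal sketch of that post-query stage, with a hypothetical `explicit` flag standing in for whatever classifier Google actually uses:

```python
# Hypothetical post-query filter in the SafeSearch style: the query runs
# normally, and flagged results are dropped before the page is rendered.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    explicit: bool  # placeholder for a real content classifier

def safe_search(results: list[Result]) -> list[Result]:
    """Keep only results not flagged as explicit."""
    return [r for r in results if not r.explicit]

page = safe_search([Result("innocuous page", False), Result("flagged page", True)])
print([r.title for r in page])  # ['innocuous page']
```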
2600’s Google Blacklist can be viewed here.