Downside of Google's Perceived Omniscience

July 21, 2016


There is a concept at work on the Internet called a “filter bubble”: the Internet caters to your own interests, and instead of showing you everything from many viewpoints, it limits your content to the things that suit your taste. The same concept shows up on social media, for example in Facebook’s block function and Twitter’s ability to follow and unfollow accounts.

At first glance, this seems great. After all, we want to socialize with people who share our mindset; that’s human nature. But we also lose something very important: the chance to grow in knowledge by hearing one another’s ideas. That is not always pleasant, especially when we are proven wrong, but wouldn’t it be better to actually know the truth?

The problem with Google is that it supposedly gives you the best information out there, but because of this filter bubble, it really gives you the best information that you want to see. Its algorithm reads your cookies and your search history, gets to know you little by little, and learns to predict the results you desire. Take, for example, the debate over whether George W. Bush sanctioned the 9/11 attacks: if Google’s algorithm figures out that you don’t believe airplane fuel burns hot enough to weaken steel beams, it won’t show you the Scientific American article that debunked that claim.
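To make the idea concrete, here is a minimal, purely illustrative sketch of how a personalized re-ranker could quietly create a bubble. This is not Google’s actual algorithm; every name in it (the results list, the click history, the boost factor) is a made-up assumption for demonstration.

```python
# Illustrative sketch only -- NOT Google's real ranking system.
# Idea: sources the user has clicked before get boosted, so a more relevant
# dissenting source can quietly sink below a familiar one.

from collections import Counter

def personalized_rank(results, click_history):
    """Re-rank results so domains the user has clicked before rise.

    results       -- list of (url, base_relevance) pairs from a generic ranker
    click_history -- list of domains the user has clicked in the past
    """
    clicks = Counter(click_history)

    def score(item):
        url, base_relevance = item
        domain = url.split("/")[2]           # crude domain extraction
        boost = 1.0 + 0.5 * clicks[domain]   # familiar sources get boosted
        return base_relevance * boost

    return sorted(results, key=score, reverse=True)

# Hypothetical user who only ever reads one conspiracy-friendly site:
history = ["conspiracy.example"] * 3
results = [
    ("https://www.scientificamerican.com/debunking-article", 0.9),  # more relevant
    ("https://conspiracy.example/steel-beams", 0.6),
]
print(personalized_rank(results, history))
# The debunking article ends up below the familiar source despite having the
# higher base relevance -- that, in miniature, is the filter bubble.
```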

According to a study by Dr. Robert Epstein, a Harvard-trained psychologist and former editor-in-chief of Psychology Today, search engine results can affect the outcome of an election. The study showed that a close election can quickly swing toward one candidate simply by ranking results that favor him or her higher on the results page.

When we do a search, we tend to think of Google as an omniscient program that knows everything and can tell us everything in an unbiased fashion. But is it? Google recently announced that rather than ranking links based on other links, it is coming up with its own method for determining the quality and accuracy of a website. This is a scary prospect, and Dr. Epstein puts it perfectly into words: “That implies that a company that is accountable to nobody but their shareholders are going to make decisions on what is true and false.”
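The “current method” mentioned above is link-based ranking in the spirit of PageRank: a page matters if pages that matter link to it. The toy sketch below shows that idea on a made-up three-page link graph; it is a simplification for intuition, not the production algorithm, and the graph and parameter values are assumptions.

```python
# Toy link-based ranking (PageRank-style): importance flows along links.
# Simplified sketch with a hypothetical link graph, not Google's real system.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: C is linked to by both A and B, so it ranks highest.
graph = {"A": ["C"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(graph))
```

The point of the contrast is that this kind of ranking is at least mechanical and inspectable; a score for “truth” decided internally by one company would not be.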

We have three options in this situation: (1) ask Google to publicly disclose its new algorithm so people can study it and understand how it works; (2) ask Google to display both filtered (personalized) and unfiltered search results; or (3) simply trust that Google will not take advantage of us.

What are your thoughts on this?

If you want to read more on the topic, you can click here.