“Filter bubble” refers to the growing difficulty individuals face in encountering perspectives other than their own: search and recommendation algorithms maximize clicks and engagement by prioritizing results from sources or people you already trust, and Google is no exception. The result is clear:
Pariser used the example of two people who googled the term “BP”. One received links to investment news about BP, while the other received links about the Deepwater Horizon oil spill, presumably the result of a personalization algorithm.
Technology Review points to research at Yahoo Labs that aims to counter this. The researchers recommended tweets expressing opposing perspectives on specific issues to selected users, and measured engagement through clicks.
[B]ecause this is done using their own interests, they end up being equally satisfied with the results (although not without a period of acclimatisation)…
The results show that people can be more open than expected to ideas that oppose their own. It turns out that users who openly speak about sensitive issues are more open to receiving recommendations authored by people with opposing views, say Graells-Garrido and co.
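The core idea, recommending opposing-view content that still overlaps with a user’s own interests, can be sketched roughly as follows. This is a hypothetical illustration, not the paper’s actual algorithm; the data, field names, and the Jaccard-overlap scoring are all assumptions made for the example.

```python
# Hypothetical sketch: recommend items that take the opposite stance on an
# issue but share topics with the user's interests, so the recommendation
# still feels relevant rather than alien.

def interest_overlap(user_interests, item_topics):
    """Jaccard overlap between a user's interest set and an item's topic set."""
    union = user_interests | item_topics
    return len(user_interests & item_topics) / len(union) if union else 0.0

def recommend_opposing(user, items, top_k=3):
    """Rank opposite-stance items by how much they overlap the user's interests."""
    candidates = [it for it in items if it["stance"] != user["stance"]]
    candidates.sort(
        key=lambda it: interest_overlap(user["interests"], it["topics"]),
        reverse=True,
    )
    return candidates[:top_k]

# Toy data: a "pro" user and a mix of pro/anti items on some issue.
user = {"stance": "pro", "interests": {"economics", "energy", "politics"}}
items = [
    {"id": 1, "stance": "anti", "topics": {"energy", "environment"}},
    {"id": 2, "stance": "anti", "topics": {"sports"}},
    {"id": 3, "stance": "pro",  "topics": {"economics"}},
    {"id": 4, "stance": "anti", "topics": {"economics", "politics"}},
]

for it in recommend_opposing(user, items, top_k=2):
    print(it["id"])  # item 4 first (most shared interests), then item 1
```

The point of the overlap score is the quote’s “because this is done using their own interests”: opposing content is filtered in, but ranked by familiarity.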
I’m glad that this approach is working, and I would definitely like to see it implemented in more social networks; it is striking how biased people can become despite being connected to the internet.