A filter bubble is a concept developed by Internet activist Eli Pariser in his book of the same name to describe a phenomenon in which websites use algorithms to selectively guess what information a user would like to see, based on information about the user such as location, past click behaviour and search history. As a result, websites tend to show only information that agrees with the user’s past viewpoints. Prime examples are Google’s personalized search results and Facebook’s personalized news stream. According to Pariser, users get less exposure to conflicting viewpoints and become intellectually isolated in their own informational bubbles. Pariser relates an example in which one user who searched Google for “BP” got investment news about British Petroleum, while another got information about the Deepwater Horizon oil spill, and the two search results pages were “strikingly different.” The bubble effect may have negative implications for civic discourse, according to Pariser, though there are contrasting views suggesting the effect is minimal and addressable.
Filter Bubble on Wikipedia.
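
To make the mechanism concrete, here is a minimal sketch of the kind of personalization loop described above, written in Python with hypothetical article data and function names (nothing here reflects Google’s or Facebook’s actual systems): items are ranked by how well their topics overlap with the user’s past clicks, so content that does not match earlier behaviour is steadily pushed down.

```python
from collections import Counter

# Hypothetical catalogue of articles, each tagged with topics.
ARTICLES = [
    {"title": "BP quarterly earnings beat expectations", "topics": {"finance", "energy"}},
    {"title": "Deepwater Horizon spill cleanup continues", "topics": {"environment", "energy"}},
    {"title": "New study on ocean ecosystems", "topics": {"environment", "science"}},
    {"title": "Stock market hits record high", "topics": {"finance"}},
]

def interest_profile(click_history):
    """Count how often each topic appears in the user's past clicks."""
    counts = Counter()
    for article in click_history:
        counts.update(article["topics"])
    return counts

def personalized_ranking(articles, click_history):
    """Rank articles by overlap with the user's inferred interests.

    Items matching past behaviour float to the top, so over time the
    user sees fewer of the topics they never clicked on: the feedback
    loop Pariser describes.
    """
    profile = interest_profile(click_history)

    def score(article):
        return sum(profile[topic] for topic in article["topics"])

    return sorted(articles, key=score, reverse=True)

if __name__ == "__main__":
    # A user who has only ever clicked finance stories sees finance first;
    # the spill coverage sinks toward the bottom of their feed.
    finance_reader_history = [ARTICLES[3], ARTICLES[3], ARTICLES[0]]
    for article in personalized_ranking(ARTICLES, finance_reader_history):
        print(article["title"])
```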