A filter bubble is the tendency for personalization of information to lead an individual to information sources that strictly conform to their world view. This potentially allows one-sided views of information to thrive over neutral accounts that explore both sides of complex issues. The following are illustrative examples.
Personalized Search
A search engine may try to guess the articles and media outlets that a person will find most interesting based on algorithms that learn from the user's search history. As a theoretical example, an individual who searches for "environment" may receive a top result that lists environmental issues. Another individual who searches for "environment" may be shown an article discussing the costs of environmental regulations to businesses.
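The mechanism behind this can be illustrated with a minimal sketch. The function and data below are purely hypothetical, assuming a simple model where results are scored by how often their topic terms appear in the user's past queries; real search engines use far more complex signals.

```python
from collections import Counter

def rank_results(results, history):
    """Rank search results by overlap with terms from the user's search history.

    results: list of (title, terms) pairs, where terms is a set of topic keywords.
    history: list of the user's past query strings.
    Illustrative only -- not any real engine's algorithm.
    """
    # Count how often each word appears across the user's past queries.
    term_counts = Counter(word for query in history for word in query.lower().split())
    # Score each result by how strongly its topics match the user's history.
    scored = [(sum(term_counts[t] for t in terms), title) for title, terms in results]
    return [title for score, title in sorted(scored, reverse=True)]

results = [
    ("Environmental issues and climate", {"climate", "issues", "environment"}),
    ("Cost of environmental regulation to business", {"business", "regulation", "cost"}),
]
green_user = ["climate change news", "climate issues"]
biz_user = ["regulation cost", "business regulation news"]
```

With these histories, the first user's top result is the environmental-issues article and the second user's is the business-cost article, even though both issued the same query. Each user's past behavior feeds back into what they see next, which is the bubble effect.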
Social Media
People have a tendency to follow media outlets that conform to their views on issues. Media outlets may have an incentive to be extremely consistent on issues in order to gain and retain followers who agree with them. This may be detrimental to journalism ethics and standards that traditionally valued presenting both sides of every story.
Personalized Information Streams
Personalized information streams on social media and other platforms may use information such as the articles shared by friends to generate streams of content. Such streams may have a tendency to conform to an individual's views on issues.
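A stream like this can be sketched in a few lines. The function and inputs below are hypothetical, assuming the only ranking signal is how many of the user's friends shared a post; real feed-ranking systems weigh many more signals.

```python
def build_feed(candidate_posts, friend_shares, limit=2):
    """Order candidate posts by how many of the user's friends shared them.

    candidate_posts: list of post ids.
    friend_shares: dict mapping post id -> set of friends who shared it.
    Illustrative only -- not any real platform's algorithm.
    """
    return sorted(candidate_posts,
                  key=lambda post: len(friend_shares.get(post, set())),
                  reverse=True)[:limit]

feed = build_feed(["a", "b", "c"], {"a": {"f1"}, "b": {"f1", "f2"}})
```

Because friends tend to share content that matches the group's views, ranking by friend shares naturally surfaces like-minded content and pushes dissenting items below the cutoff.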
© 2010-2023 Simplicable. All Rights Reserved. Reproduction of materials found on this site, in any form, without explicit permission is prohibited.