More than you might think. In this short but sweet presentation at TED, Eli Pariser reveals that the major search engines, content aggregators and portals are filtering what you see on an unprecedented level.
Pariser argues very effectively that we're passing the torch from human editors to computer algorithms that end up providing not what we should see, but what we want to see.
On the surface, this may seem like the ideal environment. This is the brand new semantic web that learns from us. What's wrong with having our experience of the Internet centered around our wants and desires?
The dark side we may be overlooking is that instead of creating a more semantically-organized Internet that draws us together through common interests, we are allowing these search algorithms to build a Web centered on isolation. Instead of creating connections, we create little pockets of space in which we operate, blind to what is really going on. Our filter determines our world, instead of the other way around.
Of course, this situation isn't unique to the Internet. It can happen in almost any kind of media, from the printed word to television, radio, and music. If we only take in what interests us, or what superficially conforms to our worldview, we'll invariably develop a kind of myopia toward anything that doesn't fit within our little bubbles. This is why it is so dangerous to get your news from a single source, for example.
On the Internet, however, our myopia can affect more than just ourselves. The biggest danger I see is that Google, Facebook, Yahoo News, and other content aggregators and media portals take this information about what we personally find interesting or acceptable and apply it to entire groups of people. The content provider thus becomes the content gatekeeper, and if something doesn't conform to the 'hive mind,' it doesn't get through.