During a May 2011 Technology, Entertainment, Design (TED) talk, Pariser called this phenomenon the “invisible algorithmic editing of the web.” He warned that it results in a situation where “the internet shows us what it thinks we want to see, but not necessarily what we need to see,” adding: “Instead of a balanced information diet, [we] end up with information junk food.”
In the early days of the internet, it was believed that the web would widen our connections with the world and expose us to new perspectives, Pariser said: Instead of being limited to the newspapers, books, and other writings available in our local communities, we would have access to information from all over the globe. But thanks to these new search-engine formulas, he said, the internet is instead coming to represent “a passing of the torch from human gatekeepers [of information] to algorithmic ones.”
Yet algorithms don’t have the kind of embedded ethics that human editors have, he noted. If algorithms are going to curate the world for us, then “we need to make sure they’re not just keyed to [personal] relevance—they also should show us things that are important, or challenging, or uncomfortable.”
The new web-search rules affect not just how we perceive the world, but also how we shape it.
If students, researchers, and educators want their writings, videos, websites, and other online works to appear near the top of an internet search, they’ll have to understand how these new rules work in order to take advantage of them, says Angela Maiers, an educator, writer, consultant, and professional trainer. And that’s important, she says, because web surfers rarely click beyond the first or second page of search results.