The near-total dominance of computer search over our information gathering has presented our culture with an interesting (and possibly unique) problem in the history of information management: what you see when you search for something can be quite different from what I see when I search for the same thing. This is because search engines (and other recommendation services, like the suggestions at Amazon and Netflix) monitor our behavior and tailor the results we see to their records of our past behavior. In other words, the search results we are served are based on what we have clicked on in the past, and the movie and book recommendations we receive are limited by our previous purchases. As Siva Vaidhyanathan puts it in The Googlization of Everything, search technologies may be making the world’s information “universally accessible,” as Google’s motto promises, but they are not “making universal knowledge universally accessible” (p. 139). That is, we are living in a time when your knowledge and my knowledge, shaped by the search results we are served, may be very different from each other.
From the perspective of a software engineer, this personalization makes sense: Google is in the business of selling advertising, and this business is best served by giving users what they want so that they will continue to visit the site. However, from the perspective of a researcher, it can be disturbing to know that the results of a particular research session depend not only on how one structures a search query, but also on the entire history of one’s previous interactions with the Web. Vaidhyanathan suggests a possible outcome of this kind of search: that it will hinder users from encountering “the unexpected, the unknown, the unfamiliar, and the uncomfortable” (p. 183); in other words, it will place us in what Cass Sunstein calls “information cocoons” that exclude what our actions have indicated we don’t wish to see.
Sunstein’s argument is referenced in Evgeny Morozov’s review of The Filter Bubble: What the Internet Is Hiding from You, a new book by Eli Pariser that deals with the issue of personalization. While Morozov isn’t completely satisfied with Pariser’s book, he does praise it for encouraging debate over this issue, one that I think is incredibly important for the future of digital literacy. The ability to search “universally accessible” information, and the subsequent personalization of that search experience, represent a new development in how we as a society deal with information.
According to Morozov (disclosure: I haven’t read The Filter Bubble yet), Pariser suggests a number of solutions to this problem, most of which seem driven by regulation and top-down changes in the Internet ecosystem. While I’m sympathetic to this response in some ways—systemic, structural issues often need to be dealt with in systemic, structural ways—I’m less hopeful that any of Pariser’s suggestions will prove viable. What can work, however, are user-based interventions. As Morozov points out, users can turn off many (but not all) of Google’s personalization features.
To put it another way, our students need to be as search-literate as Morozov appears to be. They need to know how the tools they use shape the information they see, as well as how to circumvent the effects of those tools when necessary. In short, search personalization is another example of why it is important to teach digital literacy, particularly to young students. Digital literacy isn’t an add-on to their education but, just as with textual literacy, an integral part of how they see the world.
Banner image credit: Ian Lewis http://www.flickr.com/photos/ianlewis/69886573/