The Filter Bubble: how personalization changes society

MoveOn board president Eli Pariser's new book The Filter Bubble: What the Internet Is Hiding from You is a thoughtful, often alarming look at the dark side of Internet personalization. Pariser is concerned that invisible "smart" customization of your Internet experience can make you parochial, exploiting your cognitive blind spots to make you overestimate the importance or prevalence of certain ideas, products and philosophies and underestimate that of others. In Pariser's view, invisible, unaccountable, commercially driven customization turns into a media-bias-of-one, an information system that distorts your perception of reality. Pariser doesn't believe that this is malicious or intentional, but he worries that companies with good motives ("let's hide stuff you always ignore; let's show you search results similar to the kinds you've preferred in the past") and bad ("let's spy on your purchasing patterns to figure out how to trick you into buying stuff that you don't want") are inadvertently, invisibly and powerfully changing the discourse.

Pariser marshals some good examples and arguments in favor of this proposition. Students whose teachers believe they are stupid end up acting stupid -- what happens when the filters decide we're dumb, or smart, or athletic, or right wing, or left wing? He cites China and reiterates the good arguments we've heard from the likes of Rebecca MacKinnon: that the Chinese politburo gets more political control by shaping which messages and arguments you see (through paid astroturfers) than by mere censorship of the Internet. Pariser cites research from cognitive scientists and behavioral economists on how framing and presentation can radically alter our perception of events. Finally, he convincingly describes how a world of messages that you have to consciously tune out is different from one in which the tuning out is done automatically -- for example, if you attend a town hall meeting in which time is taken up with discussion of issues that you don't care about, you still end up learning what your neighbors care about. This creates a shared frame of reference that strengthens your community.

Pariser also points out -- correctly, in my view -- that filtering algorithms are editorial in nature. When Google's programmers tweak and modify their ranking algorithm to produce a result that "feels" better (or that users click on more), they're making an editorial decision about what sort of response they want their search results to evince. Putting more-clicked things higher up is an editorial decision: "I want to provide you with the sort of information whose utility is immediately obvious." And while this is, intuitively, a useful way to present stuff, there's plenty of rewarding material whose utility can't be immediately divined or described (I thought of Jonah Lehrer's How We Decide, which describes an experiment in which subjects who were asked to explain why they liked certain pictures made worse choices than ones who weren't asked to explain their preferences). When we speak of Google's results as being driven by "relevance," we act as though there was a platonic, measurable, independent idea of "relevance" that was separate from judgment, bias, and editorializing. Some relevance can't be divined a priori -- how relevant is an open window to Fleming's Petri dish?

There were places where I argued with Pariser's analysis, however. On the one hand, Pariser's futures are too fanciful: "What if augmented reality as presently practiced by artists and futurists becomes commonplace?" On the other hand, Pariser's futures are too static: He presumes a world in which filtering tools become increasingly sophisticated, but anti-filtering tools (ad-blockers, filter-comparison tools, etc) remain at present-day levels. The first wave of personalization in the Web was all about changing how your browser displayed the information it received; the trend to modular, fluid site-design built around XML, CSS, DHTML, AJAX, etc, makes it even more possible to block, rearrange, and manage the way information is presented to you. That is, even as site designers are becoming increasingly sophisticated in the way they present their offerings to you, you are getting more and more power to break that presentation, to recombine it and filter it yourself. Filters that you create and maintain are probably subject to some of the dangers that Pariser fears, but they're also a powerful check against the alarming manipulation he's most anxious about. Pariser gives short shrift to this, dismissing the fact that the net makes it theoretically easier than ever to see what the unfiltered (or differently filtered) world looks like with hand-waving: the filters will make it so we don't even want to go outside of them.

I don't believe that anti-filters or personal filters will automatically act as a check against manipulative customization, but I believe that they have this potential. The Filter Bubble is mostly a story about potential -- the potential of filtering technology to grow unchecked. And against that, I think it's worth discussing (and caring about, and working for) the potential of a technological response to that chilling future.

The Filter Bubble: What the Internet Is Hiding from You


  1. “Putting more-clicked things higher up is an editorial decision: ‘I want to provide you with the sort of information whose utility is immediately obvious.’ And while this is, intuitively, a useful way to present stuff, there’s plenty of rewarding material whose utility can’t be immediately divined or described.”

    So you search through a bazillion links that everybody clicks a lot. The alternative is to search through a bazillion links that nobody ever clicks. Either way it might take years to find something you really want. But it’s still better than card catalogs ever were.

    1. “So you search through a bazillion links that everybody clicks a lot. The alternative is to search through a bazillion links that nobody ever clicks.”

      I think part of the concern is that whether everyone or no one clicks on a link is both cause and effect of the search/sorting algorithm. It’s self-reinforcing. Frankly, that’s not so different from ages past: it’s hard for a book to get noticed unless prominent people, reviewers, and publishers find it and promote it. If anything, the internet has much *less* of this a priori selection bias. However, pre-internet, it was obvious that the selection effect existed and was a function of deliberate human choice to present certain works and not others. Online, the filtering is hidden, and not obvious to most people.

      1. “Online, the filtering is hidden, and not obvious to most people.”

        I think this is key. People *think* they’re getting unlimited information, but in fact they’re only seeing a small subset.
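The self-reinforcing loop described in this sub-thread is easy to demonstrate with a toy simulation (a hypothetical model, not any real search engine's ranker): links are ordered purely by past clicks, most users take the top result, and an early accident of attention hardens into permanent prominence.

```python
import random

def simulate(n_links=20, n_users=5000, seed=42):
    """Toy model: rank links by click count; each simulated user clicks
    the top-ranked link with high probability, a random one otherwise."""
    rng = random.Random(seed)
    clicks = [0] * n_links
    for _ in range(n_users):
        # re-rank by accumulated clicks before each "user" arrives
        ranking = sorted(range(n_links), key=lambda i: -clicks[i])
        if rng.random() < 0.8:       # most users take the top result...
            choice = ranking[0]
        else:                        # ...a few explore at random
            choice = rng.choice(ranking)
        clicks[choice] += 1
    return clicks

clicks = simulate()
print(f"top link's share of all clicks: {max(clicks) / sum(clicks):.0%}")
```

Whichever link happens to lead early ends up with the overwhelming majority of clicks, even though all twenty started out identical.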

  2. Maybe I’m just grouchy this morning, but I have grown sorely tired of these hyperbolic “predictive” rants about the Internet. Various technology pundits have been making similar predictions about the effects of the Internet on our intelligence, our communities, the way we think, and so on for years. Go back and read some of what Nicholas Negroponte wrote in old issues of Wired. Negroponte is a smart man, but he was also very wrong about a lot of the technology trends he pontificated on. The parts of the web that I end up on, as they come to me through RSS feeds and twitter and email, are not at all filtered. Maybe my searches are filtered, but if I’m investigating something factual, you can bet that I don’t stop my research after the first page of search results, or use only one search engine, or just Wikipedia. I think the plain and simple argument against Pariser here is that the Internet is what you make of it. Use it superficially and you will no doubt miss any truth contained within it.

  3. i only scanned your take on the book, but I think you are missing the bigger point: the Internet also filters “culture”. We’ll use this site as an example. It blew up during the early 2000s only to become one of the top blogs out there. Therefore, whatever you lot are into (your interest filter) becomes, in part, what others are given to read about. Two cases in point: your ukulele obsession a while back ultimately led to an internet meme and eventually to women in Cialis ads playing ukuleles.
    the other example, “steampunk” is now a mainstream word of sorts, due to this site.
    I also remember some posts you made about a great graffiti artist named Blu – well, his shit is ripped off now beyond belief. I saw a Future Shop ad that was a direct ripoff of his work, but done on a computer and pasted over the buildings in a pseudo-graffiti way that was embarrassing and lame (to anyone who knew the difference), but also ‘trend-setting’ to those who don’t… he never got paid, and his idea was bastardized and sold. Ad creatives can scroll your pages and find ideas to rip off and turn into mainstream ‘culture’.
    Extrapolate this out further and it shows that sites like this become like early TV’s big three – if that’s what we see out here in audience land, then that’s all we know.

  4. While i have taken to blocking suspected shills and astroturfers on various forums i frequent, i worry that doing so makes their message go uncontested in the eyes of third-party readers. As such, their claims may appear to have more validity than they do.

  5. This attitude is one of the best ideas we’ve inherited from hacker culture: “we’re technologists, we should fix it,” as Eben Moglen said about Facebook in his Freedom in the Cloud talk last year.

    I’d like to suggest, as a corollary, that many of the powerful technological tools that have been developed over the past couple of decades, like wikis, version control, blogs, mailing lists, standards compliant html, and so on, are also social hacks. They work because people have figured out a way to work together for something they want, as much as because of clever engineering solutions.

    So the technological response to an internet with blinders will go hand in hand with a social response.

  6. That’s exactly the reason I don’t use accounts with news aggregators and such; the Web already has enough of a tendency to degrade into an echo chamber. The last thing I need is my own personal echo chamber.

    1. “That’s exactly the reason I don’t use accounts with news aggregators and such”

      You say, using your BoingBoing account …

  7. The focus is upon the internet; however, many of the same issues can arise with equal or greater significance in the wider realm of “offline” identity management. To put it another way, biometrics and other identity management techniques can put us “online” most of the time. What will be the social, economic, psychological and political ramifications of relatively continuous identity management and related vulnerability manipulation, as well as the intimidation that flows from such surveillance? We exit our home and we soon pass the first of many video security cameras. As we travel in our car, the toll pass on the windshield is a constant identity disclosure. Electronic charge cards and commercial “loyalty” cards build a profile of where we shop and what we purchase. Biometric identification technology will be able to more fully shape our shopping experience. And then, of course, there is the “everyone is a suspect” rationale that compels personal disclosure, to an overtly disrespectful extent, at airports, public buildings and some commercial environments.

    By and large, we have become complicit in all of the foregoing and in the intentional and unwitting disclosures we make to social networks.

    There are two broad ways to control or limit such disclosure: a) opt out of situations that require disclosure and aggregate personal information; b) government regulation. It is actually very difficult for individuals to opt out, short of going into the wilderness and living as a hermit. As far as regulation is concerned, the government has done little to regulate the capture and use of personal information. In part, that is because the government is decades behind in the formulation of important public policy. By the time it gets around to it, all the personal cats are out of the bag. In addition, particularly with the so-called war on terror, the government is one of the prime offenders – all in the name of securing liberty! Of course, in part, government action has also been prevented by governmental and corporate lobbyists with a different agenda.

    One of the civil liberties organizations ought to come up with one or more form notices, of general application, that would be posted on the internet. People would then be able to publicly subscribe to one or more of such notices, putting the world on notice that the reservations contained in the form are a part of any of that person’s formal or informal contractual relationships or disclosures. The purpose would be to push the small-print burden back where it belongs, onto the commercial and governmental information gatherers.

    Let’s use the internet’s power to throw a roadblock in the path of the exploitation described by the book.

  8. (I thought of Jonah Lehrer’s How We Decide, which describes an experiment in which subjects who were asked to explain why they liked certain pictures made better choices than ones who weren’t asked to explain their preferences).

    You remember it backwards. The people who were asked to explain their choices made WORSE choices than the ones who went on instinct, because that which makes art good cannot be easily described.

    1. Actually, you are both conflating two different studies with two different results. The poster study had to do with the ability to revise decisions (Gilbert 2002). People who were allowed to change their mind about a choice they had made ended up less satisfied with their final choice. Posters were used in that study, but it didn’t have to do with introspection. On the other hand, the fact that introspection tends to lead one to dislike a particular choice was explored in Wilson 1991. There were no posters, but people were asked to choose between some products (like jams), and those that were asked to explain their choices agreed less with experts’ choices. However, and importantly, in neither of those papers is a causal explanation given. These are correlational studies, and nowhere does anyone justify the idea that the inability to describe art means that you will like it more.

    2. You’re right! That’s what I meant to type, but my fingers weren’t cooperating. Thanks!

  9. M.T. Anderson predicted this process in The Feed.

    Shortcomings of the search/personalization process were a significant contributing factor to the death of the deuteragonist.

  10. funny, I just had a similar epiphany myself after seeing autocomplete return ‘facebook’ from ‘F’ for the millionth time. It does seem a little patronising now I think about it, like a librarian who tells you what books will suit you.

    1. No. It’s equivalent to walking into a library, saying “I’m looking for a book whose title starts with ‘F’…” and the librarian interrupting you and responding, “Wait, let me guess: you’re the gazillionth person to walk in here today looking for that Facebook book, right?” They may be wrong, but they took a reasonably educated guess; they’re not telling you what to read, nor do they have a vested interest in what you choose to read.

  11. jason scott of textfiles fame gave a talk about video editing at Notacon a few years back. this was the first time i considered the idea of all edits being editorial in nature.

    this is why source materials in journalism are such a big deal, and why outfits like wikileaks are important. when the source material is there, along with the editorial, we are all able to draw our own conclusions.

    i watched the collateral murder video and drew the conclusion that it was a tragic mistake. if there was more of this sort of journalism, maybe bumper sticker politics wouldn’t be the disease that it is.

  12. I think that we now have more opportunities to discover new things and opinions than ever before – if we want to. Content filters have been in place since the beginning of time, enforced by lack of access to information and edited through media, politicians, interest groups, family, …

    But I think that people are more aware that there is something else beyond their own bubble, their own music collection and book shelf, their own friends on facebook and the netflix queue.

    This very book is an example of how we are more in the know, that filtering happens – or is this book only for the ones in the “internet culture” bubble?

    What we really need is more access to the filtering mechanism on the internet, to tweak the algorithm. Sometimes I want more filtering, sometimes less, and sometimes I want to start from scratch.

  13. The Internet was customized long before we ever put software in place to do it; the difference was, we did it to ourselves.

    Think of how many websites you regularly visit every day. It’s really not all that many. You’re limiting your experience of the internet just by virtue of not having unlimited time to devote to it. I’m willing to bet that the vast majority of people visit fewer than twenty websites on a regular basis for personal recreational browsing.

    You don’t need a customized Internet to give you cognitive blindspots with respect to political or social ideas; all you need to do is spend more time on one forum or newsgroup or blog than you do on others.

  14. This is why I typically read mostly left-wing nutter and right-wing nutter sites at the same time.

    They’re more likely to use primary sources and do original research as they’re very anxious to prove the other side wrong – One side’s orthodoxy is something the other side wants to disprove with mountains of evidence and research.

    Of course, both sides’ sources are biased but this way you at least get a full perspective on things. The mainstream media is just the carefully edited corpo-government perspective.

    1. Score another predictive hit for David Brin, from his 1989 novel Earth

      I particularly liked Brin’s solution of building a randomness option into the filters.
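Brin's randomness option is straightforward to sketch. Assuming a feed builder (the function and parameter names here are illustrative, not from any real system), a user-set knob reserves some fraction of the result slots for items drawn at random from outside the personalized picks:

```python
import random

def filtered_feed(ranked_items, all_items, randomness=0.2, k=10, seed=None):
    """Return k items: mostly the top of the personalized ranking, with a
    user-tunable share of slots given to random items from the wider pool."""
    rng = random.Random(seed)
    n_random = round(k * randomness)
    picks = ranked_items[:k - n_random]           # the personalized slice
    outside = [x for x in all_items if x not in picks]
    picks += rng.sample(outside, min(n_random, len(outside)))
    rng.shuffle(picks)                            # don't ghetto-ize the randoms
    return picks

# e.g. 2 of every 10 stories come from outside the bubble:
feed = filtered_feed(list(range(100)), list(range(1000)), randomness=0.2)
print(feed)
```

Turning the `randomness` knob to 0 gives a pure bubble; turning it up trades relevance for exposure to the unfiltered pool.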

  15. I hear what you are saying and I agree… however, in the days before Google, when you got a randomized list of four or five thousand search results, it was pretty hard to get anywhere. It would be nice if we could choose at any one time, whether we wanted filtering or not.

  16. An answer to this: the selection engine that controls what you see not only gives you what you want, but also calculates what you need to see.

    Becoming a right wing GW denier? Here’s a neutral, science-based article on climate change that will gently push you to the middle, but not dismiss it automatically as liberal rhetoric.

    Only watch bro comedies? Here’s a suggestion for “What Women Want” to ease you into romantic comedies without making you roll your eyes.

    In the end, that seems a bit too Orwellian. But what’s the alternative in this situation? Continued division?

  17. God i hate myself (and i’m not even having religious sex!) for being the kind of pedant who would point this out, but: Robert Koch’s assistant (Julius Petri) gave his name to the Petri dish – but it was Alexander Fleming (and his assistants) who left Petri dishes out by a window (in London – never a smart move) and then *discovered* penicillin. /end of pedantry.

  18. Pournelle and Niven wrote about this in the ’80s. Old story.

    In OATH OF FEALTY, the exec running Todos Santos told his computer how much random news (which would otherwise be filtered out) he wanted to see.

    1. “In OATH OF FEALTY, the exec running Todos Santos told his computer how much random news (which would otherwise be filtered out) he wanted to see.”

      I’d totally forgotten about that! You have a helluva memory.

    1. Saw Eli Pariser (the author) give a very interesting TED talk a little while ago; here’s the video:

      I like that he recommended increasing user awareness of the rules by which the filter operates. I think that’s much better than creating politically self-censoring algorithms. The latter, as Keneke noted, would be somewhat Orwellian, but Pariser’s suggestion would empower the users. In the end, of course, there’s no way to make individuals step outside their comfort zone and pay attention to alternative information sources, but at least they can be given the ability to do so if they have the will.

      Thanks for the great TED link.

      1. “In the end, of course, there’s no way to make individuals step outside their comfort zone and pay attention to alternative information sources, but at least they can be given the ability to do so if they have the will.”

        Down the road (or maybe now?) hacktivists will root people’s computers but do no harm, except covertly steering the unwitting “victim” to alternative sources of information: altering search results from their preferred search engine with various DNS tricks, and so on.

        Greyhat botnets trying to save the world one PC user at a time with info dissemination.

        Shit… I’d like to get more of the population to see this, that’s for sure. Enough people might actually get together and stop these bastards, who knows?

  19. Does this bring us back to the word of mouth paradigm..? Sure, filters can exclude and include data for me but, I do not live in a vacuum. I have friends and family and their filters are not going to be the same as mine. I never found bOING bOING from a search engine. A buddy told me that I should waste some time here, from time to time.

  20. I feel obligated to point out that there is no transparency with Google’s or anyone else’s Editorial Algorithm. I know to some this might be a non-issue or a minor one; however, it is important to note that since there is no outside “regulation” in Cyberspace (as in, there is no independent verification of the fundamental soundness of the information being provided), we effectively crutch ourselves to the results of our search. After all, we “don’t know what we don’t know.” The only other method by which to learn about something is to do so outside the influence of the internet (e.g. TV/marketing, or research).

    So in the end, a user’s personal curiosity isn’t the issue. We will never be able to make the internet function like a street. By design, we will never be able to get more than a straw into the massive information slushy that is the internet. All search algorithms are simply a way to get you closer to the flavor you want at the time. And it is in this that we start to see a paradox take shape.

    With no independent or reliable checks on veracity, and with money-influenced algorithms, the internet finds itself in a precarious place. If anything, such a “free” design hobbles itself in the long run. Because of such doubts, we are now more reliant than ever on offline verifications of veracity. Things really haven’t changed much since the telegraph. Only now everybody’s got one (albeit a much more powerful one) and everybody’s got some degree of the truth. Well, who’s right? We all are, on some level, right?

    So in the end, the truth isn’t a point but rather a plane; we have become engulfed by it and have simultaneously lost sight of it. Maybe what you want is one specific part or fact of that truth, yet you can’t find it, since the search algorithms only take you to the plane and not the grid square you happen to seek.

    Search algorithms are nothing short of innovative; the amount of money invested in Google should tell you that. Yet I still think there is some system that could improve upon them, and I am not sure that personalization is the way to go. Somehow I think transparency is a good place to start; this isn’t meatspace.

  21. I can’t handle the volume of information coming at me since the internet has been in my realm, and my info intake has always been skewed because of who I associate with and what I choose to read. So filters and editors are expected and mostly not unwelcome… A lot of times now when I hear a reference to something I don’t recognize, I don’t look it up because I can’t spare any more brain space.

  22. Cory, thanks for bringing this to our attention and for your thoughtful parsing. Seems to me that unfiltered internet access is similar to going to the library stacks instead of using an electronic catalog. I first learned the word “serendipity” when I asked my mom what it meant to find things you weren’t looking for when browsing the stacks.

  23. This is a very valid fear. Sure, there have always been filters and gatekeepers, but the constraints on our capacity to aggregate and analyse information, which we had in the past, made those filters and gatekeepers not very good. Despite their best efforts, unexpected, out-of-left-field, seemingly irrelevant information always managed to slip through. Today’s filters, on the other hand, have simultaneously become less visible and more effective, and this will increasingly lead us to (unintentionally) negative consequences. I do not wish to say that we do not need structure and organisation in the present state of information overload, but we need to be regularly exposed not only to the relevant, but also to the challenging, as well as the downright random.

  24. After about 15 years spent surfing the web obsessed with anonymity and granting myself a pure, unbiased experience online, I recently realized that if most people are biased by search-engine personalization and similar things, I am the one with a limited view and perception of the cloud. Those who give strong, useful information to search engines and who use a lot of personalization are also the ones who are training the software-database-machine-meta-something part of the internet. So, as Bruce Sterling once reminded us about the possibility of a world where you have tags on every thing: “you can’t have the good without the bad”

    1. […]
      So, as once Bruce Sterling reminded about the possibility of a world where you have tags on evry thing:
      “you can’t have the good without the bad”

  25. i found it impossible to read the first paragraph and not think about FOX news and how the right has so radically changed over the last few years.

  26. I totally have this problem with my RSS feed aggregator. I used to read everything that came through. Now I have too many sites listed and read whatever’s on top, in my spare time. Google Reader thinks that those are what I want to read, and so puts them at the top… then I read those more often and silly ole Google becomes even more convinced they are my favorites…

  27. “I thought of Jonah Lehrer’s How We Decide, which describes an experiment in which subjects who were asked to explain why they liked certain pictures made worse choices than ones who weren’t asked to explain their preferences”

    Isn’t that highly subjective, though? And I don’t just mean in that case. One person’s bias is another’s beliefs.

    I agree with yours and Pariser’s other main point, though: that self-selection can become reinforcing to the point that it blinds you to salient information. And definitely, indexical tools like search engines aid that process, and can’t really avoid doing so by their very nature. Their very function is to sort a mountain of data no user could make headway on manually. They have to do that somehow, and any way they do it will cause some results to rise to the top.

    It seems to me there is really only one solution and that is to not become complacent, to go outside your comfort zone. This problem isn’t really new. It’s just that, like so many trends, it’s been accelerated by the internet.

  28. I heard Pariser discussing this on NPR, particularly how things like the “like” button prevent people from circulating unpleasant or challenging news stories. (No one wants to “like” an article on genocide, but they might “recommend it” if the button were thus labeled.) Of course, the terminology is driven by advertisers, not by any larger social good.

    Also, I wonder if Pariser applies any of this critique to his own organization. MoveOn has definitely become a kind of one-stop site for online liberal activism. The organization continues to drive liberal readers towards the center and thus to the issues and talking points of the Democratic Party. In doing so, MoveOn helps to narrow liberal/left discourse towards an invisible consensus, a process that sounds very similar to the corrosive effects of “personalization.”

  29. I disagree with one particular point made here: that the ‘bubble’ is merely a potential problem.

    I have been using google from the very first year it was made available, which is well before they moved to

    In the time since, I have been noticing a gradual trend to move away from true relevance and towards relevant advertising.

    There is a certain quality of information that I am used to being able to find, which is lately becoming more difficult to locate. Instead, I find my searches pointing to shopping sites or inanely-written articles surrounded by advertising.

    I have a hobby of asking questions that have no particular answer, so that I might discover answers that would never have occurred to me or to anyone in my local economy. I depend on accessing the net with as few customizations as possible.

    As far as I am concerned, each layer of filtering only reduces the overall quality of the depth theoretically available.

  30. Eli, and most everyone else, is missing the point.

    Personalization isn’t about creating filter bubbles. It’s about discovery, which has three main components: relevance, diversity and serendipity. Discovery allows you to find things you didn’t even know existed. The problem is, most personalization is really customization or it focuses solely on relevance. And if personalization is simply using what you recently viewed or purchased, you can see a very narrow slice of the world.

    Very, very few people get this and even fewer have the skillset to create recommender systems, but hopefully that will change soon.
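The three components this commenter names can be combined in a simple greedy re-ranker, loosely in the spirit of maximal-marginal-relevance reranking; the weights, the toy similarity function, and the item names below are all placeholders, not anyone's production recommender:

```python
def rerank(candidates, relevance, similarity, history, k=5,
           w_rel=1.0, w_div=0.5, w_ser=0.4):
    """Greedy re-ranking: each pick trades relevance off against
    similarity to items already picked (for diversity) and against
    similarity to the user's history (for serendipity)."""
    picked, pool = [], list(candidates)
    while pool and len(picked) < k:
        def score(item):
            rel = relevance[item]
            div = max((similarity(item, p) for p in picked), default=0.0)
            ser = max((similarity(item, h) for h in history), default=0.0)
            return w_rel * rel - w_div * div - w_ser * ser
        best = max(pool, key=score)
        picked.append(best)
        pool.remove(best)
    return picked

# Toy example: similarity is 1.0 when two titles share a first word.
relevance = {"bro comedy 1": 0.9, "bro comedy 2": 0.85,
             "romcom": 0.6, "documentary": 0.55}
sim = lambda a, b: 1.0 if a.split()[0] == b.split()[0] else 0.0
print(rerank(list(relevance), relevance, sim,
             history=["bro comedy 0"], k=3))
```

Even though the bro comedies have the highest raw relevance, the serendipity penalty (they look like the viewer's history) and the diversity penalty (they look like each other) push the romcom and the documentary up the list, which is exactly the "discovery, not just relevance" behavior the comment describes.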
