Wednesday, May 18, 2011

Eli Pariser is wrong

In recent interviews and in his new book, "The Filter Bubble", Eli Pariser claims that personalization limits serendipity and discovery.

For example, in one interview, Eli says, "Basically, instead of doing what great media does, which is push us out of our comfort zone at times and show us things that we wouldn't expect to like, wouldn't expect to want to see, [personalization is] showing us sort of this very narrowly constructed zone of what is most relevant to you." In another, he claims, personalization creates a "distorted view of the world. Hearing your own views and ideas reflected back is comfortable, but it can lead to really bad decisions--you need to see the whole picture to make good decisions."

Eli has a fundamental misunderstanding of what personalization is, leading him to the wrong conclusion. The goal of personalization and recommendations is discovery. Recommendations help people find things they would have difficulty finding on their own.

If you know about something already, you use search to find it. If you don't know something exists, you can't search for it. And that is where recommendations and personalization come in. Recommendations and personalization enhance serendipity by surfacing useful things you might not know about.

That is the goal of Amazon's product recommendations, to help you discover things you did not know about in Amazon's store. It is like a knowledgeable clerk who walks you through the store, highlighting things you didn't know about, helping you find new things you might enjoy. Recommendations enhance discovery and provide serendipity.

It was also the goal of Findory's news recommendations. Findory explicitly sought out news you would not know about, news from a variety of viewpoints. In fact, one of the most common customer service complaints at Findory was that there was too much diversity of views, that people wanted to eliminate viewpoints that they disagreed with, viewpoints that pushed them out of their comfort zone.

Eli's confusion about personalization comes from a misunderstanding of its purpose. He talks about personalization as narrowing and filtering. But that is not what personalization does. Personalization seeks to enhance discovery, to help you find novel and interesting things. It does not seek to just show you the same things you could have found on your own.

Eli's proposed solution is more control. But, as Eli himself says, control is part of the problem: "People have always sought [out] news that fits their own views." Personalization and recommendations work to expand the bubble that people try to put themselves in, to help them see news they would not look at on their own.

Recommendations and personalization exist to enhance discovery. They improve serendipity. If you just want people to find things they already know about, use search or let them filter things themselves. If you want people to discover new things, use recommendations and personalization.

Update: Eli Pariser says he will respond to my critique. I will link to it when he does.

25 comments:

Daniel Lemire said...

I have great respect for Amazon's recommender system (obviously) but I do think that we are still falling short of coping with the long tail.

For example, I am a passionate scifi reader. Go on Amazon and look up a few books from the major authors. Amazon will then point you to the other major authors. There are maybe 20 such authors. Once you have discovered them all, you get into some kind of "recommender" bubble or clique. There are great (and even somewhat popular) authors outside this clique, but they are hard to find through the Amazon web site, probably because they fall below some threshold.

When Google Reader started recommending blogs, I used it quite a bit. Then I noticed that as I added theoretical computer scientists (TCS) to my list, I got into some kind of clique where Google Reader recommended more and more TCS bloggers. At some point, I had to do some major cleaning to escape this clique and get to read other stuff.
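To make the clique effect concrete, here is a toy sketch in Python (purely hypothetical; it is not Amazon's or Google Reader's actual algorithm). With a simple minimum co-occurrence cutoff on item-to-item similarity, a long-tail author never clears the bar and so never gets recommended:

```python
# Toy sketch (not a real production algorithm): a minimum co-occurrence
# threshold in item-to-item similarity keeps recommendations inside a
# popular "clique" and hides the long tail.
from collections import defaultdict
from itertools import combinations

# Hypothetical purchase histories: most readers buy the big-name authors,
# only one also buys the long-tail "niche_author".
baskets = [
    ["asimov", "clarke", "heinlein"],
    ["asimov", "clarke"],
    ["clarke", "heinlein"],
    ["asimov", "heinlein", "niche_author"],
]

MIN_COOCCURRENCE = 2  # similarity pairs below this are dropped

cooccur = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(set(basket)), 2):
        cooccur[(a, b)] += 1

def recommend(item):
    """Items co-purchased with `item` at least MIN_COOCCURRENCE times."""
    recs = []
    for (a, b), count in cooccur.items():
        if count >= MIN_COOCCURRENCE and item in (a, b):
            recs.append(b if a == item else a)
    return recs

print(recommend("asimov"))  # ['clarke', 'heinlein'] -- the clique
# "niche_author" never clears the threshold, so it is never recommended.
```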

jeremy said...

Would it be fair to say that personalization on its own neither expands nor filters, but that it is the particular personalization technique, the particular implementation, which may serve to expand or contract one's world view?

I get what you're saying about personalization being about discovery, about finding things of which you previously were not aware. But you can examine that newness through two lenses: newness of an item vs. newness of a concept.

Suppose for example that I am a moon landing denier. Personalization could do one of two things for me: (1) Bring to my awareness a blog post about why this particular photograph could have been faked, or why these rocks are obviously earth rocks. Or (2) Bring to my awareness a blog post about why the moon landing really did happen.

In both cases, discovery has occurred. Some new piece of information of which I previously was unaware has now entered my consciousness. But one has challenged my worldview; one has not.

I'm not saying Findory did or didn't do one of these; I'm simply asking about your abstract, top-level understanding of personalization: Could it be said that personalization really is about either of these approaches? And that it is incumbent upon the system designers to bias the system toward one of the two information goals, whether new item from the same worldview, or new item from a different worldview?

Anonymous said...

What if they are both right? Filtering done superficially based on limited observations of activity might do things badly. However, a deep neurological (and physical) sensing of the subject might enable such precise modeling that all desirable discovery would be anticipated. [I'm not talking about current state of the art, here.]

Anonymous said...

As other commenters have implied: the subject criticism seems oriented toward very naive recommender systems, and certainly fails against more sophisticated personalization.

For example, as I discuss in my book on enterprise recommender and auto-learning systems, "The Learning Layer," it is valuable to provide users with the means to explicitly tune the "narrowness" of their recommendations. But even within this guidance from the user, "the wise system will also sometimes take the user a bit off of her well-worn paths. Think of it as the system running little experiments. Only by 'taking a jump' with some of these types of experimental recommendations every now and then can the system fine-tune its understanding of the user and get a feel for potential changes in tastes and preferences."

In other words, by taking the user outside his comfort zone, not only is "the long tail" beneficially exposed, but the system gains valuable information not otherwise available that enables more nuanced inferences in the future.

And beyond these capabilities, an explanation should be provided to the user as to why a recommendation was made, and even the degree of confidence the system has in making the recommendation--thereby making it transparent to the user that the recommendation is a bit off the beaten path.
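As a minimal sketch of the "little experiments" idea, assuming a simple epsilon-greedy rule (an illustration only, not the specific approach described in "The Learning Layer"):

```python
import random

def pick_recommendation(scored_items, epsilon=0.1):
    """Mostly recommend the best-scoring item, but occasionally 'take a jump'
    to a lower-scoring item to probe for changing tastes.

    scored_items: list of (item, predicted_relevance) pairs.
    epsilon: fraction of the time to run an exploratory experiment.
    """
    ranked = sorted(scored_items, key=lambda pair: pair[1], reverse=True)
    if random.random() < epsilon and len(ranked) > 1:
        # Exploratory recommendation: off the user's well-worn path.
        return random.choice(ranked[1:])[0]
    # Exploitative recommendation: the system's best current guess.
    return ranked[0][0]

# Example: epsilon could itself be the user-tunable "narrowness" control.
print(pick_recommendation([("familiar_item", 0.9), ("off_path_item", 0.4)], epsilon=0.2))
```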

Todd said...

Thanks for a well-written article.

In 2004 I wrote software to scan news feeds so I could discover interesting news and articles I previously would have missed. It led me to finding this blog and many other sources I now follow.

Since I implemented my software as a feed reader (feedsanywhere.com), I could control the sources and choose ones that match my viewpoint, but since I can sift through more news than I previously read, it gives me time to search out new sources, thus expanding my exposure to various viewpoints.

Also, there's nothing preventing someone from using multiple services to enhance serendipity further. I love consuming RSS feeds, but I still get news from Yahoo and other places.

Greg, your mention of Amazon product recommendations reminds me that I think they've slightly missed the mark with the new Kindle with special offers. I don't care about a book sale, but I would be interested in seeing a new book Amazon thinks I might be interested in. I wonder why they don't use recommendations instead of special offers. Discovery is one thing that keeps me coming back to Amazon, so why not extend it to the Kindle?

Jurgen Van Gael said...

Hi Greg,

I think you're absolutely right. Take the news domain: personalization gives you the opportunity to discover new things you didn't know existed. I would argue that the old-style "plain paper" newspaper model causes way more single-mindedness by definition: everyone gets the same newspaper!

spider said...

The ubiquitous "personalization" train is just leaving the station, so it is revealing that the president of the MoveOn.org organization has written a book on the subject now. I haven't read the book yet to understand the societal, commercial, and political issues being raised at this early stage of the technology's introduction.

@jeremy
Agree that it is about the methods/techniques used.

Anonymous said...

On the individual level, I agree with you, Greg, about the role of personalization for discovery. At the group level, however, Balkanization is potentially a real issue.

I highly recommend you look over Ethan Zuckerman's post around these issues.

Dinesh Vadhia said...

@greg
Weird. My first comment turned up under the name "spider" which I don't own!

Jose San Pedro said...

Hi,

Good post and better discussion. My opinion on this matter is that, regardless of what the ultimate objective of personalization is, in some contexts it is leading to what Daniel Lemire has very graphically described as "cliques". Eli himself uses a very interesting example of promoted Facebook friends (see his TED talk about it).

In search contexts, personalization seems to be clearly limiting serendipity. For instance, Google makes use of features such as user location to personalize results, thereby biasing what the populations of entire regions are able to find. As a consequence, users learning about certain topics might find their notion of information relevance affected just by where they were when the query was issued. Desirable in some cases, but not so in many others.

Summing up, I agree that personalization is not at all incompatible with serendipity (and indeed might be the best way to enhance it), but when applied incorrectly it may lead to the exact opposite result.

Todd said...

Jose mentioned Facebook, and one of the interesting things about its personalization of the news feed is that many Facebook users don't know they're not seeing every post. I prefer to see sites make it clear if the data stream is being filtered.

Daniel Tunkelang said...

Seems apropos to point readers to Vegard Sandvold's 2009 post "Does Everything Really Sound Like Coldplay?", which in turn references Oscar Celma's dissertation on "Music Recommendation and Discovery In The Long Tail". Granted, discovery doesn't have to all be in the tail -- I actually discovered Coldplay because Pandora recommended it to me.

Greg Linden said...

Looks like Eli Pariser wrote two new editorials this weekend pushing his view that personalization limits serendipity, one in The New York Times and one on CNN.

Meanwhile, Cory Doctorow criticizes the book as unbalanced.

Greg Linden said...

Several good comments here. Let me add a couple things to the discussion.

Clearly, as many point out, personalization is a tool, and, like any tool, it can be used badly. There is no question that personalization could be used to limit serendipity, intentionally or accidentally. My point is that well-implemented personalization will enhance serendipity. And, I would argue, personalization that limits serendipity is inherently unstable for a business, as customers can already use other tools to find what they already know they want (like search or browse), and, if the business or any competing business launches personalization that enhances serendipity, customers will prefer that. For example, at Amazon, A/B tests found that biasing the personalization toward very popular items reduced sales, so that change was not made. People use the recommendations to find things they couldn't find otherwise, and biasing the recommender toward items they already know about, like popular items, destroyed its value. The recommender was tuned to enhance serendipity not just because that was the right thing to do, but because that was what worked best for customers. Serendipity is not just a design goal for personalization. It is the purpose of personalization.
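To illustrate the popularity point, here is a toy sketch of one common technique (not Amazon's actual scoring): normalizing co-occurrence counts by item popularity keeps blockbuster items, which people already know about, from crowding out the long tail.

```python
# Toy sketch, not a production recommender: dividing co-occurrence by the
# candidate's overall popularity favors items that are disproportionately
# associated with the seed item, rather than items everyone already buys.
def score(cooccurrence_count, popularity_of_candidate):
    """Score a candidate item for recommendation alongside a seed item."""
    return cooccurrence_count / popularity_of_candidate

# A blockbuster co-occurs with the seed 100 times but is bought 10,000 times;
# a niche item co-occurs 20 times but is bought only 100 times.
print(score(100, 10_000))  # 0.01 -- blockbuster scores low
print(score(20, 100))      # 0.2  -- niche item surfaces instead
```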

A major issue is that Eli is not distinguishing between customization (aka explicit personalization) and implicit personalization. His argument moves back and forth between automatic filters and people implementing the filters themselves, between the problem being lack of control over automated personalization and people using their control to pick sources and feeds in ways that limit what they see. Clearly, it is the case that people can use customization to limit what they see. People already do this whether they use a customization tool like a feed reader or not, just by going to one website or another or picking up one newspaper or another. In this post, I limited my argument only to implicit personalization, like recommender systems, that learn from behavior, but I want to say that I think the lack of clarity in Eli's argument about what he means by personalization adds confusion to this debate.

jeremy said...

Clearly, it is the case that people can use customization to limit what they see. People already do this whether they use a customization tool like a feed reader or not, just by going to one website or another or picking up one newspaper or another.

I was listening to a follow-up interview of Pariser on KQED Forum (May 19th episode), and he actually made a really good point with respect to customization: he said people can't do it. More and more, it is becoming impossible to explicitly turn off a company's implicit personalization and get absolutely neutral results. Take search engines, for example. They will use your country domain, your browser type, etc. to give you different results than someone else. Even when you're not logged in to your account.

There is a complete loss of control on the user's part, a complete inability to explicitly turn off that implicit personalization.

The user needs (cue dead horse beating) tools, needs the control back.

It's fine that users are already personalizing. But they have the choice to pick up a different newspaper if they want to. With built-in personalization like the country-code issue, or the browser issue, the user has zero control, zero choice. Is that not a concern?

Greg Linden said...

Hi, Jeremy. I'd say loss of control is a tangential issue, like privacy or data security, if our debate is supposed to be about serendipity.

The argument that additional control would enhance serendipity seems weak. You have to argue that people would use controls not to add additional filters, but to remove them, but evidence seems to indicate that, if anything, the opposite is true (such as with Findory's example where people wanted to limit sources and viewpoints they saw).

Moreover, it isn't clear that loss of control is anything new or that personalization changes it. People have no influence over the ranking algorithm used by Google, personalized or unpersonalized, and their only control is to use another search engine. Likewise, they have no control over what the editorial board of a newspaper chooses as their top stories, and no choice other than to read another newspaper if they want something different.

So, while loss of control, privacy, and data security all concern me, both in personalized systems and non-personalized systems, I don't think they have much bearing on this debate about whether personalization enhances serendipity and discovery.

jeremy said...

Well, I don't see loss of control as tangential. No control means no awareness, and no awareness means no recognition of being in a filter bubble, and no recognition of being in a filter bubble means no ability to escape that bubble.

The situation becomes particularly egregious given the fact that pretty much all the search engines do it: pretty much everyone implicitly personalizes by country code, if not by other signals (browser, etc.) as well.

As a teenager, if I wanted different music recommendations and/or browsing options, I knew that I could get different results by going to the chain record store at the mall over in Fremont, or that I could take BART up to Berkeley and go to an independent (yet still very large) record store like Rasputin's. Both options had scale, but were very different from each other. I had a choice.

Today, I feel like I don't have this choice with web page search/recommendations. Pretty much all the majors are doing the same thing, putting me in this control-less bubble. And there are no real independent, web-scale engines that give me anything different. Where is the Rasputin's of web search?

Dinesh Vadhia said...

Well, I must have been living under a rock, because I didn't know that Google personalizes search results. Just did a test from different devices (belonging to different people) using the same query keywords, and there were differences in the results. Uhmm, not sure what to make of this late-in-the-day discovery ... But I can see why Eli Pariser is making a big deal out of this, because Google is the go-to search site for the majority of the population in most countries.

Sebastian Wain said...

As usual, this is not a black & white discussion: I always admired Amazon but nothing compares with a very close friend who can recommend stuff from a different "galaxy" that no recommendation engine can infer [yet].

Believe me, we, humans, have some uniqueness that can't be inferred from statistics.

And there is another point about the goal of recommendation engines: if Netflix wants to increase its profits, it will not risk recommending stuff it is not confident you will like.

Yossi said...

Personalization means satisfying a person's needs, pleasures, and desires in a particular circumstance.

Anonymous said...

I think a good example is showing a golf fan from the US a story about how a golf fan or player did something unique in Fiji. But I think showing him a news story about the "new trend in women's shoes" might not benefit or interest him. In the older, editorially controlled world, the total number of stories that could be run was limited, but in the new world it can be expanded, adding more interest for the user. What personalization misses out on, and what needs to be incorporated, is this: don't show a "democrat" only his stories; identify that the polarity of concepts will be important to him, and show him the "republican" viewpoint as well. I guess it is a question of identifying the interest at a high enough level, so the interest is "politics" and not "republican" or "democrat", and then making sure that the content shown has enough diversity from across that spectrum.
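A minimal sketch of that idea, assuming hypothetical topic and viewpoint labels on each story: match the user's interest at the topic level ("politics"), then interleave viewpoints within it so no single viewpoint dominates.

```python
from collections import defaultdict
from itertools import zip_longest

def diversified_stories(stories, user_topics):
    """Select stories matching the user's high-level topics, then interleave
    across viewpoints so no single viewpoint dominates the personalized feed.

    stories: list of dicts with hypothetical "topic" and "viewpoint" keys.
    """
    by_viewpoint = defaultdict(list)
    for story in stories:
        if story["topic"] in user_topics:
            by_viewpoint[story["viewpoint"]].append(story)
    # Round-robin across viewpoints within the matched topics.
    interleaved = []
    for group in zip_longest(*by_viewpoint.values()):
        interleaved.extend(s for s in group if s is not None)
    return interleaved

feed = diversified_stories(
    [{"topic": "politics", "viewpoint": "democrat", "title": "A"},
     {"topic": "politics", "viewpoint": "republican", "title": "B"},
     {"topic": "fashion", "viewpoint": "neutral", "title": "C"}],
    user_topics={"politics"},
)
print([s["title"] for s in feed])  # ['A', 'B'] -- both viewpoints, no fashion
```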

phraktle said...

Posted my opinion on Eli's views: http://phraktle.com/en/personalization-expands-filter-bubbles

Brief summary: he does make some good points, but those are not about personalization. Personalization actually expands the horizon of discovery.

Jaime said...

I'll admit it: I tried Eli's test. Can you guess the outcome? Of course you can. I have to question whether or not he actually did the test or just made it up.

I've never had this "filter bubble" issue and I don't expect I ever will. I do my searches and I start out vague, then I narrow the results by keyword and search strategy. I see all sorts of ads that may or may not be relevant to me, but my searches always produce exactly what I need (and sometimes send me way out into left field with new information based on my keywords, not my search habits).

I like personalization and I like what it offers. I like Google and I rarely complain about their services, because they deliver.

By the way, Eli must have forgotten to respond.

David Amerland said...

Greg, you make some very good points here, and I agree with most of what you say on principle. Eli also raises an issue which many do not get and a lot of people do not totally understand. Eli's argument is mostly focused on search and social networks, and it comes, understandably, from a layman's point of view; as such, he raises awareness of something many laymen may not even be aware of.

Your arguments about personalisation in product presentation (as in Amazon) and search are right but fail to fully take into account the inexperience of laymen with filters and their inability, at times, to fully comprehend the functionality of technologies they use on a daily basis.

Eli brings focus upon the fact that, in an environment where search engines try to create a totally personalized experience for a user and eCommerce verticals, such as Amazon, focus on creating profitable conversions, there is a potential danger of creating these 'comfort bubbles', as Eli calls them, where our very immediate view of the world is narrowed down (in breadth if not depth) to the point that we truly fail to see the 'Big Picture'.

The glib answer lies, of course, in the education of the 'masses' to use the web and its technologies better but that is neither easy nor reliable. The debate Eli has started is part of an important process for finding an answer.

chas said...

If you keep finding only more and more of what you "want" to find, of what is "useful" to you, eventually it's like the rat that keeps pushing the lever that delivers "pleasure" impulses to the brain and starves to death as a result.

If you are a frog in a pot eating flies fed to you by your chef, what you Need is a feeling of Discomfort. What you want (more and bigger flies) is only going to make for a better feast at your expense. Wake up. The house is on fire.