Thursday, November 09, 2006

Marissa Mayer at Web 2.0

Google VP Marissa Mayer just spoke at the Web 2.0 Conference and offered tidbits on what Google has learned about speed, the user experience, and user satisfaction.

Marissa started with a story about a user test they did. They asked a group of Google searchers how many search results they wanted to see. Users asked for more, more than the ten results Google normally shows. More is more, they said.

So, Marissa ran an experiment where Google increased the number of search results to thirty. Traffic and revenue from Google searchers in the experimental group dropped by 20%.

Ouch. Why? Why, when users had asked for this, did they seem to hate it?

After a bit of looking, Marissa explained that they found an uncontrolled variable. The page with 10 results took .4 seconds to generate. The page with 30 results took .9 seconds.

Half a second delay caused a 20% drop in traffic. Half a second delay killed user satisfaction.

This conclusion may be surprising -- people notice a half second delay? -- but we had a similar experience at Amazon.com. In A/B tests, we tried delaying the page in increments of 100 milliseconds and found that even very small delays would result in substantial and costly drops in revenue.
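To make the mechanics of such a test concrete, here is a minimal sketch of how an artificial-delay experiment could be wired up. This is not Amazon's actual code; the delay buckets and function names are purely illustrative assumptions.

import hashlib
import time

DELAYS_MS = [0, 100, 200, 300, 400, 500]  # hypothetical treatment groups

def delay_for_user(user_id):
    # Deterministically hash each user into one delay bucket.
    bucket = int(hashlib.sha1(user_id.encode()).hexdigest(), 16) % len(DELAYS_MS)
    return DELAYS_MS[bucket]

def serve_page(user_id, render_page):
    # Render the page as usual, then hold the response back by the bucket's delay.
    html = render_page()
    time.sleep(delay_for_user(user_id) / 1000.0)
    return html

print(delay_for_user("user-123"))  # the same user always lands in the same bucket

Revenue and clickthroughs per bucket are then compared to estimate what each extra 100 milliseconds costs.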

Being fast really matters. As Marissa said in her talk, "Users really respond to speed."

Marissa went on to describe how they rolled out a new version of Google Maps that was lighter (in page size) and rendered much faster. Google Maps immediately saw substantial boosts in traffic and usage.

The lesson, Marissa said, is that speed matters. People do not like to wait. Do not make them.

37 comments:

Anonymous said...

Of course, speed matters. People don't want to be wasting too much time when surfing around at work, you see. :)

Anonymous said...

There are some additional factors beyond customer satisfaction.

In the search example, users received 3 times the information per request. This takes more time for them to read and increases the odds of them resolving their search.

In the maps example people tend to click as fast as the map loads. Faster load = more clicks.

Anonymous said...

Traffic and revenue from Google searchers in the experimental group dropped by 20%. Why, when users had asked for this, did they seem to hate it?

Since when does lower traffic and revenue FOR Google equal less satisfied customers? Did she mention anything about how satisfied the users actually were with the results?

If Google were omniscient and somehow returned exactly what I wanted *every* time I search, my traffic on Google would drop because I would be doing less searches. It seems like Marissa would interpret that to mean I'm not satisfied with Google's results, when in actuality, the results were ideal.

I guess this is the strange game you play when your company makes more money when people can't find what they are looking for as easily (and then click on ads).

Anonymous said...

> If Google were omniscient and somehow returned exactly what I wanted *every* time I search, my traffic on Google would drop because I would be doing less searches. It seems like Marissa would interpret that to mean I'm not satisfied with Google's results, when in actuality, the results were ideal.

I did a small test: I ran a Google search. 10 results, three ads. I then adjusted my preferences: 50 results, three ads. Speed difference? Imperceptible. Satisfaction? Much higher.

I rarely get what I want in 10 results, and having to wait for the next page is annoying. More results per page, with relatively fewer ads, is a win for me, but could reduce ad revenue.

Sites, such as Amazon, that seem to default to too few results per page make me think that the owner is trying to maximize ad exposure while not pissing me off so much that I'll leave.

Speed matters most when there are no other factors driving loyalty. But when a site offers quality, I'll be far more patient.

It may make sense for Google to default to 10 results, if the idea is to quickly impress new users enough to make them want to come back. The option to increase the number of results is good because, on average, it saves time.

Anonymous said...

I have to agree with the previous two comments. If traffic dropped in the experimental group, it would stand to reason that searchers were finding what they were looking for faster, and were therefore more satisfied.

Hey, if I was Google, I'd be like all, "More money for us = more satisfaction", too. But a good example of how speed is important, customer satisfaction rules, and so forth - this is not.

Greg Linden said...

I think there may be confusion about what Marissa meant by "traffic".

I don't think it was extremely clear from what she said during her Web 2.0 talk, but my impression is that she meant that searches decreased by 20% (not counting transitioning between pages of search results).

Number of searches seems like a pretty good measure of user satisfaction. It might be an even better measure if obvious refinements of a search (e.g. "web 2.0 conference" following "web 2.0 summit") were counted as one search.

I do agree that if she meant page views dropped by 20%, that is not obviously a good measure of user satisfaction.

But, I think by "traffic" Marissa meant searches. Google often appears to use searches, not page views, as a measure of traffic, and commonly talks about revenue per search, not revenue per page view.

Craig Danuloff said...

There's another possible reason, one that is probably more relevant: too much choice. The book 'Paradox of Choice' shows that the more choices you give people, the less likely they are to make ANY decision. Fascinating read with implications all over the place - including for web design.

Greg Linden said...

Great point, Craig, on the Paradox of Choice. I was thinking that as well and started to mention it in the post, but it ended up confusing the main point about load times, so I left it out.

But, yes, I suspect people may claim they want more choices and more information in their search results but, when they actually get more choices and more information, feel overwhelmed and confused.

It would be interesting to eliminate the difference in page load times and then test again. That might allow them to see if there is also a paradox of choice issue here.

Anonymous said...

I would like more detail on this test. A couple of things are bothering me about attributing the reduction in traffic and profit to page load alone:

1) If you display more results per page a user is more likely to find what they are looking for on a single page request. (How often have you gotten to the 3rd page of SERPs without finding at least one link relevant to what you're looking for?) Less page requests needed == less traffic. Someone pointed out above that she may have meant distinct searches. That too can be explained by offering more results. If I don't find what I want on the first page I often refine my search, thus generating another distinct query. If I have 30 results I'm more likely to find what I want on one page without changing my search terms.

2) Google's revenues largely come from AdWords ads, which display toward the top of the page. If you increase the page length, then a user will end up scrolling down past the ads, thus no profit.

Anonymous said...

How stupid can this test be?

User got 10 results in .4 seconds

compared to:

User got 30 results in .9 seconds.

Does anybody spot the obvious error here? In order for the first user to get 30 results, he needs 0.4*3 = 1.2 seconds if all downloads are running 100% OK.

Of course if you hit the user with more ads he will act accordingly; but what has this to do with usability?

I, like maybe 95% of the geeks out there, set the number of search results to 100 as soon as I get my hands on a new machine. If Google ever drops this feature, it's bye-bye for me.

Anonymous said...

Gee.. sounds like real conclusive evidence there.

Three major variables changed (speed, number of results, and ratio of ads to search results), not to mention the layout of the page grew to three times the length, and she only focuses on the speed as the reason?? I thought employees of Google were supposed to be smarter than that!

Anonymous said...

commenting on an ancient post:

My perception may be skewed, but I generally don't make it all the way down to the 10th result. I prefer to click on one of the first 5 or so links. Whatever's on my screen, really. I can usually tell by reading the first couple of results whether or not I'm going to find what I'm looking for easily. I do often refine my searches many times, but continue to only look at the first few.

Len Bullard said...

IBM did studies on interactive interfaces in the 80s. The results showed that as the response time dropped toward two seconds, the time the user spent staring at the screen increased. The curve turned up dramatically below two seconds. Below one second, someone had to pry them off the keyboard. And that was with non-rich terminal applications.

You don't have to go far in behavioral science to figure this one out. I wouldn't geek over it too much. Faster is better but it cuts both ways. It is just attention as a product of stimulus/response. Faster torture amplifies the revulsion too. In the balancing act, slower but better search results are still better than faster nonsense.

Len Bullard said...

More choices: that is Shannon's classic signal/noise. Information is that which reduces choice. The more choices, the less information.

No new news in Mayer's comment or the Paradox of Choice. However, it means that any improvement in the means or skills for selecting search terms reduces the choice by increasing the available information. Google should have tried building a few training games into Lively or something like it that sharpens the ability of the user to select the initial search terms.

I seldom need more than ten hits unless I am intentionally widening for 'creative search' where serendipity is used to create something new from the results (a favorite trick for using YouTube when looking for a vid to embed in a blog topic). IOW, I weaken the vector and look for outliers.

Anonymous said...

I think it is a common mistake to conclude that pages with AJAX or Flash are slower. By slower, people must mean that the single page takes longer to load than, say, an average page or a similar-looking page.

Ajax and Flash displace the "page" metaphor and this is what most critics do not seem to grasp. There is no page when AJAX or Flash are present. So you can't compare a page without such technologies and one with. It would be like comparing apples to oranges. Like comparing a sedan to a bus and concluding that the sedan is faster.

For the comparison to be fair, the AJAX page would have to be compared to the SET of pages that it replaces. Without AJAX or Flash, accessing that information would require loading, sequentially, a succession of pages. The compound load time of those single pages is what Flash and AJAX replace, not the single page. Single pages DO NOT NEED asynchronous technologies, because they are, umm, single.

Because both Flash and AJAX avoid redundant loading by loading upfront and reloading only new data, and because they can effectively manage the user experience during load times (something HTML can't do), they can actually minimize the compound load time and, more importantly, the PERCEIVED load time, when properly designed and implemented.
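As a toy illustration of the compound-load-time point (the numbers here are made up, not measurements):

full_page_load = 0.9    # seconds per traditional full page load (assumed)
upfront_load = 1.8      # heavier initial AJAX/Flash load (assumed)
async_update = 0.15     # each later in-place data fetch (assumed)
views = 5               # page views in the user's whole task

plain_html = views * full_page_load
ajax_style = upfront_load + (views - 1) * async_update
print("plain HTML: %.1f s, AJAX-style: %.1f s" % (plain_html, ajax_style))
# prints: plain HTML: 4.5 s, AJAX-style: 2.4 s

With those assumed numbers, the AJAX-style flow wins once the task spans more than a couple of views, which is the point: the fair comparison is over the whole task, not a single page.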

Mike P. said...

Research from the 80s and the PLATO Project showed that response time really needs to be about .25 second. For something where users expect a bit of work by the computer, stretching that to .4 I guess is OK. But I would shoot for .25 sec.

Mike

callingshotgun said...

@Calin-

The math is wrong: you're assuming that search results are the only thing responsible for the page load, ignoring graphics, other text (links, whatever), tracking code that may be on the page, etc.

Let's say "x" = the time required for a unit of 10 results, processed and displayed (along with any AdSense related to the results).
"c" will be the time taken to load everything else on the page not related to the results.
From what Marissa said:
c + x = .4
c + 3x = .9

Tripling the first equation and subtracting the second gives us:

3c + 3x = 1.2
- (c + 3x) = -.9
----------------
2c = .3
c = .15

x + .15 = .4
so x (the unit for 10 results) = .25 seconds

Which, now that I've done that, could have been figured out just as easily by saying "20 more results add .5 seconds, so 10 results add .25 seconds, so the rest of the page takes .15 seconds".

Anyway, just an FYI, it's reasonable to assume they didn't screw up the math. :P
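(If anyone wants to double-check the arithmetic, it fits in a few lines of Python:)

# page_time = c + (results / 10) * x, with c + x = .4 and c + 3x = .9
x = (0.9 - 0.4) / 2                        # subtracting the equations: 2x = .5
c = 0.4 - x                                # back-substituting into c + x = .4
print("c = %.2f s, x = %.2f s" % (c, x))   # c = 0.15 s, x = 0.25 s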

Unknown said...

As others have said, speed certainly matters, but I too am dubious about the Google example.

Of course, I also use Greasemonkey to get an "infinite scrolling" Google search page: as I scroll down, Greasemonkey fetches the next page of results and displays them at the bottom of the current set, so that I just never hit the bottom of the page.

Unknown said...

I disagree with Isaac Rivera. I have often given up on Flash sites because they take too long to load initially. I would rather lose some time in between pageviews.

Anonymous said...

Traffic and revenue may have dropped by 20%, but wouldn't that just be because there aren't 3x the ads on a 30 result page and you make fewer requests by not clicking through as many pages of results?

Anonymous said...

How about if they tried new searching algorithm X, which takes 10% longer but gives 20% better results - would that lead to "lower satisfaction" too? As someone else mentioned, fewer searches required does not mean less satisfaction. If you don't get a result on page 1, you'll do one of two things: search for something more specific, so that your original search of "test page" becomes "test page -rabbits +ducks", or go on to page 2. Raising the result count to 30 removes the need for page 2 AND page 3, eliminating the second option, and increases the probability that the results a more specific search would yield show up anyway.

I have Google set to give me 100 results at a time, and by the time I get through the first 10, the next 90 will have loaded.

This test doesn't sound at all representative; there are too many other things that could affect this.

Anonymous said...

I usually scroll down the first page, and if I don't get what I'm looking for, I refine my search and try again. With more front-page results, it's more likely I will find it on the front page and less likely I will search again. That's what this experiment would show if I was in it; maybe other people do the same.

likeem said...

This makes PERFECT SENSE.

It's not about how much time is spent in total, but rather how much the cognitive process of surfers is disrupted between the request and the response.

If there's enough of a delay that users actually NOTICE it, there will be A LOT LESS customer satisfaction than when it's "imperceptibly" small. It's not linear, because there's a lower threshold of about 0.2 seconds before any delay is noticed at all. Against that threshold, 0.4 seconds is roughly 2x the perceptible delay and 0.9 seconds is 4.5x the perceptible delay.

Then there are distractions. If a user is interrupted, there is enough time for other cognitive processes to start that take priority over the original task. Having zero opportunities for loss of interest BEFORE the results are presented is also HUGE.

This matters a LOT more than what "makes sense", but that's the way the brain works.

Also, when companies the size of Google/Amazon are willing to share something, they've got both sufficient sample size & methodology to be certain. Disregard this if you have better facts... not because you've got some compulsive need to object.

vincenzio said...

re: Isaac Rivera's comments: Of course the sedan is faster than the bus! Is it an unfair comparison? No one cares if it is, when the desire is to get there faster and all other requirements take a back seat.

The rewards of flash and ajax aren't there if the site/page never gets viewed. I know if I navigate to a flash based site, I either click the html link or leave. I don't wait for whatever nonsense the 'designer' has chosen to bless me with.

Anonymous said...

Sure, the Google test is easy to pick apart and discount, but the interesting part of this article for me is the Amazon test.

"we tried delaying the page in increments of 100 milliseconds and found that even very small delays would result in substantial and costly drops in revenue"

They only altered speed. They didn't alter the number of results or the ad/result ratio. It shows a correlation between speed and revenue much better than the Google test does.

I can already picture some middle manager demanding speed increases at the cost of result quality... hooray for metrics.

Anonymous said...

Yeah, and decapitating your customer is a good way to lose repeat business. It's great to see solid research lend support to the obvious. Really!

Unknown said...

Tangential support for this argument:

There were 2 (by my count, maybe I missed some) 1/2-second ads run locally in this year's Super Bowl.

Like the blink of an eye, but definitely noticeable.

Anonymous said...

Why do you make me wait for the end result of your post? Why not just say, "Fast=good, slow=suck. NO MATTER WHAT YOU THINK YOU WANT, PEON." If you are going to praise brevity for the sake of expedience, at least live by that standard.

Robert Robbins said...

What? Are you saying Amazon.com has ever been concerned about how fast their web site loads? That has got to be the most bloated site on the Internet except for eBay. I have to keep Task Manager open and kill at least 6 processes just to give my browser enough CPU to open that resource pig.

Anonymous said...

Hah - yeah, not very scientific. I'm sure it's because users are finding what they want on the 1st page now, in 2 total page views instead of 4.

Anonymous said...

Does anyone know where I can watch this talk? I found a Web 2.0 Marissa Mayer video on YouTube, and none of this is mentioned in it.

Greg Linden said...

Which Web 2.0 talk did you find, Anonymous? There have been multiple Web 2.0 conferences and Marissa has spoken at many of them.

As you can see from the date on the post, the one I was referring to is the 2006 Web 2.0 conference.

Anonymous said...

Greg, can you post a link here to the video where Marissa talks about page load time at the conference? I can't find it either.

Greg Linden said...

I don't know if Marissa's Web 2.0 talk was recorded. If you find a video of it, please do post a link to it in the comments here, but I don't know of one.

Greg Linden said...

By the way, if you are not actually looking for the video but merely confirmation that Marissa actually said this, not only are there other reports on Marissa's 2006 Web 2.0 talk (e.g. [1] [2] [3] [4] [5] [6]), but also Marissa has said similar things in later talks (e.g. her "Scaling Google for the Everyday User" talk at the Seattle Scalability Conference in 2007, see around 13:00 in the video).

jeremy said...

FWIW, it's now three years later (2009 rather than 2006) and Google's new "Fast Flip" search interface shows 30 results to the page, rather than 10.

Interesting development.

Anonymous said...

The new Google Maps? Well, the latest version is an overbloated time hog that is processor-crippling, in my experience (with the dangling overlay parts).