Google Fellow Jeff Dean gave an excellent keynote talk at the recent WSDM 2009 conference that had tidbits on Google I had not heard before. Particularly impressive is Google's attention to detail on performance and their agility in deployment over the last decade.
Jeff gave several examples of how Google has grown from 1999 to 2009. They have x1000 the number of queries now. They have x1000 the processing power (# machines * speed of the machines). They went from query latency normally under 1000ms to normally under 200ms. And, they dropped the update latency by a factor of x10000, going from months to detect a changed web page and update their search results to just minutes.
The last of those is very impressive. Google now detects many web page changes nearly immediately, computes an approximation of the static rank of that page, and rolls out an index update. For many pages, search results now change within minutes of the page changing. There are several hard problems there -- frequency and importance of recrawling, fast approximations to PageRank, and an architecture that allows rapid updates to the index -- that they appear to have solved.
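The recrawl problem alone is interesting: the crawler has to decide which pages are worth refetching and when. A toy scheduler might rank pages by importance, observed change rate, and staleness; the scoring rule and field names below are my own illustrative assumptions, nothing from the talk:

```python
import heapq

def recrawl_priority(importance, change_rate, age_seconds):
    """Toy scoring rule (an assumption, not Google's): pages that are
    important, change often, and have not been fetched recently float up."""
    return importance * change_rate * age_seconds

def schedule(pages, now, k):
    """Pick the k pages most worth recrawling right now."""
    scored = [(-recrawl_priority(p["importance"], p["change_rate"],
                                 now - p["last_crawl"]), p["url"])
              for p in pages]
    heapq.heapify(scored)
    return [heapq.heappop(scored)[1] for _ in range(k)]

# Hypothetical pages: a news front page, a blog post, a static FAQ.
pages = [
    {"url": "news.example/home", "importance": 0.9, "change_rate": 50.0, "last_crawl": 900},
    {"url": "blog.example/post", "importance": 0.3, "change_rate": 1.0, "last_crawl": 100},
    {"url": "static.example/faq", "importance": 0.5, "change_rate": 0.01, "last_crawl": 0},
]
print(schedule(pages, now=1000, k=2))  # ['news.example/home', 'blog.example/post']
```

A real scheduler would of course estimate change rate from crawl history and fold in something PageRank-like for importance, but the shape of the decision is the same.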
Their performance gains are also impressive, now serving pages in under 200ms. Jeff credited the vast majority of that to their switch to holding indexes completely in memory a few years back. While that now means that a thousand machines need to handle each query rather than just a couple dozen, Jeff said it is worth it to make searchers see search results nearly instantaneously.
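Holding the index in memory trades machines for latency: every query fans out to all the shards, and the partial results are merged. Here is a minimal scatter-gather sketch; the shard layout, scoring, and names are illustrative assumptions, not Google's actual design:

```python
import heapq
from collections import defaultdict

class IndexShard:
    """One machine's slice of the index, held entirely in memory."""
    def __init__(self):
        self.postings = defaultdict(list)  # term -> [(score, doc_id), ...]

    def add(self, doc_id, term, score):
        self.postings[term].append((score, doc_id))

    def search(self, term, k):
        # Return this shard's top-k candidates for the term.
        return heapq.nlargest(k, self.postings[term])

def scatter_gather(shards, term, k):
    """Fan the query out to every shard, then merge the partial results."""
    partials = [shard.search(term, k) for shard in shards]
    return heapq.nlargest(k, (hit for part in partials for hit in part))

# Tiny demo: 3 shards, documents assigned by doc_id % 3.
shards = [IndexShard() for _ in range(3)]
for doc_id, score in [(1, 0.9), (2, 0.4), (3, 0.7), (4, 0.8), (5, 0.2)]:
    shards[doc_id % 3].add(doc_id, "wsdm", score)

print(scatter_gather(shards, "wsdm", 2))  # [(0.9, 1), (0.8, 4)]
```

With the index in RAM, each shard's search is a hash lookup plus a small heap selection, so the end-to-end latency is dominated by the fan-out and merge rather than by disk seeks.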
The attention to detail at Google is remarkable. Jeff gleefully described the various index compression techniques they created and used over the years. He talked about how they finally settled on a format that grouped four deltas of positions together in order to minimize the number of shift operations needed during decompression. Jeff said they paid attention to where their data was laid out on disk, keeping the data they needed to stream over quickly always on the faster outer edge of the disk, leaving the inside for cold data or short reads. They wrote their own software recovery for errors in non-parity memory. They wrote their own disk scheduler. They repeatedly modified the Linux kernel to meet their needs. They designed their own servers with no cases, then switched to more standard off-the-rack servers, and now are back to custom servers with no cases again.
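That group-of-four format is worth sketching. The idea, as I understood it, is that one tag byte records the byte length of each of four deltas, so the decoder reads the tag once and slices at fixed offsets instead of doing a per-byte shift-and-test loop. A rough Python sketch under those assumptions (not Google's exact wire format):

```python
def encode_group(deltas):
    """Encode four position deltas as a group: one tag byte holding the
    byte length of each value (2 bits apiece), then the values themselves.
    Illustrative sketch only; real formats differ in detail."""
    assert len(deltas) == 4
    tag = 0
    payload = bytearray()
    for i, d in enumerate(deltas):
        nbytes = max(1, (d.bit_length() + 7) // 8)
        assert nbytes <= 4  # each delta fits in 1..4 bytes
        tag |= (nbytes - 1) << (2 * i)
        payload += d.to_bytes(nbytes, "little")
    return bytes([tag]) + bytes(payload)

def decode_group(buf):
    """Decode one group; returns (four deltas, bytes consumed).
    The tag is read once; each value is then a fixed-width slice."""
    tag, pos = buf[0], 1
    deltas = []
    for i in range(4):
        nbytes = ((tag >> (2 * i)) & 3) + 1
        deltas.append(int.from_bytes(buf[pos:pos + nbytes], "little"))
        pos += nbytes
    return deltas, pos

encoded = encode_group([1, 300, 7, 70000])
print(decode_group(encoded))  # ([1, 300, 7, 70000], 8)
```

Compared with a classic one-byte-at-a-time varint, the decoder does one branch per group of four instead of one or more per value, which is exactly the kind of decompression cost the talk was describing.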
Google's agility is impressive. Jeff said they rolled out seven major rearchitecture efforts in ten years. These changes often involved completely different index formats or totally new storage systems such as GFS and BigTable. In all of these rollouts, Google always could and sometimes did immediately roll back if something went wrong. In some of these rollouts, they went as far as to have a new datacenter running the new code and an old datacenter running the old, and switched traffic between the datacenters. Day to day, searchers were constantly experiencing much smaller changes from experiments and testing of new code. Google does all of this quickly and quietly, without searchers noticing anything has changed.
The raw computational power is staggering already -- thousands of machines for a single request -- but what is to come seems nearly unbelievable. Jeff said that Google's machine translation models use a million lookups in a multi-terabyte model just to translate one sentence. Jeff followed by saying that Google's goal is to make all information in all languages accessible regardless of which language you choose to speak. The amount of processing required is difficult to fathom, yet the kind of computational mountain that might cause others to falter seems to call out to Googlers.
In all, a great talk and a great start to the WSDM 2009 conference. If you want to catch Jeff's talk yourself, and I highly recommend you do, the good people at videolectures.net were filming it. With any luck, the video should be available on their site in a few weeks.
In addition to this post, you might enjoy Michael Bendersky's notes on Jeff Dean's talk. It appears Michael took notes about as detailed as mine.
Update: Jeff has now made the slides (PDF) from his talk available.
Update: Jeff Dean's talk is now available on videolectures.net along with many of the other talks from WSDM 2009.
11 comments:
Hi Greg -- thanks for the synopsis. Sounds like a terrific talk.
take care,
Jon
The coolest thing that Google has rolled out recently is displaying semantic data in searches. Do a search for a specific piece of information (e.g., the height of Michael Jordan) and the answer is displayed in the first result with the source (usually Wikipedia, although other random webpages are shown too). Google is understanding what you are asking for and displaying just that. I'd be curious as to how Google is doing this. It just seems so cool.
Thank you for the wrap-up; those are impressive numbers.
As a Google user, I would have to agree that searches feel faster, and as a webmaster, the increased frequency of index updates is much more useful.
This kind of innovation, scalability and flexibility is a function of the way Google has acquired talent over the last decade. Would be interesting to see some kind of a formula on these metrics and their relation to the pedigree and competency of their engineers.
This is all fascinating, it really is, but when all is said and done, when I search with snippets of song lyrics trying to seek the lyrics of the original song and artist, the top results invariably offer no lyrics at all, and only ringtones.
Google is still being "SEO'd".
Yep, Google has made impressive progress. Although, I would like to know whether Yahoo! Research is going to keep that pace... or at least at what level they are running now.
In fact, WSDM'09 is being held in Barcelona, home of Yahoo! Research, headed by Prof. Baeza-Yates, one of the eminences of the IR community.
I'm hoping Yahoo! presents great work too. Really.
@Sid:
Funny you should mention that query.
When I do a search on Google for "Height of Michael Jordan", the result is "Michael Jordan — Height: 6 FT 2 in (1.88 M)".
Clever, except that you probably meant the basketball player, who is actually 6 ft 6 in. Google gave you the result for the football/soccer player.
Cleverness can be misleading.
Both Yahoo and Live Search give the right answer in normal query results (to be fair, Google does too, but in the rush to be clever, they are more likely to mislead you than the others).
This sort of thing explains why MS is unlikely to ever catch up. Let's see MS do 10% of that in the same amount of time with the same number of engineers.
None of the ideas is unique or revolutionary, except the large-scale usage of GFS and MapReduce. Mike Burrows did much of the search architecture design in AltaVista well before Google.
The slides are now available. Slide 33 made my day :))
http://research.google.com/people/jeff/index.html
Adinel
Thanks, Adinel!