Bing and Google Agree: Slow Pages Lose Users

Today at the Velocity Conference, representatives of Google Search and Microsoft’s Bing teams, Jake Brutlag and Eric Schurman respectively, presented the results of user performance tests. The talk was entitled The User and Business Impact of Server Delays, Additional Bytes, and HTTP Chunking in Web Search. These long-term tests were designed to find out which aspects of performance matter most. To know how to improve their sites, both Bing and Google need to know which tweaks to page load perception and reality help or hurt the user experience. This is one of the first performance studies backed by actual data rather than anecdotes. The numbers may seem small, but if you are dealing in millions or billions they add up quickly.

Here are Brutlag’s and Schurman’s final points:

  • “Speed matters” is not just lip service
  • Delays under half a second impact business metrics
  • The cost of delay increases over time and persists
  • Use progressive rendering
  • Number of bytes in response is less important than what they are and when they are sent

Server-side Delays Test:

Server-side delays that slow down page delivery can significantly and (more importantly) persistently reduce usage by the users in the test. Both Bing and Google ran similar tests that support this claim.

[Figure: Bing server-side delay results]

Bing’s test: Bing delayed server responses by amounts ranging from 50ms to 2000ms for different test groups of users. You can see the results of the tests above. Though the numbers may look small, they represent large shifts in usage, and applied over millions of searches they are very significant to usage and revenue. The results of the test were so clear that it was ended earlier than originally planned. The Time To Click metric is particularly interesting: as the delays get longer, Time To Click increases at a more extreme rate (a 1000ms delay increases it by 1900ms). The theory is that the user gets distracted and disengages from the page. In other words, the site has lost the user’s full attention and has to win it back.
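
Neither team published their test harness, but the mechanics are easy to picture. Below is a rough sketch in Python (with made-up delay buckets and a hypothetical X-User-ID header for bucketing; this is not Bing’s or Google’s actual code) of how an artificial server-side delay can be injected for randomly assigned groups of users:

    import hashlib
    import time

    # Hypothetical delay buckets (ms), loosely mirroring the ranges described above.
    DELAY_BUCKETS_MS = [0, 50, 200, 500, 1000, 2000]

    def assign_bucket(user_id: str) -> int:
        """Deterministically map a user to a delay bucket so the same user
        always sees the same delay for the duration of the experiment."""
        digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
        return DELAY_BUCKETS_MS[int(digest, 16) % len(DELAY_BUCKETS_MS)]

    def delay_middleware(app):
        """WSGI middleware that sleeps before handing the request to the real app."""
        def wrapper(environ, start_response):
            user_id = environ.get("HTTP_X_USER_ID", "anonymous")
            delay_ms = assign_bucket(user_id)
            if delay_ms:
                time.sleep(delay_ms / 1000.0)
            return app(environ, start_response)
        return wrapper

Because the bucket is derived from a hash of the user ID, each user sees a consistent delay across sessions, which is what makes it possible to observe effects that persist over weeks.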

[Figure: Google server-side delay results]

Google’s Test: Google ran a similar experiment, testing delays ranging from 50ms to 400ms. The chart above shows the impact on users over the 7 weeks they were in the test. The most interesting thing to note is the continued effect the experiment had on users even after it had ended: some of the users never recovered, especially those subjected to the larger 400ms delay. Google tracked the users for an additional 5 weeks (for a total of 12).

(I’ve included more on the other tests after the jump.)

The raw numbers from Google’s server-side test:

[Table: Google server-side delay test results]

Progressive Rendering

[Figure: progressive rendering]

Progressive rendering can yield huge performance and user-satisfaction gains. Bing ran a test where they sent down their easy-to-compute-and-serve page header before they sent down the results, using Chunked Transfer Encoding to deliver the bits. This resulted in more user engagement, presumably due to the immediate visual response. Hopefully those gains outweigh the development cost of implementing a more sophisticated application.
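
Bing didn’t share code for this, but the technique itself is simple to demonstrate. Here is a minimal sketch in Python (a toy handler, not Bing’s implementation) that flushes a cheap-to-render header chunk immediately over chunked transfer encoding and sends the expensive results once they are ready:

    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ProgressiveHandler(BaseHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # chunked transfer encoding requires HTTP/1.1

        def _write_chunk(self, data: bytes):
            # Each chunk is prefixed with its length in hex, per the HTTP/1.1 spec.
            self.wfile.write(f"{len(data):X}\r\n".encode() + data + b"\r\n")
            self.wfile.flush()

        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Transfer-Encoding", "chunked")
            self.end_headers()

            # Flush the cheap-to-compute page header right away...
            self._write_chunk(b"<html><body><div id='header'>Search</div>")

            # ...while the expensive part (the results) is still being produced.
            time.sleep(0.5)  # stand-in for backend work
            self._write_chunk(b"<div id='results'>...results...</div></body></html>")

            # A zero-length chunk terminates the response.
            self._write_chunk(b"")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ProgressiveHandler).serve_forever()

The early flush is the whole trick: the browser can start parsing and painting the header while the backend is still working on the results.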

The test results:

[Figure: Bing progressive rendering results]

Page Weight Increase Test:

[Figure: page weight increase results]

Increasing page weight had relatively little effect on users. Bing Search conducted a test where they added increasingly large HTML comments to the page. Their normal page size is 10KB (gzipped), and they added up to 32KB. Only at this highest addition did they notice a small decrease in clicks. The test was conducted for three weeks, in the US only. This test is the most search-specific of them all: search pages are among the fastest served and lightest, and because the additional payload was just comments, no rendering cost was added in the test.
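
This kind of padding experiment is easy to reproduce. As a rough illustration (Python, a hypothetical helper rather than Bing’s actual harness), here is one way to append an HTML comment of a chosen size without adding any rendering work:

    import random
    import string

    def pad_with_comment(html: str, extra_kb: int) -> str:
        """Append an HTML comment of roughly extra_kb kilobytes.

        Random filler is used so the padding survives gzip; a run of one
        repeated character would compress down to almost nothing. Comments
        add bytes on the wire but no extra layout or rendering work.
        """
        if extra_kb <= 0:
            return html
        filler_len = max(extra_kb * 1024 - len("<!--  -->"), 0)
        filler = "".join(random.choices(string.ascii_letters + string.digits, k=filler_len))
        # Insert just before </body> so the visible markup is untouched.
        return html.replace("</body>", f"<!-- {filler} --></body>")

Whether the filler should survive gzip depends on whether the added kilobytes are counted before or after compression; the talk did not spell that out, so this is just one plausible setup.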

The slides for the talk have been posted. Video will be up shortly.

Comments:
  • http://scottru.com/ Scott Ruthfield

    Eric & Jake gave a great talk.

    One of the first questions that business leaders ask technical teams worried about performance is “why should I worry?” – getting real data on the value of improving performance is hard (you can’t really simulate it to your users), and we usually end up relying on anecdotes that fly around the web rather than real data.

    Kudos to both Microsoft and Google for going in reverse and evaluating the one-time and ongoing cost of degrading performance, and making that data public. Now we have some real data to work with!

    It’s also worth noting that while the differences they saw were slight in % terms, their performance differences were also slight – most sites I talk to aren’t worrying about 200ms or 400ms, they’re worrying about 4000ms or 8000ms.

  • http://radar.oreilly.com/jesse/ Jesse Robbins

    New features that slow a website down… aren’t features at all. ;-)

    -Jesse

  • http://friendfeed.com/shotzombies Mike White

    i hate slow pages.

  • http://www.google.com/profiles/preston.h.austin Preston Austin

    Back when a load time of under a few seconds was still considered a “good” page, and AOL dial-up mattered a lot and still wreaked havoc with everything (late ’90s), I was developing lots of little activities for a very popular children’s web site. We saw traffic in excess of 5 million dynamic hits on a good afternoon and wrote our own stats package – so it was easy to measure fine changes in user behavior.

    In an effort to increase kids’ satisfaction with the site, we started experimenting with simplifying pages to get some kind of page showing _fast_ and then precaching subsequent content in certain non-game activities so that things would be gamelike – and happen near instantly on a click.

    It was pretty interesting – we discovered that we could readily increase all of: number of sessions, estimated repeat sessions, length of sessions, number of clicks per session, and, most interestingly, rate of clicks per session by pre-caching likely next moves while a child was viewing current content (we used various horrible hacks, and tools like Shockwave and later Flash, to do this). To really make a difference, we had to get the response to clicks down under about 200ms.

    Just another anecdote really, and the data are proprietary and likely forgotten by history, but these results governed my design approaches for years: if the interface isn’t fun and responsive, it’s bad.

    It’s nice to see some serious validation from a massive player who can afford to really study these things that these impacts are significant – and very interesting that they are enduring.

  • http://lantner.net/david/ David Lantner

    While the tests may have set a delay on the server side, the origin of the Firefox/Firebug add-on YSlow comes from Steve Souders’ research that identified the importance of client-side decisions, such as how/where/when to link to CSS, JavaScript and other page components. Google’s Firefox/Firebug add-on Page Speed continues in this trend, going so far as to analyze the grammar used within the CSS.

    These results provide useful data to support the (intuitive) assertion that speed matters – every millisecond – since they quickly add up to a degraded experience.

  • http://www.Cannabis.eu.com FaTe

    I agree with Preston. The approach to page loads and pre-formatted design is massively overlooked in today’s design and construction. Everyone seems to have assumed that since the launch of cable, wireless, ADSL, etc., all pages will just load instantly.

    Personally, user monitoring and tracking is also what gives me the most information when it comes to page refining and site design, without relying on silly 3rd-party tools.

    Meanwhile, all these pages are being gzipped for the compression ratio, even though the browser decompression time weighed against the time saved transferring data TO the browser is so minimal that sometimes it takes the server longer to generate the compressed content than the compression benefits the speed of the site overall.

    Those of you refining your pages and such, I would seriously recommend looking into Firefox’s Firebug

    http://getfirebug.com/

    Use the speed tool to evaluate your page’s content, speed and more, while it provides relevant information on improving your page load speeds.

  • Gustavo Muñoz

    There is no need for such an evaluation to realize that; I think these people like to lose time, or maybe they just want to justify their “job”.

  • Prankster

    This page took forever to load.

  • http://www.altvirtual.com naysh

    Well said Mike White.

    Everybody hates slow pages.

  • http://tantek.com/ Tantek

    Thanks for this summary Brady.

    It’s nice to see the anecdotal wisdom that a lot of us share confirmed by *two* independent data-heavy studies.

    It’s also nice to know that it’s not just us fast-twitch video-game-trained MTV-generation-short-attention span folks that can’t stand waiting fractions of a second, but rather, a measurable amount of web users in general.

    I will be citing this summary in future design/development work and talks.

  • http://www.topcoder.com/tc?module=MemberProfile&cr=22647580 Anil Kishore

    Also, there should be a focus on middleware (in networking), instead of just focusing on the server side and browser side.

  • http://www.keynote.com Vik Chaudhary

    Brad – awesome post. I attended the talk, and really liked your summary of it.

    Preston – what is your web site? I am impressed that you needed to get your page response time (or was it click response time?) down to 200 ms. I’d like to try it to see what this snappy performance feels like. Not too many sites aim for this level of sub-second performance.

    -Vik

  • Richard Hirtle

    Good work.

    It is about time webmasters and clients were taken to task for allowing such painfully slow pages.

    Every website and webpage I look after is faster than 95% of all others. Thanks for the info Google!

    I look forward to being able to tell the customers who said that page load time did not really matter that, yes, it does.

    Richard

  • Nealoren

    Thanks for the information.

    But can you tell me how to minify JavaScript and CSS on sites built with ASP.NET and WordPress?

  • Andy

    I agree with Preston, to be honest; he seems to be spot on in my opinion.
    http://www.reviewanygame.com

  • Brian Garvin

    I agree with Jake and Eric. When I had a WP language translation plug-in installed on my blog, it took at least 10 seconds longer for any page on it to load its data. Consequently, I looked in Google Analytics for that week and noticed my bounce rate jumped from about 40 to 90 percent during the time this plugin was installed. So the moral of this story in layman’s terms is: if you make your visitors wait and your site takes too long to load, they will leave, and probably not come back.


  • http://www.techout.com TechoutNJ

    When a company’s web traffic decreases, the first thing a business owner should look at is the server speed. A slow server increases bounce rate, which decreases revenue or the growth of a database. This idea of a slow site losing customers really embodies the phrase “Time is money”.

  • http://www.fitnessassessment.net/reviews/best-adjustable-dumbbells-bowflex-selecttech-552-dumbbells-review/ Harry Love

    Some very interesting statistics here. Page load is very important and it’s something I’ve been meaning to look into for some time now for all my sites. Transferring my site over to a cloud CDN has helped a lot.
