Thursday, June 11, 2009

How does your site stack up?

I pulled together the results of the tests that have been run on WebPagetest over the past year and did a bunch of aggregate analysis. There were over 24,000 unique URLs tested in that time, so the data is a pretty wide sampling across different types of sites. You can see the full details here: http://www.webpagetest.org/forums/thread-22.html

One of the particularly useful things you can do is look at your own test results and compare them in the distribution graphs to see where you land. For example, are you slower than 95% of the sites that were tested? How do the number of requests and bytes stack up? How about the optimizations?

Looking at straight averages across all of the tests:

Load Time: 10.1 seconds
Time to First Byte: 1.1 seconds
Time to Start Render: 3.8 seconds

Page Size: 510 KB
Number of Requests: 50
Number of Redirects: 1

Perhaps more interesting are observations on the distributions:

Page Measurements

Load Time: 35% of the sites took longer than 10 seconds to load (and there's a pretty long tail that goes out to 60 seconds, with 5% of the sites taking longer than 30 seconds). On the positive side, 33% of the sites loaded in under 5 seconds.

Time to First Byte: Looked surprisingly good with 76% of sites coming in under 500ms. More confirmation that the back-end on most sites works well and the work needs to be done on the front-end (content). That said, 9% took over 2 seconds so there are some sites that still have some back-end work to do.

Time to Start Render: There is a lot of room for improvement here, and this is probably one of the most useful measurements (and unique to Pagetest). The user doesn't see anything display before this point, so it doesn't matter how fast the back-end is if a lot of js and css loading in the head prevents the page from rendering (even worse is not much code but lots of files). 60% of the sites take over 2 seconds to start rendering, with 20% taking over 5. If you're going to focus on optimizing anything, this is the first number you should be looking at (see the sketch below).
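As a rough illustration of the difference (the script names here are hypothetical), the same files can either block start render from the head or load after the content is already visible:

    <!-- Blocking: nothing renders until both scripts download and execute -->
    <head>
      <script src="menu.js"></script>
      <script src="tracking.js"></script>
    </head>

    <!-- Better for start render: scripts at the end of the body -->
    <body>
      ...page content...
      <script src="menu.js"></script>
      <script src="tracking.js"></script>
    </body>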

Page Size: I feel sorry for anyone still using dial-up. 30% of the sites were over 500KB.

Number of Requests: This is usually the most impactful measurement because most of the time spent making a request is overhead (connection setup and latency) rather than actually downloading content, so the more requests on the page, the more time is wasted not downloading content. 33% of the sites have 50 or more requests, with 12% having 100 or more (and a really scary tail out to 400).

Number of Redirects: 66% of the sites had no redirects and in general things looked really good. The 2% of sites with over 8 redirects should probably look at reducing them though.

Optimizations

I won't go through all of them but I will hit the high points.

Most sites are doing a good job with persistent connections. Only 5% of the sites are not leveraging keep-alives at all.
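For the sites in that 5%, this is usually just a server setting. A minimal sketch for Apache (assuming Apache; the values shown are common defaults, not recommendations):

    # Reuse each TCP connection for multiple requests instead of
    # paying the connection setup cost on every object
    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout 5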

On compression, 50% of the sites could save 50% or more of their text bytes by enabling gzip compression. This helps the end user and it saves bytes on the wire, which goes directly to the bandwidth costs that site owners have to pay.
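Turning it on is typically a one-line change. A sketch using Apache's mod_deflate (assuming the module is available; the MIME type list is illustrative):

    # Compress text responses on the fly (requires mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/javascript text/plain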

The biggest impact for most sites comes from combining several js and css files into a single file (of each type). This goes directly to the start render time, and 20-30% of the sites have a large number of files to combine.
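As a hypothetical before/after (the file names are made up), combining can take the head of a page from six blocking requests down to two:

    <!-- Before: six requests, each paying its own round-trip overhead -->
    <link rel="stylesheet" href="reset.css">
    <link rel="stylesheet" href="grid.css">
    <link rel="stylesheet" href="theme.css">
    <script src="lib.js"></script>
    <script src="ui.js"></script>
    <script src="app.js"></script>

    <!-- After: one combined file of each type -->
    <link rel="stylesheet" href="site.css">
    <script src="site.js"></script>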

The last one I'll touch on is caching of static assets. A full 25% of the sites don't use any expires or cache-control headers at all. That makes the repeat view of the site almost as slow as the first view (and makes a lot of unneeded requests to the site). If you don't want people to keep coming back, this is a sure way to encourage that :-)
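A sketch of what fixing that could look like with Apache's mod_expires (assuming Apache; the one-month lifetime is just an example, pick one that matches how often your assets actually change):

    # Tell browsers they can cache static assets (requires mod_expires)
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"

With those headers in place, repeat views can pull the static pieces straight from the browser cache instead of re-requesting them.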

There are a ton of charts and a lot more data in the full analysis, so if you want more information head over there. I also have an Excel spreadsheet of the raw data available at the end of the analysis if you want to run any different kinds of analysis on it.

2 comments:

  1. Hi Pat,

    Very nice to read the outcome of your analysis. Great insight.

    There are 2 things that I want to comment on.

    1. Start Render.
    You write the following about Start Render: "...The user doesn't see anything display before this point...".
    This is correct. Also, it implies that the user *will* see something at Start Render. This is true, but only in theory. I've done many, many tests using Pagetest in IE7 (love the tool!) and very often the Start Render time is not the time after which the browser actually shows the first element on the page.

    I think it is correct to say that the Start Render time is the time after which the browser *could* display something.

    How exactly does PageTest calculate/measure Start Render time?


    2. Number of requests.
    FYI, I can say for sure that more than 50% of the high-traffic websites in The Netherlands have more than 50 requests. I've seen 164, 169, 148 ... this really is the big problem.

    Looking forward to your next post.

    - Aaron

  2. Thanks Aaron....

1 - yeah, it's basically the point before which I can guarantee that nothing is visible, but it doesn't guarantee that something actually is. On faster connections in particular, IE appears to delay rendering, so sometimes you don't see anything until close to doc complete.

Pagetest checks the width and height of the document and marks start render when they become non-zero. It's on my list to see if I can better detect something ACTUALLY being displayed.

2 - Interesting. It's scary how much the request count contributes to the overall time, so that is unfortunate. If the sites are primarily local then it may not hurt as much, but it's not a safe bet that nobody will be visiting from outside of the Netherlands.

