Measuring Typical User Bandwidth
Estimates are no match for real data when it comes to user experience. Find out more about tools that Yahoo! and Facebook have built to measure real user bandwidth and learn how you could do something similar for your site too.
I'm part of the team at Betfair that is building our new site platform, which aims to significantly improve every aspect of our site. A major priority for the new platform is performance. It's one thing to say it's fast, but it's an entirely different thing to prove it. To be specific, we've committed to the following:
> Under peak loads, with performance measured at the 95th percentile, for typical user bandwidths and a 0% error rate, our users shall experience Visual Progress (header loaded) in less than 1 second, Time to Interact with useful content within 1.5 seconds, and full page loads within 3 seconds.

(Betfair Customer Commitment)
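Measuring at the 95th percentile means 95% of page loads must beat each target, not just the average load. A quick sketch of that check against hypothetical, randomly generated timing samples:

```python
import random

# Hypothetical page-load times in seconds, standing in for what
# you'd collect from real-user monitoring beacons.
random.seed(42)
load_times = [random.uniform(0.5, 3.5) for _ in range(1000)]

def percentile(samples, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(samples)
    index = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[index]

p95 = percentile(load_times, 95)
print(f"95th percentile load time: {p95:.2f}s")
print("Meets the 3s full-load commitment:", p95 <= 3.0)
```

Note that the 95th-percentile figure is driven entirely by your slowest users, which is exactly why knowing their real bandwidth matters.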
There are a bunch of interesting points in that commitment, but the one I want to dive into now is what "typical user bandwidth" means. What is a typical user? How do we know that? Are we taking a best guess, or is it measured? Unsurprisingly, this detail matters: if we use the wrong value for "typical user bandwidth" in our performance tests, we may think we're giving our users a fast site when we're actually not. And there is plenty of evidence showing the implications of making the user wait. To mention just a few:
- 57% of online consumers will abandon a site after waiting 3 seconds for a page to load
- 8 out of 10 people will not return to a site after a disappointing experience
- Of these, 3 will go on to tell others about their experience
For many sites, performance testing is done using a reasonable estimate of what a typical user's bandwidth might be. This can be determined fairly easily if your site visitors are demographically similar or concentrated in a well-known geographic region. What do I mean by this? Well, if your typical user is a software engineer living in London and earning a decent salary, it's fairly reasonable to assume they have a decent internet connection ("decent" being determined a little more scientifically than that, of course). Or if your users are mostly based in Africa, it's reasonable to assume they have relatively low-bandwidth internet connections, with high latency on top if your site is hosted far away. When you have a very large user base spread across the world, though, it's nearly impossible to estimate accurately in this way. If you're able to take real measurements, that's always preferable.
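To see why both bandwidth and latency matter, a back-of-envelope estimate of page load time is a few round trips of latency (DNS, TCP, request) plus the raw transfer time. This is a deliberately crude sketch with made-up connection profiles, not a model of real TCP behaviour:

```python
def estimated_load_seconds(page_kb, bandwidth_kbps, rtt_ms, round_trips=4):
    """Crude page-load estimate: a few round trips of latency
    plus transfer time. Purely illustrative."""
    latency = round_trips * rtt_ms / 1000.0
    transfer = page_kb * 8 / bandwidth_kbps  # kilobits over kilobits/sec
    return latency + transfer

# A 500 KB page on a 2 Mbps link with 50 ms RTT...
fast = estimated_load_seconds(500, 2000, 50)
# ...versus the same page on a 256 kbps link with 300 ms RTT.
slow = estimated_load_seconds(500, 256, 300)
print(f"fast connection: {fast:.1f}s, slow connection: {slow:.1f}s")
```

Even this toy model shows how badly a wrong bandwidth assumption can skew your test results: the same page can differ by an order of magnitude between the two profiles.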
The best place to start when looking for measurement tools is large sites with global user populations. The likes of Facebook and Yahoo! are no strangers to web performance optimisation, and there's a lot to be learned from their experiences. There's an excellent article on how Facebook built Doppler to measure the network between them and their users. It doesn't appear to be an open-source project, but there's enough detail in the article to write your own. Before you dive into creating your own version of Doppler, though, you should check out Yahoo!'s Boomerang. It provides a whole bunch of useful features, including measuring perceived performance, HTTP/network/DNS latency, bandwidth, and more. And it's free (source hosted on GitHub).
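Boomerang reports its measurements back to your servers as beacon requests, so aggregating them is essentially log processing. A minimal sketch, assuming hypothetical beacon URLs in which bandwidth arrives as a `bw` parameter (bytes/sec) and latency as `lat` (ms); the actual field names depend on which Boomerang plugins you enable, so check what your build really sends:

```python
from statistics import median
from urllib.parse import parse_qs, urlsplit

# Hypothetical beacon URLs as they might appear in an access log.
beacon_log = [
    "/beacon?bw=240000&lat=85",
    "/beacon?bw=1200000&lat=40",
    "/beacon?bw=90000&lat=310",
]

def parse_beacon(url):
    """Extract (bandwidth, latency) from one beacon request URL."""
    params = parse_qs(urlsplit(url).query)
    return int(params["bw"][0]), int(params["lat"][0])

samples = [parse_beacon(url) for url in beacon_log]
median_bw = median(bw for bw, _ in samples)
median_lat = median(lat for _, lat in samples)
print(f"median bandwidth: {median_bw / 1000:.0f} kB/s, "
      f"median latency: {median_lat} ms")
```

In practice you would bucket these by geography, connection type, or time of day rather than collapsing everything into one median.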
The statistics you gather by doing this should ideally be used for performance testing and simulation using tools like Charles. This will give you a far better understanding of what your users experience when they visit your site, which hopefully drives you to improve their experience however you can.
Go measure something
I hope this has sparked some interest in the topic of web performance optimisation, or given you some new ideas about how to accurately measure what your users are experiencing. Ultimately, these stats should be used to improve the service you offer to your users. Check out the links above to see the implications of getting this wrong; it's far too important to ignore.
So now you have your fancy new "internet tape measure"... go measure something!