Steve Souders asks: "How green is your web page?"

Steve Souders, my Velocity conference co-chair and author of High Performance Web Sites, gave me permission to repost this great analysis:

How green is your web page?

Writing faster web pages is great for your users, which in turn is great for you and your company. But it’s better for everyone else on the planet, too.


Intrigued by an article on Radar about co2stats.com, I looked at my web performance best practices from the perspective of power consumption and CO2 emissions. YSlow grades web pages according to how well they follow these best practices. What if it could convert those grades into kilowatt-hours and pounds of CO2?

Let’s look at one performance rule on one site. Wikipedia is one of the top ten sites in the world (#9 according to Alexa). I love Wikipedia. I use it almost every day. Unfortunately, it has thirteen images on the front page that don’t have a far future Expires header (Rule 3). Every time someone revisits this page, the browser has to make thirteen HTTP requests to the Wikipedia server to check whether these images are still usable, even though these images haven’t changed in over seven months on average. A better way to handle this would be for Wikipedia to put a version number in each image’s URL and change the version number whenever the image changes. Doing this would allow them to tell the browser to cache the image for a year or more (using a far future Expires or Cache-Control header). Not only would this make the page load faster, it would also help the environment. Let’s try to estimate how much.

  • Let’s assume Wikipedia does 100 million page views/day. (I’ve seen estimates that are over 200 million/day.)
  • Assume 80% of those page views are done with a primed cache (based on Yahoo!’s browser cache statistics). We’re down to 80M page views/day.
  • Assume 10%, no, 5% of those are for the home page. We’re down to 4M page views/day for the home page with a primed cache. Each of those contains 13 HTTP requests to validate the images, for a total of 52M image validation requests/day.
  • Assume one web server can handle 100 of these requests/second, or 8.6M requests/day. That’s six web servers running full tilt year-round to handle this traffic.
  • Assume a fully loaded server uses 100W. Six servers, year-round, consume about 5,000 kilowatt-hours per year, which works out to several thousand pounds of CO2 emissions at a typical US grid average of roughly one pound of CO2 per kilowatt-hour.
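The arithmetic in the bullets above can be reproduced in a few lines. All the inputs are the stated assumptions, not measured data:

```python
# Back-of-envelope reproduction of the estimate above.
page_views_per_day = 100_000_000   # assumed total Wikipedia page views/day
primed_cache_rate = 0.80           # share of views with a primed browser cache
home_page_share = 0.05             # share of those hitting the home page
validations_per_view = 13          # images lacking a far future Expires header

requests_per_day = (page_views_per_day * primed_cache_rate
                    * home_page_share * validations_per_view)
print(f"{requests_per_day / 1e6:.0f}M validation requests/day")  # 52M

requests_per_server_per_day = 100 * 86_400  # 100 req/s around the clock
servers = requests_per_day / requests_per_server_per_day
print(f"{servers:.1f} servers running full tilt")  # ~6.0

watts_per_server = 100
kwh_per_year = servers * watts_per_server * 24 * 365 / 1000
print(f"{kwh_per_year:,.0f} kWh/year")  # ~5,270
```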

I think this is a conservative estimate, but there are a lot of assumptions above. And six servers doesn’t sound like a lot; 5,000 kilowatt-hours is a drop in the bucket compared to overall data center power consumption. But this was just one rule on one page of one site. Think about the impact of not gzipping, not minifying JavaScript, wasteful redirects, and bloated images. If we extrapolate this across all the performance rules across all sites, the numbers are much bigger.

Make your pages faster. It’s good for your users, good for you, and good for Mother Earth.

-Steve

Steve has a SXSW book reading on Saturday @11 AM, and will be at the O’Reilly booth on Sunday from 3:30-4:30. Stop by and say hello!

