art with code

2013-05-16

Adventures in page speed optimization

I started building this little virtual gift box / greeting card service a few months ago, based on a birthday card I made for my wife. My progress on that has been quite stop-and-go due to paying freelance work taking priority. And inevitably, I got sucked into the whole spiral of optimizing page load speed (instead of, you know, making the graphics amazing and finding out if people actually want to use it. Oh well, failing builds character, no?) Anyway, here's the story of the page speed optimizations I've got in place thus far.

The page I started off with was a pretty basic setup: an HTML file with stylesheets in link tags, a half dozen PNGs and a bunch of scripts with a window.onload handler to start running the animation loop. Nothing too exciting, and not really all that fast to load. Frankly, though, the page load times were perfectly acceptable. I just got into the optimization thing. So hey. Go for it, eh?

My first step was to deploy the whole thing to S3 with a small script that sets the Expires headers the way I want them and uploads all text content gzipped, with the Content-Encoding header set accordingly. I don't actually keep any uncompressed assets on the server: everything's compressed beforehand and served with Content-Encoding: gzip, so I don't need a server doing on-the-fly compression, and I save a few kB of disk space (ha.) The Expires headers are set to one day for HTML and JS, and to a year for the unchanging images. The catch is that the headers are static on S3, so I have to go and bump the one-day Expires dates daily.

The second thing I did was set up the serving from the Amazon CloudFront CDN, to make the page load reasonably quickly from locations around the world. CloudFront is pretty easy to get set up, just make it track an S3 bucket and create a CNAME record to give the CloudFront distribution a more friendly name. I'm serving everything from CloudFront at the moment, which does make testing changes on live a bit of a hassle as I need to invalidate the index.html document to force updates, and those take a half hour or so to go through.

With the simple things out of the way, I started changing the page structure and loading order to make things faster. I had a bunch of social buttons on the page, which were causing extra scripts to load and maybe even some rendering hiccups. Let's load those only when they're needed. I've got a front page that doesn't need any of the social stuff, so I made the social scripts get injected by JS when flipping to the sharing page. Now the front page is fast to load and doesn't make spurious script loads. The only problem was that I had to go and figure out how to call the social scripts to make them initialize the button code on the sharing page.
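The injection itself is just a dynamically created script tag. A minimal sketch, where the showSharePage hook stands in for whatever triggers the page flip:

```javascript
// Load a third-party script on demand instead of at page load.
function loadScript(src, onload) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;
  if (onload) s.onload = onload;
  document.head.appendChild(s);
}

// Hypothetical hook, called when the user flips to the sharing page.
function showSharePage() {
  loadScript('https://platform.twitter.com/widgets.js', function() {
    // widgets.js scans the DOM for .twitter-share-button elements when
    // it loads, so buttons injected before this point get initialized.
  });
  loadScript('https://apis.google.com/js/plusone.js');
}
```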

I also used the Closure compiler to bundle and minify my several JS files into a single file, getting rid of several HTTP requests during page load in the process. The amount of CSS I had was small enough that I decided to just inline it into the page, saving another HTTP request. Later on I moved to using Compass/SASS to compile my CSS and then inlined the result into the page in my build script. I also ended up inlining the JS in the build script, getting rid of yet another HTTP request.
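The inlining step in a build script can be as simple as string substitution over the page template. A sketch, where the placeholder markers and file names are made up for illustration:

```javascript
// Splice the compiled CSS and the minified JS straight into the HTML
// template, so the page ships as a single file. The JS bundle itself
// comes out of the Closure compiler first, roughly:
//   java -jar compiler.jar --js a.js --js b.js --js_output_file bundle.min.js
function inline(template, css, js) {
  return template
    .replace('<!-- INLINE_CSS -->', '<style>' + css + '</style>')
    .replace('<!-- INLINE_JS -->', '<script>' + js + '</script>');
}

var page = inline(
  '<head><!-- INLINE_CSS --></head><body><!-- INLINE_JS --></body>',
  'body { margin: 0; }',
  'window.onload = function() { /* start the animation loop */ };');
```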

After inlining the scripts and the CSS into the HTML, I was down to a single HTTP request for the textual page content. I started off with around ten HTTP requests for loading the page, so this made the page a good deal faster on bad networks. And as S3 charges per request, it's also a wee bit cheaper.

I still had seven images loaded on the page, keeping the page request total at around 10. And they were a bit large filesize-wise. So first I used Compass's sprite generator to bundle all the images into a single image sprite and then used ImageOptim to squeeze the resulting PNG even smaller. This got my page HTTP requests down to 3: the HTML with the inlined JS and CSS, the image sprite, and the favicon.
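Sprites work by giving every element the same background image and only shifting the background-position crop, which is roughly what Compass generates for you (class names and offsets here are illustrative):

```css
/* All icons share one downloaded image; only the visible crop differs. */
.icon       { background: url('sprites.png') no-repeat; width: 32px; height: 32px; }
.icon-gift  { background-position: 0 0; }
.icon-heart { background-position: 0 -32px; }
```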

I also wanted to have Google Analytics on the site to figure out how people are using the site and how I could make it better. The Analytics script is kind of like the social scripts: it injects a script tag into the page, which may slow things down a bit. Well, eh, let's take control over that aspect as well. I moved the Analytics script to get injected after the window onload event fires and the first frame is ready to render. Which got my time from navigation start to first frame down by a little bit.
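Deferring it is the same script-injection trick, delayed past the first frame. A sketch along the lines of the old ga.js-style snippet; the account id is a placeholder:

```javascript
// Inject the Google Analytics loader only after onload has fired and a
// frame has been rendered, keeping it off the critical path.
function injectAnalytics() {
  var gaq = window._gaq = window._gaq || [];
  gaq.push(['_setAccount', 'UA-XXXXXX-X']);   // placeholder account id
  gaq.push(['_trackPageview']);
  var ga = document.createElement('script');
  ga.src = 'https://ssl.google-analytics.com/ga.js';
  ga.async = true;
  document.head.appendChild(ga);
}

function deferAnalytics() {
  // One frame after onload; fall back to a timeout without rAF support.
  if (window.requestAnimationFrame) window.requestAnimationFrame(injectAnalytics);
  else setTimeout(injectAnalytics, 0);
}

// Guard lets this file also load outside a browser (e.g. in a test).
if (typeof window !== 'undefined' && window.addEventListener) {
  window.addEventListener('load', deferAnalytics);
}
```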

On my machine with cold cache, the page now renders the first frame in around 120 ms from the start of navigation, depending on CPU speed, network latency, DNS and the phase of the moon. On hot cache the first frame is at around 60-70 ms. So it's pretty fast to load. Still, I'd like to knock down the current 60 kB gzipped page weight to something smaller. Perhaps replace the images with procedural stuff. On second thought, screw that. Make better art instead.

Thank you for reading all the way here, have a cookie. Also, all feedback welcome. Cheers.

