|March 6th, 2014|
|nginx, pagespeed, tech|
When I was in California last week I gave a short talk at the NGINX User Summit on ngx_pagespeed. People often just post their slides after talks like this, but mine are terse enough that they need some explanation:
A major reason to use NGINX is because it's fast. NGINX really is very fast at what it does: moving bytes to the client. But that's not the only thing, or even the main thing, that makes for a fast experience from the perspective of a visitor to your site. Instead it's a matter of what you put on your site, how many bytes that is, how many round trips the client needs in order to download it, and how much of that work the browser can do in parallel.
The most common approach here is manual optimization. To compress your images so they require fewer bytes to download you "save them for the web".
To speed up page loads when visitors load other pages on your site or come back later you can set up "long caching".
This means setting Cache-Control headers that allow the browser to keep a copy of the thing it just downloaded in its cache for a long time.
When you're using this approach, however, you need to make sure that when you do change something, visitors will see the new version. This means changing the URL whenever you change the content.
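As a sketch of what long caching looks like in nginx (the file types and cache lifetime here are just examples), you serve versioned assets with far-future headers:

```nginx
# Versioned static assets: because the filename changes whenever
# the content changes (e.g. main.1a2b3c.css), it's safe to tell
# browsers to cache them for a long time.
location ~* \.(css|js|png|jpg|gif)$ {
    expires 1y;
    add_header Cache-Control "public";
}
```

The hassle is that every reference to `main.css` in your pages now has to be kept in sync with the current versioned filename.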
Another way to reduce round trips is to combine all your images into one larger image, with spriting. Again, this is a big hassle to do on your own.
PageSpeed can help with this.
The idea is you write your page naturally, not worrying about caching headers, minifying your css, or optimizing your images, and then use PageSpeed to apply these optimizations on the fly.
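For example, a minimal ngx_pagespeed setup is just a few directives in your server config (the cache path and filter list here are illustrative, not a recommendation):

```nginx
pagespeed on;

# On-disk cache for optimized resources.
pagespeed FileCachePath /var/ngx_pagespeed_cache;

# A few example filters; many more are available.
pagespeed EnableFilters rewrite_css,rewrite_images,extend_cache;
```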
There are also really powerful things you can do once your front-end optimization is consolidated into one optimized tool. A big one is that you can now run experiments to figure out exactly the optimizations that are right for your site. While some changes (minifying CSS) are always good, other changes (inlining, spriting) are generally good but depend on the site. Running an A/B test to figure out if a given optimization improves your loading times can answer this for your site, and PageSpeed makes this relatively easy.
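PageSpeed's experiment framework handles the bucketing and reporting for you. A sketch of what this might look like (the analytics ID, percentages, and filter choice are placeholders):

```nginx
pagespeed RunExperiment on;
pagespeed AnalyticsID "UA-XXXXXX-Y";

# 10% of visitors get your current configuration, 10% get it
# plus css inlining; compare their load times in Analytics.
pagespeed ExperimentSpec "id=1;percent=10;default";
pagespeed ExperimentSpec "id=2;percent=10;enabled=inline_css";
```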
Another advantage to a dynamic optimization tool is that some optimizations can only be performed dynamically. Say you currently use one big stylesheet, main.css, for your whole site, but most pages only use a few rules from it. PageSpeed can inline just those rules, avoiding a round trip that delays rendering.
To do this, PageSpeed injects instrumentation code into the page. That client-side code determines which selectors applied, and beacons that information back to the server to use in optimizing future page loads. Without actually loading up the page in a real browser you don't generally know which rules are needed and which aren't, but in a dynamic setup we can get the client to help out.
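In ngx_pagespeed this is the `prioritize_critical_css` filter, enabled like any other:

```nginx
# Inline only the CSS rules each page actually uses, as reported
# by instrumentation beacons from real visitors' browsers.
pagespeed EnableFilters prioritize_critical_css;
```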
The takeaway: use ngx_pagespeed and stop optimizing by hand.