It’s very important to speed up your web site so you don’t lose users.
That being said, many tasks in the checklists from Yahoo or Google are quite complex and hard to grasp for people who haven’t spent their careers configuring Apache or digging deep into the depths of HTTP.
To help with this, I created a drop-in .htaccess file that enables gzipping and long-term expiration (with mod_rewrite rules to support unique URLs for your assets).
Gzipping and far-future expiration (1 year is long enough) are enabled right away.
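For reference, a minimal sketch of what such a file might contain (the MIME types and module guards here are illustrative; the actual drop-in file is more complete):

```apache
# Gzip text-based responses (mod_deflate is Apache 2.x's compression module)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>

# Far-future expiration: tell browsers to cache static assets for 1 year
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/gif  "access plus 1 year"
    ExpiresByType text/css   "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```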
In order to avoid problems with cached items (far-future expirations are good, not bad), you’ll have to change the URLs of your static assets whenever you change the files themselves. All you need to do is add .XXX before the file extension and the URL will be unique, while still pointing at your file thanks to the mod_rewrite rules in the .htaccess.
So, for example, if you have logo.png, the next time you change it just link to it as logo.1.png, the time after that as logo.2.png, and so on. This way the stale cached copy will not be used and you will no longer have a caching problem:
<img src="/logo.2.png" alt="Company Logo"/>
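The mod_rewrite part of the trick can be sketched roughly like this (the exact regular expression in the real file may differ); it strips the version segment so logo.2.png is served from logo.png on disk:

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Internally map /logo.2.png (any /name.N.ext) back to /logo.png
    RewriteRule ^(.*)\.[0-9]+\.(png|gif|jpg|css|js)$ $1.$2 [L]
</IfModule>
```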
Go ahead and put it in the root of your site and let me know how it works for you.
P.S. This assumes that you use Apache 2.x and that your hosting company has enabled the mod_rewrite, mod_expires and mod_deflate (Apache 2.x’s successor to mod_gzip) modules. If they haven’t, just ask them; they should be able to do that easily.
13 thoughts on “Speed up your site – drop in .htaccess file”
Another small one: if you leave the width and height off an img tag, the browser has to GET the image and read its first few bytes just to sniff the dimensions, which it needs before it can lay out (render) the page. By adding the height and width to the img tag you’ll considerably improve perceived speed and let the browser render the page much faster :)
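For example (the dimensions here are made up for illustration):

```html
<!-- With width and height present, the browser can reserve layout
     space before the image has even been downloaded -->
<img src="/logo.2.png" alt="Company Logo" width="200" height="50"/>
```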
it’s ok to gzip .ico, they are compressible
but I think there could be more files on the exception list, like pdf, woff, zip, doc… I wonder if it’s not better to treat gzip as a whitelist, as opposed to treating non-gzip as a blacklist
in any event, this is great! I’ve always wanted to have one drop-in .htaccess handy :)
I used to go by the ‘skip compression on images’ mantra as well. But in the last month I’ve looked at two different pages (on completely unrelated sites) that could have benefited from compressing at least some of the larger images. In the first case it would have dropped the overall byte count of the page by 10k; in the second, by more than 50k. I was surprised that these turned out to be such significant differences.
I’m not suggesting that turning compression on for every image is a good idea, just that leaving compression off for all images may also be a missed opportunity.
Thanks for the reply, Nathan, but I don’t think you’re correct in saying that this solution makes caching worse.
The difference between the Expires header (also a well-known, old HTTP feature) and the ETag/Last-Modified headers is that with ETag/Last-Modified the browser has to send a request to the server to find out whether the content has been updated, while the Expires header tells the browser when to come back for the asset.
ETag and Last-Modified headers are not affected by the Expires header at all; they come into play once the expiration time runs out and the browser goes back to the server to check whether the content has been updated.
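To illustrate (the header values below are invented), a response can carry all three headers side by side:

```http
HTTP/1.1 200 OK
Content-Type: image/png
Expires: Thu, 21 Apr 2012 20:00:00 GMT
Last-Modified: Wed, 20 Apr 2011 18:00:00 GMT
ETag: "3f80f-1b6"
```

Until the Expires date passes, the browser makes no request at all; after that, it sends a conditional request with If-Modified-Since / If-None-Match and the server can answer with a bodyless 304 Not Modified.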
I wrote about it recently when introducing SVN Assets (a server-side PHP solution for URL versioning) – you can see the waterfall diagrams for both cases: http://www.sergeychernyshev.com/blog/caching-problem-no-more-svn-assets/
There is also documentation on Yahoo’s site that describes the problem very well, if you’re interested:
As for the sniffing of image sizes, I absolutely agree with you: there are other techniques (like specifying image dimensions) that speed up a site. The problem I’m trying to address with this solution is that people rarely understand the complexity of HTTP, server-side configuration and all that, so they can just copy the file and not worry about the details.
@Stoyan yeah, I agree about the .icos and whitelisting as well. Do you know a good source for such a list? I simply copied mine from Apache’s configs, but I’d love to build something more universal and maintain it with new features and all.
BTW, a sample nginx configuration file is also in there, in case you’re into that ;)
@Joseph I think image compression is a whole other story and should be done not with gzip but with tools like Stoyan’s Smushit.com, probably as part of the build/upload process. I use a simple Makefile for that, but it might be too much for HTML devs and designers.
about the whitelist, it would probably be better to go by content type, not file extension
for example, if you gzip all text/plain, that’s pretty good: any weird extensions will be compressed without extra effort. Many servers send font files as text/plain, for example
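A whitelist by MIME type might look roughly like this (a sketch, assuming mod_deflate; the list of types is just a starting point):

```apache
<IfModule mod_deflate.c>
    # Compress only types known to be compressible;
    # everything else (images, archives, etc.) passes through untouched
    AddOutputFilterByType DEFLATE text/plain text/html text/css text/xml
    AddOutputFilterByType DEFLATE application/javascript application/json
    AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```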
you know, it would be a good idea to set up a git repo and treat this as a project, even though it’s only one file. That way people will know where to get the latest version, contribute, file bugs and all the good stuff
@Joseph, I’d be interested to see those image files. I’m pretty sure the compression benefits come from compressing textual metadata left there by various tools (e.g. Photoshop). All that metadata probably shouldn’t be there to begin with; it would’ve been stripped by build tools such as jpegtran, optipng, etc.
@Stoyan – this one file is already in SVN (it’s actually part of SVN Assets). I know that Git is a social network for code and SVN is more like a team blog ;) but SVN is much friendlier to the target audience; Git might be much harder for them to use.
Do you know if GitHub allows Git-to-SVN synchronization to Google Code? I’d love to have developers play with it on GitHub, but users download it using SVN.
On the other hand, it’s just one file and raw output would be good enough…
Safari has issues with gzipped CSS; also, gzipping can cause issues with anti-virus software regarding the content as suspect
github supports svn as well as git
@Majic yeah, I think there are many cases where gzip causes trouble. Do you know of a good place with a comprehensive list of gzip exceptions?
I’ve never heard of anti-virus software flagging gzipped content, only that it sometimes disables the Accept-Encoding request header. Do you know of specific cases I can look at?
And thanks for the news about GitHub: I just found an announcement from April 1st that they now support read-only SVN checkouts! That’s great news for many projects!
@Sergey Nicole S (stubbornella) mentioned the troubles with AVG. No specific info to go on really, but echoing the wisdom of others :)
As Stoyan says, a drop-in .htaccess is a nice idea and will benefit many. Getting it on GitHub will mean many others can fork and contribute (and include it in their projects), and through the wisdom of crowds this could become comprehensive. To me that sounds very alluring, and it will appeal to many.
If an anti-virus program disables those headers, is the respective file still valid over HTTP, or is it an error? E.g., would the gzipped stylesheet simply not be applied? This is what I have feared and why I have been cautious, yet I have not actually seen it happen.
Oh, I have a vague sense that there is an article on Smashing Magazine with some information that should assist here (I say vague; it might not be Smashing Mag).
I share the concerns about forced gzip.
I have some different techniques in the HTML5 Boilerplate’s .htaccess:
Paul, I agree, it’s probably a good idea to create a better set of rules. Mine were pretty much the simplest I could come up with in a short period of time.
I’ll move this file to be standalone on github (http://github.com/sergeychernyshev/.htaccess) and will borrow some of your rules if you don’t mind.