Automatically Compress/Sync Your S3 Hosted Static Website

For those of you who use Amazon Web Services (AWS) Simple Storage Service (S3) to host a static website, you might find yourself wondering how to serve gzip-compressed files for better performance. The instructions available online work well, but manually compressing, renaming, and editing the metadata of every hosted file is super tedious; let's see what we can do about that.

To get set up, we'll need s3cmd installed and configured. For OS X users, it's available via Homebrew: `brew update && brew install s3cmd && s3cmd --configure`.

Once that's set up, we're pretty much ready to write the script.

I saved this as a script called `gzsync` and put it in my scripts directory (which is part of my `$PATH`), then made it executable: `chmod a+x ~/.bin/gzsync`.
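As a rough sketch of what `gzsync` might look like — the list of text file types, the `--acl-public` and `--add-header` flag choices, and the two-pass sync layout are all assumptions here; adjust them for your own site:

```shell
#!/bin/sh
# gzsync -- gzip-compress a static site's text assets and sync them to S3.
# Usage: gzsync <source-dir> <s3://bucket[/prefix]>
set -eu

# Gzip every .html/.css/.js file under $1 in place, keeping the original
# names (gzip appends .gz, so we rename each file back afterwards).
compress_tree() {
    find "$1" -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \) |
    while read -r f; do
        gzip -9 "$f"        # writes "$f.gz" and removes "$f"
        mv "$f.gz" "$f"     # restore the original filename/URL
    done
}

# Sync a compressed working copy of $1 to the S3 URI in $2.
gzsync() {
    src=$1 dest=$2
    tmp=$(mktemp -d)
    cp -R "$src"/. "$tmp"   # work on a copy; leave the sources untouched
    compress_tree "$tmp"

    # Compressed text assets need the Content-Encoding: gzip header...
    s3cmd sync --acl-public --guess-mime-type \
        --add-header='Content-Encoding: gzip' \
        --exclude '*' --include '*.html' --include '*.css' --include '*.js' \
        "$tmp"/ "$dest"

    # ...while everything else (images, fonts, ...) uploads untouched.
    s3cmd sync --acl-public --guess-mime-type \
        --exclude '*.html' --exclude '*.css' --exclude '*.js' \
        "$tmp"/ "$dest"

    rm -rf "$tmp"
}

# Run only when invoked with arguments, e.g.: gzsync ./public s3://my-bucket
if [ "$#" -eq 2 ]; then
    gzsync "$1" "$2"
fi
```

Compressing a copy rather than the originals means the working tree stays clean, and the two sync passes keep the `Content-Encoding` header off binary files like images, which should not carry it.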

Now, say we're in the static website directory that we want to upload to S3.

This would also make a clever git post-receive hook to deploy the site automatically, but we'll save that for next time. For CloudFront users, this would be a good time to invalidate your distribution. We can actually script that too, and if you would like to contribute it, that would be great!
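As a starting point for the CloudFront case, recent versions of s3cmd accept a `--cf-invalidate` flag on `sync` that invalidates the uploaded paths in the distribution attached to the bucket (a sketch only; the bucket name is a placeholder, and this assumes a distribution is already pointed at it):

```shell
# One extra flag on the sync call asks CloudFront to invalidate the
# paths this upload touched (bucket name is a placeholder).
s3cmd sync --acl-public --cf-invalidate ./ s3://my-bucket
```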

If you would like to make any modifications to the script and contribute them back, I have put it up as a GitHub gist.
