Pimp my Website: Downsizing your Web Pages

In the current economic downturn, everyone’s keeping an extra close eye on the bottom line and looking for ways to save money. Let’s compare the bytes of data on your network to water droplets and your website to a leaky faucet. You know you should take an afternoon and fix the damn thing, but something more important always comes up. But this doesn’t stop the monthly bill from coming, does it? Holy cow, those drops add up! Today, I’d like to share a few tips on how you can get this faucet fixed up to save on monthly bandwidth fees.

HTML Compression

There are a lot of different ways to reduce the size of your served content. Perhaps the easiest (and most important) is using a server-side compression module like mod_gzip (for Apache 1.x servers) or mod_deflate (for Apache 2.x). All modern browsers can identify and reconstitute compressed content, and both of these modules let you set the compression level (from fastest to best compression). An average compression rate is about 70%, so if you're currently paying monthly overage fees to your hosting provider, check this out immediately.
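As a sketch of what this looks like on Apache 2.x (the directives below are standard mod_deflate directives; the content-type list is just an example — tune it for your site):

```apache
# Sketch of a mod_deflate setup for Apache 2.x
<IfModule mod_deflate.c>
    # Compress the text-heavy content types; images are already compressed
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
    # Trade CPU for bandwidth: 1 = fastest, 9 = smallest output; 6 is a common middle ground
    DeflateCompressionLevel 6
</IfModule>
```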

Image Compression

Images used to be trickier. You'd have to play around with some locally installed picture editor, trying to figure out which file type and which compression left the image looking pretty much the same but substantially reduced its size. Now, you can just head over to Smush.it and upload an image from your desktop or enter a URL of an online image for "smushing". In my initial Website Performance Optimization post, I hesitantly added 5 fairly big images. Even while trying to keep their sizes as small as possible yet still legible, one pic, 'pingdom.png', still weighed in at a hefty 52KB. After smushing it:

Pretty cool! But then I dug around some more and found the Online Image Optimizer site from Dynamic Drive. Just for fun, I fed the above image into it to see what it could do:

Wow! I can go from 30KB down to 6KB for a similar image (and if you right-click on the image above, you'll see that I did just that). It's great to be able to see all the results of the various algorithms in-line (and it gives you a sneak peek into how these tools work as well).

CSS & JS Compression

Depending on how desperate you are, you can even go as far as removing whitespace from large CSS and JavaScript files. There are lots of tools to help you with this, but remember to do it at deploy time (so as not to make these files unmaintainable).
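A toy version of such a whitespace stripper, as a deliberately naive Python sketch (real minifiers like the YUI Compressor handle many more edge cases — use one of those in production):

```python
import re

def strip_css_whitespace(css: str) -> str:
    """Naive CSS minifier sketch: drops comments and collapses whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # no spaces around punctuation
    return css.strip()

original = """
/* main styles */
h1 {
    color : #333 ;
    margin : 0 ;
}
"""
print(strip_css_whitespace(original))  # h1{color:#333;margin:0;}
```

Run this once as part of your deployment script, keeping the readable originals in source control.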

Even more hardcore: renaming (essentially obfuscating) your CSS class names to save on bytes transferred over the wire. Why specify a long, descriptive class name on every element when a one- or two-character name will do? Initially, the 30-byte savings may not seem like a lot. But if you're using this class to render every row in a 100-record dataset, you've just shaved 3K off the page size. Again, only do this obfuscation step as part of a pre-production deployment to your web servers.
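The arithmetic can be checked with a quick sketch (the class names below are hypothetical, chosen only for illustration):

```python
# Back-of-the-envelope check of the class-name savings.
# Hypothetical markup, not from any real page.
verbose = '<td class="productDescriptionHighlightedCell">'
obfuscated = '<td class="p">'

per_row = len(verbose) - len(obfuscated)  # ~30 bytes saved per row
total = per_row * 100                     # across a 100-row table

print(per_row, total)  # 32 3200
```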

To recap:

  • HTML : mod_gzip or mod_deflate
  • images : Smush.it and Online Image Optimizer from Dynamic Drive
  • CSS & JS : whitespace removal and class-name obfuscation

Content Delivery Network (CDN)

Why transfer the bytes yourself at all if you don't have to? Google offers free hosting for the latest versions of many well-known libraries like jQuery, Prototype and script.aculo.us. Take advantage of this! A lot of these requests will even be automatically routed to the closest regional cache (for improved download speeds).
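For example, pulling jQuery from Google's cache instead of your own server is a one-line change (the version number below is just an illustration — use whatever your site needs):

```html
<!-- Served from Google's regional caches instead of your own bandwidth -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```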

Depending on how large an audience you have, it might be time to step up to the big leagues. Akamai, BitGravity, Limelight Networks, or Panther Express all cost money but can improve end-user response times by 20% or more by serving up your static content from their huge, regional caches.


Cookies

The bane of privacy advocates, cookies are the de-facto standard for interacting with most popular websites nowadays. While standards are a good thing, abuses still run rampant, especially in the sheer size of some of these cookies. Going through my cookies, I stumbled across this 1KB monster from komtrack.com:

An entire XML document as a cookie – nice! No clue how often this had to be transferred back and forth, but it's definitely a great example of bad cookie practice.

Another idea – if you use multiple domains for serving content, you can ensure that cookies are tied to only one of these domains. This will free your static requests (images, JavaScript files, etc.) from cookie overhead. Keep your cookies lean and mean and your users will feel the difference.
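As an illustration (the hostnames are hypothetical), scoping the session cookie to the application host keeps requests to a separate static host cookie-free:

```
# Set the cookie only for the application host, not the whole domain
Set-Cookie: session=abc123; Domain=www.example.com; Path=/

# Requests to static.example.com now carry no Cookie header at all,
# so every image and script fetch saves those bytes in both directions
```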

You now have a few extra tricks up your sleeve to reduce your website's overall bandwidth utilization and trim your monthly hosting bill. Speaking of which, when's the last time you actually reviewed that bill? Take a look at your monthly bandwidth limit, the extra fee per GB of overage and your ratio of uploads to downloads. Does your hosting provider offer a contract that better fits your site's usage? Remember, everything is negotiable! You could save yourself a load of cash without tweaking a single configuration file.
