Shrinking JS or CSS is premature optimization

Rick Strahl has a post on a JavaScript minifier utility whose sole job is to shrink the size of your JavaScript, whilst making it almost impossible to read, in order to save a few kilobytes. I thought I’d take a quick look at what the gain would be and fed it the latest version (1.6) of the very popular Prototype library:

|                 | File (KB) | GZip (KB) |
|-----------------|-----------|-----------|
| Standard        | 121.0     | 26.7      |
| Shrunk/minified | 90.5      | 22.0      |
| Saving          | 30.5      | 4.7       |

The 30.5 KB saving looks great at first glance, but bear in mind that external JavaScript files are cached on the client between page requests, and it loses some appeal. Consider also that most browsers and clients support GZip compression, where the saving drops to around 4.7 KB, and you might wonder if you are wasting your time. In computer science there is a term for blindly attempting to optimize systems without adequate measurement or justification, and that term is premature optimization. As Sir Tony Hoare wrote (and Donald Knuth paraphrased):

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

And he was working on computers throughout the ’60s and ’70s that had far fewer resources than those of today. By all means, if your server bandwidth is an issue, delve into the stats, identify the cause and take it from there. Yahoo’s YSlow plug-in for Firefox/Firebug is a great starting point, but remember to analyze the statistics from your own context.
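
If you want to measure this in your own context, a minimal sketch along these lines will report raw and gzipped sizes (assuming Node.js with its built-in fs and zlib modules; the file paths are placeholders):

```javascript
// Minimal size-comparison sketch (assumes Node.js; paths are placeholders).
const fs = require("fs");
const zlib = require("zlib");

function report(label, path) {
  const raw = fs.readFileSync(path);
  const gzipped = zlib.gzipSync(raw, { level: 9 });
  console.log(
    `${label}: ${(raw.length / 1024).toFixed(1)} KB raw, ` +
      `${(gzipped.length / 1024).toFixed(1)} KB gzipped`
  );
}

report("Standard", "prototype.js");
report("Minified", "prototype.min.js");
```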

Rick’s tool had shortcomings with non-ASCII characters such as accented letters and non-US currency symbols, which goes to show how optimization can have other unintended and undesirable effects.
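
As a purely hypothetical illustration of how that kind of corruption happens (this is not Rick’s actual code), a tool that reads UTF-8 bytes using a single-byte encoding will mangle exactly those characters:

```javascript
// Hypothetical illustration: decoding UTF-8 bytes with a single-byte
// encoding corrupts accented letters and currency symbols.
const original = "var price = '€9.99'; // café";
const bytes = Buffer.from(original, "utf8"); // multi-byte sequences
const mangled = bytes.toString("latin1");    // wrong charset on the way in
console.log(mangled); // 'café' comes out as 'cafÃ©'; '€' is similarly garbled
```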

[)amien

8 responses to Shrinking JS or CSS is premature optimization

  1. Avatar for gibbitz

    So it’s a million years later and JavaScript on the command line has made build systems a de facto standard on every front-end project ever. The mantra on HTTP requests, for instance, has driven us to base64-encode our images into CSS that is inlined in our JS bundles so that we have one request for all assets. We use JS libraries that are more than 1 MB gzipped, causing these bundles to be upwards of 4 MB. Configuration and maintenance of these build systems takes weeks in the overall SDLC. Not sure we’re better off than in 2007…
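
    (For illustration, assuming Node.js and a placeholder file name, the inlining pattern described above looks something like this:)

    ```javascript
    // Sketch of base64-inlining an image into CSS as a data URI
    // (Node.js; the file name is a placeholder).
    const fs = require("fs");
    const b64 = fs.readFileSync("icon.png").toString("base64");
    const css = `.icon { background: url("data:image/png;base64,${b64}"); }`;
    console.log(css.length, "bytes of CSS for one small image");
    ```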

  2. Avatar for Andrew Donaldson

    Hey Damieng,

    Someone had a similar discussion a while ago which ended up in the creation of a Rails plugin called ‘asset packager’ (http://synthesis.sbecker.net/pages/asset_packager). I think this is part of Rails now (or will be in future) but the underlying concept is probably one that can be ported.

    Basically, whilst in development mode you access your files as normal. When your site is set to production mode, it looks for the compressed script files, which are built (and compacted into one file, as Corey mentioned) using a build script when you update your site (see the sketch below).

    The compressed files don’t go anywhere near your version control, and as long as you keep your JavaScript tidy (semi-colons, damnit!) it melds into your workflow.

    Couple this (as you’ve said) with server-side compression and you’ve got a nice ‘free’ performance boost.
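
    (A rough sketch of that development/production switch; the names here are assumptions for illustration, not the actual asset_packager API:)

    ```javascript
    // Hedged sketch of a dev/production asset switch (names are assumed).
    const production = process.env.NODE_ENV === "production";
    const scripts = production
      ? ["/assets/all.min.js"]              // single bundle, built at deploy time
      : ["/js/prototype.js", "/js/app.js"]; // readable originals in development
    const tags = scripts.map((s) => `<script src="${s}"></script>`).join("\n");
    console.log(tags);
    ```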

  3. Avatar for Corey

    Another advantage of a tool like Rick’s is that it can consolidate many JavaScript files into one. This has the effect of reducing HTTP requests on top of the smaller file size. Reducing HTTP requests and minifying JavaScript and CSS files are both recommended by YSlow.
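
    (The consolidation step can be as simple as this Node.js sketch; the file names are placeholders:)

    ```javascript
    // Minimal concatenation sketch (Node.js; file names are placeholders).
    const fs = require("fs");
    const bundle = ["prototype.js", "effects.js", "app.js"]
      .map((f) => fs.readFileSync(f, "utf8"))
      .join(";\n"); // guard against files missing a trailing semicolon
    fs.writeFileSync("all.js", bundle);
    ```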

  4. Avatar for Damien Guard

    Yeah, I’d agree that having shrinking as part of the build process for release/production architectures has some merit and avoids the major issues. That leaves only the question of trying to debug production-only problems, where the error could be the result of changes between debug and release, but that’s pretty rare.

    Like most languages, JavaScript should be particularly well suited to the Huffman coding GZip uses, and combining multiple JS files into one could see further gains (see the sketch below).

    Thanks for the compliments on the posts, trying to keep them more regularly paced than before but I’ve now run out of hashing algorithm filler ;-)
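
    (A rough way to check that combining helps, assuming Node.js and placeholder file names:)

    ```javascript
    // Rough check: one combined file usually gzips smaller than the sum
    // of its separately gzipped parts (Node.js zlib; file names assumed).
    const fs = require("fs");
    const zlib = require("zlib");
    const files = ["prototype.js", "effects.js"].map((f) => fs.readFileSync(f));
    const separate = files.reduce((sum, buf) => sum + zlib.gzipSync(buf).length, 0);
    const combined = zlib.gzipSync(Buffer.concat(files)).length;
    console.log({ separate, combined }); // combined is typically smaller
    ```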

  5. Avatar for Reggie Drake

    Definitely don’t minimize scripts on sites you are still working on, but in production there isn’t that much added value to having readable scripts. Files with a lot of comments can often be shrunk to half their size or smaller.

  6. Avatar for Dave Transom

    Hey Damien,

    The difference between the ‘gzip’ and ‘gzip and minified’ versions of the library certainly seems minimal, so you could take it or leave it.

    I think Gavri makes a good point. It’s a fairly common and well-known optimisation to minify and compress JavaScript and CSS files; every little bit helps, especially since we seem to be writing bigger client-side files. It’s not exactly premature, but there is credence in your words about “measuring the problem and result first”.

    We’ve been using YUI Compressor as an MSBuild task with Web Deployment Projects, so the source is minified as part of the release deployment process, and it rocks. It means we can develop and debug with full source, pass QA, then deploy the production-ready code. It’s simple, once established it takes no additional time, doesn’t interfere with the development process and gives the best performance on the client side. Actually, if you use a fairly decent minifier, you can pick up a few warnings you may have missed during development.

    I’m a firm believer that all changes (projects, for that matter) should be source controlled. Here, minifying the source has an added benefit of deterring minified files from being modified; at least, I’d hope people would think twice, looking at long lines of code with the only meaningful member names being on the public side. More than likely their non-source-controlled items will be overwritten at the next deployment.

    Keep up the good posts (I’ve been subscribed for about 10 posts now :)

    Dave

  7. Avatar for Damien Guard

    It’s premature if you are optimizing before you have a specific measurable problem and it will negatively impact the development process.

    The code output by the minifier and other shrinking tools is incredibly difficult to work with should you need to debug or change it.

    Sure, you might have another unshrunk copy somewhere, but what if somebody has modified the compressed version? How exactly would you diff that?

  8. Avatar for Gavri Fernandez

    I don’t see how compressing responses is premature. It’s not early in the development process. In fact, it can be the very last step.

    Premature optimization is evil because you optimize with vague notions at an early stage, a stage that hinders development.