How to Boost Your JavaScript and CSS Performance

Websites are carrying more markup than ever, and JavaScript and CSS files are getting bigger by the day. But you want to keep your site lean and speedy, not slow and bloated. There are some basic techniques I use to optimize JavaScript files, and the same ideas can be applied to CSS and HTML files as well. I’ll briefly cover four popular solutions: validating, joining, shrinking, and compressing your source code.

1) Validate your code

Although JavaScript can be very forgiving when it comes to error handling, sloppy coding will come back to haunt you in the long run. So the first step is to validate your code. I use JSLint, which scans JavaScript code for syntax errors and common mistakes and recommends fixes. JSLint piggybacks on Rhino, Mozilla’s implementation of JavaScript written in Java. You can either use the online version or download and run the scanner locally. I prefer the local method, which uses the Java runtime.

To use JSLint locally:

  1. Download and unzip Rhino to a folder on your computer. You need js.jar
  2. Download and save the JSLint library file jslint.js
  3. Run JSLint in the Rhino shell, passing it a JavaScript file to scan. Here is the command I use, assuming js.jar, jslint.js, and MYBIG.js are all in the same directory:
    java -classpath js.jar org.mozilla.javascript.tools.shell.Main jslint.js MYBIG.js

You will then see a list of errors found by JSLint and suggestions for fixing your code. Some of the most common items are:

  • Missing end-of-line semicolon delimiters ;
  • Not using dot notation to reference DOM objects (e.g. form['name'].input.value instead of form.name.input.value)
  • Using == instead of === when comparing against null, undefined, booleans, zero, and ""
  • Variable scope issues caused by declaring variables inside loops instead of at the top of the function
  • Not using curly braces in conditional blocks. Make a point of always writing if (x) { doX(); } else { doY(); }, even for one-liners
  • Using eval() or with(). Stay away from both; eval is evil 🙂
  • Create your arrays using literal notation [] instead of new Array()
  • Create your objects using {} instead of new Object()
  • Avoid using javascript:function() in URLs
  • You can read more suggestions in the JSLint docs.
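To make those rules concrete, here is a small before-and-after sketch in the spirit of JSLint’s advice. The function and its name are invented for illustration; they are not taken from JSLint itself.

```javascript
// Before (the kind of code JSLint flags), shown as a comment:
//   function normalize(list) {
//     results = new Array()                        // implied global, new Array(), no semicolon
//     for (var i = 0; i < list.length; i++)
//       if (list[i] != null) results.push(list[i]) // loose comparison, no braces
//     return results
//   }

// After applying the suggestions above:
function normalize(list) {
    var results = [];   // literal notation instead of new Array()
    var i;              // declare loop variables at the top of the function
    for (i = 0; i < list.length; i += 1) {
        if (list[i] !== null && list[i] !== undefined) {  // strict comparisons
            results.push(list[i]);                        // explicit semicolons
        }                                                 // braces even for one-liners
    }
    return results;
}
```

The cleaned-up version should pass through JSLint without complaints about implied globals, loose comparisons, or missing braces.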

2) Join your files

If you have five separate CSS files and three different JavaScript files, your server has to answer eight HTTP requests per visitor. Each request adds processing overhead and slows down page loading. By concatenating them into one CSS file and one JavaScript file, you should see page load times improve, especially when client requests are coming from distant networks.
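No special tooling is needed for this: a one-line concatenation in your build script does the job. The file names below are made-up placeholders, and the toy files stand in for your real libraries.

```shell
# Create three stand-in JavaScript files (your real libraries go here).
echo 'var a = 1;' > lib_one.js
echo 'var b = 2;' > lib_two.js
echo 'var c = 3;' > lib_three.js

# Join them in dependency order into a single file: one HTTP request
# instead of three. The same command works for CSS files.
cat lib_one.js lib_two.js lib_three.js > combined.js

wc -l combined.js
```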

Another tip that I have seen is to move your JavaScript include tags to the very bottom of the page, right before the </body> tag. This lets the browser render the page before fetching your library files, so the page appears to load faster.
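As a sketch, the resulting page skeleton looks like this (file names are illustrative):

```html
<html>
<head>
  <title>Example page</title>
  <!-- stylesheets stay in the head so the page is styled as it renders -->
  <link rel="stylesheet" type="text/css" href="combined.css">
</head>
<body>
  <p>Content here renders before any script is fetched.</p>
  <!-- scripts go last, right before the closing body tag -->
  <script type="text/javascript" src="combined.js"></script>
</body>
</html>
```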

So after your various source files are nice and lint-free and combined into a single library file, you are ready to start shrinking.

3) Shrink your library

Before shipping your code, shrink it. JavaScript and CSS files are not compiled, and they contain lots of whitespace that is nice to have while you’re coding but serves no benefit to the users of your website. There are four different tools you can use to filter the unnecessary bits out of your JavaScript and CSS files, and for the most part they all do the same thing. I personally use ShrinkSafe, which is a happy medium: it does not mangle your code, so its API stays intact. If you choose an encoding-based method to obfuscate your code, it can have harmful side effects. First, the encoded file can in some cases actually be larger than the original! Second, your scripted API can get mangled, forcing you to update all external pointers to your code. If you have any clients who use your API via JavaScript, you should obviously take great caution with encoding methods.

Caution: As always, back up your code before running any of these tools on it! There is no undo!

JavaScript Shrinking

  1. JSMin removes extra white-space and comments. It also reduces the line count by allowing longer lines. The result is a bit uglier and harder to read and debug, but also a bit smaller. Unlike ShrinkSafe, though, JSMin does not touch your function arguments. You can download it in a variety of flavors. Here is a DOS executable version and the command to run it:
    jsmin <big.js >min_jsmin.js "(c)2007 CORP"
  2. ShrinkSafe removes extra white-space and comments but keeps one statement per line, which is easy on the eyes if you later decide to edit the shrunken file. It also shrinks function arguments by replacing them with consecutive _## names, which tends to reduce the file size a bit more. Download the jar file here and run it as follows:
    java -jar custom_rhino.jar -c big.js > min_shrinksafe.js 2>&1
  3. Packer has a few options for minifying your scripts. The normal option removes extra white-space and puts the entire file on one super-long line. You can also choose to shrink function arguments and/or encode your script with a base62 algorithm. You can try it online. Although the resulting file is considerably smaller, those two extra options might introduce errors, depending on the complexity and cleanliness of your code (see JSLint above). In my case they caused no problems.
  4. MemTronic – Hardcore shrinkers can try this online tool, which produced the smallest file size in my tests. Be warned that it is only for brave coders: it has some known issues and may have side effects.

I ran Prototype.js through these four tools, and the results are shown below. They show that you can save at least 20% in file size without compromising stability.

Tool Used   File Name                  File Size (bytes)  Lines  Reduced
None        prototype.js                          71,260  2,514  Original
JSMin       min_jsmin.js                          54,035    183  -24%
ShrinkSafe  min_shrinksafe.js                     50,725  1,972  -29%
Packer      min_packer_noptions.js                53,455      1  -25%
Packer      min_packer_shrunkvars.js              43,903      1  -38%
Packer      min_packer_encoded.js                 29,273      1  -59%
Packer      min_packer_both.js                    27,634      1  -61%
MemTronic   min_memtronic_compress.js             24,483      1  -66%

CSS Shrinking

  1. Clean CSS – This tool produces the best results, reducing file size while keeping the CSS nicely formatted and readable. It also has a plethora of options to choose from. For example, you can sort the properties of every selector, which is a godsend for neat freaks like me. It will also compress your CSS by converting properties to shorthand notation where possible, and you can choose to keep your comments.
  2. CSS Optimiser – This tool has only one option: whether to remove line breaks. It does a good job, but the file size does not shrink much unless you remove the line breaks, which in turn makes the CSS tough to read.
  3. Icey CSS Compressor – Another powerful tool with lots of options, among them: convert colors to hex, combine identical rules, combine properties, combine selectors, and remove useless padding and margin. It also displays its output in a nice colorful format. However, there is no option to keep line breaks intact, so the output comes back on one line, and in my case this actually introduced an error.
  4. Flumpcakes Optimiser – This tool has a few useful options, such as combining backgrounds, fonts, lists, borders, and styles that appear in multiple places. It can also convert RGB colors to hex and absolute units to relative ones.

I ran the CSS from EarthLink ReaDeR through these four tools, and the results are shown below.

Tool Used             File Name             File Size (bytes)  Lines  Reduced
None                  reader.css                       12,875    227  Original
Clean CSS             min_cleancss.css                 11,735    940  -2%
CSS Optimiser         min_cssoptimiser.css             11,304      2  -2%
Icey CSS Compressor   min_icey.css                     10,383      1  -3%
Flumpcakes Optimiser  min_flumpcakes.css               12,885    796  0%

4) Compress your files

The HTTP specification covers the Accept-Encoding and Content-Encoding headers in detail, and most browsers today support them. Basically, when a browser sends a request, it tells the server that it would like the file back in a compressed format such as gzip or deflate. The server compresses the file and sends it back, and the browser decompresses it and renders the page. The most common format in use today is gzip. Here is how to set it up on two popular web servers.
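Before touching server configuration, you can estimate the payoff locally with the command-line gzip tool. The sample file below is synthetic, not the article’s prototype.js, but repetitive source text compresses similarly well.

```shell
# Build a repetitive sample file, roughly like real source code.
for i in $(seq 1 200); do
    echo 'function sample() { return document.getElementById("id"); }'
done > sample.js

# Compress a copy at the highest level, keeping the original around.
gzip -c -9 sample.js > sample.js.gz

# Compare the byte counts; repetitive text shrinks dramatically.
echo "original: $(wc -c < sample.js) bytes"
echo "gzipped:  $(wc -c < sample.js.gz) bytes"
```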

  1. Apache & mod_gzip – If you are running Apache with the mod_gzip module loaded, you can easily configure gzip compression by adding these lines to an .htaccess file placed in your web root, or to your httpd.conf file.
    mod_gzip_on Yes
    mod_gzip_item_include file \.php$
    mod_gzip_item_include file \.html$
    mod_gzip_item_include mime ^text/html$
    mod_gzip_item_include file \.txt$
    mod_gzip_item_include mime ^text/plain$
    mod_gzip_item_include file \.css$
    mod_gzip_item_include mime ^text/css$
    mod_gzip_item_include file \.js$
    mod_gzip_item_include mime ^application/x-javascript$
    mod_gzip_item_exclude mime ^image/

    If you don’t have access to your .htaccess file but do have PHP compiled with the zlib module, you can give your JavaScript files a .php extension and add this to the top of each file.

    <?php
    ob_start("ob_gzhandler");
    header("Content-Type: text/javascript; charset=UTF-8");
    header("Cache-Control: must-revalidate");
    $offset = 60 * 60 * 24 * 365; // one year
    $ExpStr = "Expires: " . gmdate("D, d M Y H:i:s", time() + $offset) . " GMT";
    header($ExpStr);
    ?>
  2. JBoss and Tomcat – To enable compression in a J2EE environment running Tomcat inside JBoss, edit the server.xml that sits in the tomcat.sar folder and add these attributes to the Connector node:
    <Connector port="8080" compression="on"
        compressionMinSize="300"
        noCompressionUserAgents="gozilla, traviata"
        compressableMimeType="text/html,text/xml,text/css,text/javascript,application/x-javascript,application/javascript" />

The Results

Finally, here are two pages that use the compressed versions of prototype.js and reader.css, one with gzip compression enabled and one without. Using the Net tab in Firebug, we can easily see that the gzipped versions of both the CSS and JavaScript files result in less network traffic and faster load times.

We cut our total transfer size by 44 KB, or 70%! Total load time drops by 521 ms, or 51%, basically from one second to half a second. Doing some quick math, for a website that gets 1,000,000 unique visitors per day, that is a daily savings of 44 GB of bandwidth and a total of about 145 precious hours saved.
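The arithmetic behind those projections is easy to check. The per-page savings are the measured figures from the tables in this section; the visitor count is the article’s hypothetical.

```javascript
// Back-of-the-envelope projection of daily savings from gzip.
var savedKBPerView = 44;    // KB saved per page view, from the size table
var savedMsPerView = 521;   // ms saved per page view, from the time table
var viewsPerDay = 1000000;  // hypothetical one million unique visitors per day

// 1 GB treated as 1,000,000 KB, matching the article's round numbers.
var dailyGB = (savedKBPerView * viewsPerDay) / 1000000;
var dailyHours = (savedMsPerView * viewsPerDay) / (1000 * 60 * 60);

console.log(dailyGB + " GB of bandwidth saved per day");           // 44 GB
console.log(Math.round(dailyHours) + " hours of load time saved"); // 145 hours
```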

Size in KB       CSS          JavaScript    Total
No Compression   12 KB        51 KB         63 KB
Gzipped          4 KB         15 KB         19 KB
Reduction        8 KB (67%)   36 KB (71%)   44 KB (70%)

Savings for 1 million uniques/day: 44 GB

Time in Milliseconds   CSS            JavaScript     Total
No Compression         281 ms         741 ms         1,022 ms
Gzipped                110 ms         391 ms         501 ms
Reduction              171 ms (61%)   350 ms (47%)   521 ms (51%)

Savings for 1 million uniques/day: 145 hrs
