JavaScript performance


The problem

How many times have you had to include multiple JavaScript files in your HTML pages? Do you have sections like this in your code:

<script type="text/javascript" src="/js/global.js"></script>
<script type="text/javascript" src="/js/ua.js"></script>
<script type="text/javascript" src="/js/menus.js"></script>
<script type="text/javascript" src="/js/ajax.js"></script>
<script type="text/javascript" src="/js/functions.js"></script>
<script type="text/javascript" src="/js/extra.js"></script>
<script type="text/javascript" src="/js/aux.js"></script>

This example, not untypical of many complex sites, will cause your browser to request 7 extra JavaScript files from the web server! True, if you use a modern HTTP server, on all but the first visit those requests will get 304 (Not Modified) responses, but opening 7 TCP connections is quite expensive for both sides (browser and server). Even if the keep-alive feature of HTTP/1.1 is used, the overhead cannot be neglected.

So, what can be done?

If your site is a static one (from the web server's point of view, of course), and you intend to keep it that way – not much (but keep reading). However, if you can somehow run dynamic code on the server, we can do "JavaScript netting".

JavaScript netting

What does it mean? Well, basically, we write a clever piece of server-side code that can read a set of predefined JavaScript (or CSS, or you name it) files and concatenate them together to produce a single stream of output. In pseudo-code, its behavior might look like this:

Request for /dyn-js/all.js received
Locate configuration settings for /dyn-js/all.js
Decide, whether to send 304 or the content
If content to be sent, scan all the files and stream them out one after another

Simple, isn't it? And it doesn't have to be too complicated. In fact, such unified files can be created by hand or by a deployment script, and that is definitely an option. However, if you update your site once every two years, and you happen to change one of the individual files but fail to recreate the combined one, strange things might happen.

Please note that, in its basic form, there is nothing JavaScript-specific in the proposed solution. It can be applied to other types of content – for instance, to your CSS files.

Things to consider

Of course, before you actually start sticking your JavaScript files together, you should think about whether it can cause any problems. Will there be a name collision, for instance? In general, it shouldn't – the individual files you used to load separately all injected their code into a single namespace anyway, so if there were no collisions with the old approach, there won't be any with the new one.

One of the things you probably should think about is the order of invocation. In your original HTML page the files were loaded in the order they were listed (unless you played tricks with the "defer" attribute – not very portable!). However, the order in which your files are scanned on the server might be different! So the best approach is either to have no dependencies between the files or, alternatively, to list them on the server (in the appropriate configuration place) in the right order.
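If you do depend on load order, an explicit ordered list in the server-side configuration is the simplest safeguard. A sketch, reusing the file names from the HTML example above (the configuration shape is an assumption):

```javascript
// Hypothetical server-side bundle configuration. The array order is
// the concatenation order, so files that define shared helpers
// (global.js here) must come first.
const bundleFiles = [
  '/js/global.js',
  '/js/ua.js',
  '/js/menus.js',
  '/js/ajax.js',
  '/js/functions.js',
  '/js/extra.js',
  '/js/aux.js',
];
```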

Implementation ideas

The important bit is, of course, how far we take it. The following ideas are given to the curious reader as an exercise to implement in his/her programming language of choice:

  • Caching of generated JavaScript files in memory / on disk
  • Aggressive caching (not validating the underlying files for some period or number of requests)
  • Support for If-Modified-Since request header
  • Support for compressed output (the Accept-Encoding request header), with optional pre-caching of the compressed output
  • Content-specific processing – for instance, stripping comments and extra white space from JavaScript and CSS, renaming local variables in JavaScript functions, and so on. This is quite complicated, since a proper parser for the language in question is required.

Comments

  • 3rd-party JavaScript compressors are available on the Net; I once used one, and it worked fine.
  • A minor augmentation would be to time-stamp the JS filenames, assigning a new filename with each update. This ensures that a zealous proxy does not serve the old version to unsuspecting users. It does not cost a dime, so I use this convention even on my just-5-pages personal site to name the single CSS file: kc_2008_01_01.css
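The date-stamping convention the commenter describes can be sketched as follows; the function name and the derivation from modification time are assumptions, not the commenter's code:

```javascript
// Derive a date-stamped filename from a file's modification time,
// so proxies and browser caches never serve a stale copy.
function stampedName(base, ext, mtimeMs) {
  const d = new Date(mtimeMs);
  const pad = n => String(n).padStart(2, '0');
  return `${base}_${d.getUTCFullYear()}_${pad(d.getUTCMonth() + 1)}_${pad(d.getUTCDate())}.${ext}`;
}
```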

Mozilla has just landed a new JavaScript optimization feature in the Firefox 3.1 development code base (Shiretoko) that, according to a variety of JavaScript performance tests, improves the performance of JavaScript-based web applications by 2x to 20x compared to the already-severely-pumped-up Firefox 3.

I used to put scripts at the bottom, but moved them back to the head recently.

The reason being, I was using swfobject to write a Flash movie into the page, and it looked awkward for the whole page to load with an ugly bare content area that was then replaced by the SWF.
