05 Dec

a few web optimisation tips

Once a website has been completed, I like to sit back, look over the site again, and see whether there is any easy way to improve its responsiveness.

An example immediately comes to mind. A few years ago, we built the CMS behind Castle Leslie’s website. I don’t think we were involved with the design itself – we were just asked to implement it in a user-editable way.

Notice that the menu on the left is graphical. The original way we did that was to slice it up into a separate image for each link, and wrap each <img> in an <a> tag. I was asked recently to optimise that particular part of the site, as one of the owners uses a dial-up connection and was irritated that the menu came down in dribs and drabs.

The obvious problem was the number of images. I opted to rip them all out and use one single image instead. I set that image as the background of the entire navigation layer, and made the actual links invisible, each sized to cover the appropriate area of the image. This is similar to how an imagemap works, in that you have one image with some clickable areas on it. A difference here, though, was that we had a few hover effects I didn’t want to lose (small icons that appear below the nav according to what’s being hovered over).
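A minimal sketch of the idea (the class names, sizes and image path here are invented for illustration, and the hover icons are left out):

  <style type="text/css">
    /* one background image for the whole menu */
    #nav {
      position: relative;
      width: 150px;
      height: 120px;
      background: url(/i/nav/menu.gif) no-repeat;
    }
    /* each link is an invisible block, sized to cover its slice of the image */
    #nav a {
      position: absolute;
      left: 0;
      width: 150px;
      height: 40px;
      text-indent: -9999px; /* push the link text off-screen */
    }
    #nav a.rooms   { top: 0; }
    #nav a.dining  { top: 40px; }
    #nav a.contact { top: 80px; }
  </style>

  <div id="nav">
    <a class="rooms" href="/rooms/">Rooms</a>
    <a class="dining" href="/dining/">Dining</a>
    <a class="contact" href="/contact/">Contact</a>
  </div>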

That was a big improvement – instead of loading a pile of separate images, we now loaded only one.

However, it was still taking a while, because I had chosen to use CSS to add the background. I don’t know if this is a fact, but it seems to me that when you have a few actual images and a few background images in the same page, the background images are loaded last, as they are considered low priority.

A way around that is to use Aaron Hopkins’ tips and spread the load over a few separate domains. They don’t even have to be completely separate servers. Here is part of the virtual server config for castleleslie.com, for example:

  ServerName castleleslie.com
  ServerAlias *.castleleslie.com

What that means is that if you ask for blah.castleleslie.com (or any other subdomain), Apache will serve you the same site as castleleslie.com, since it treats them all as aliases for the one virtual host (assuming the DNS also has a wildcard record pointing *.castleleslie.com at the same server).

So, to improve the responsiveness of our site, I then went through the CSS and used pseudo-subdomains for the images – for example, http://static5.castleleslie.com/i/style/back.gif instead of /i/style/back.gif. This meant that the browsers could grab the images immediately, instead of queueing up and waiting (browsers will usually only download about two items at a time from any one hostname).
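In the stylesheet, the change is just a matter of making the image URLs absolute and spreading them across the aliased hostnames, something along these lines (back.gif is the only real filename mentioned above; the other selectors and files are invented for illustration):

  /* before: every image queues up behind the two-connection limit */
  body    { background: url(/i/style/back.gif); }

  /* after: images spread across pseudo-subdomains, so they download in parallel */
  body    { background: url(http://static5.castleleslie.com/i/style/back.gif); }
  #header { background: url(http://static1.castleleslie.com/i/style/header.gif); }
  #nav    { background: url(http://static2.castleleslie.com/i/style/nav.gif); }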

Now, onto AJAX.

AJAX involves sending information back and forth between the server and client. As you can guess, this also falls foul of the connection limit – if you have three JavaScript files to load for your page, two will be loaded, and the third will not start downloading until one of the first two has finished.

A while back, I demonstrated the problem with concurrency, and a possible solution. In the massively concurrent version of the script it took my laptop 30.02s to complete the tests. In the optimised version, it took 2.53s – an improvement of more than 10x.

What happened to make this improvement was that a large number of server function calls were combined into blocks, so that very few actual HTTP requests were made. To do this, I made a local cache of requests and set a timeout of 1ms on it. If another request came in, it was added to the cache and the timeout was restarted. If the timeout fired, all the cached requests were combined into a block and sent to the server, which extracted them, ran them, and returned an array of results. This was in comparison to the original, where every request was immediately acted on and sent to the server, resulting in massive traffic problems.
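A stripped-down sketch of that queueing idea in JavaScript (the sendBlock() transport and the /rpc endpoint are placeholders, not the actual code from that demo):

  var queue = [];
  var timer = null;

  // instead of firing off an HTTP request per call, add it to the cache
  function remoteCall(func, args, callback) {
    queue.push({ func: func, args: args, callback: callback });
    // restart the 1ms timeout every time a new request arrives
    if (timer) clearTimeout(timer);
    timer = setTimeout(flushQueue, 1);
  }

  // when the timeout finally fires, send the whole block in one request
  function flushQueue() {
    var block = queue;
    queue = [];
    timer = null;
    sendBlock(block, function (results) {
      // the server runs each call and returns an array of results, in order
      for (var i = 0; i < block.length; i++) {
        block[i].callback(results[i]);
      }
    });
  }

  // placeholder transport: POST the block as JSON to a hypothetical /rpc URL
  function sendBlock(block, onDone) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/rpc', true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        onDone(JSON.parse(xhr.responseText));
      }
    };
    xhr.send(JSON.stringify(block));
  }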

The lesson here can also be used elsewhere. For castleleslie, it occurred to me that the time involved in downloading the HTML and a separate CSS file was more than if the HTML and CSS were combined and downloaded as one. Yes, you lose out on the caching effect for subsequent calls, but on sites where the average user is only expected to read a few pages while looking for information, this is not really a problem. I would never try that on Wikipedia or a forum, for example.
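Concretely, that just means replacing the external stylesheet reference with the stylesheet’s contents in the page head, so one request does the work of two (a sketch; main.css is an invented filename, not the site’s real one):

  <!-- before: two requests, the page and then the stylesheet -->
  <link rel="stylesheet" type="text/css" href="/i/style/main.css" />

  <!-- after: one request, with the CSS pasted (or included server-side) into the page -->
  <style type="text/css">
    /* ...contents of main.css go here... */
  </style>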

By the way, I really advocate the use of Firebug – it’s a massive help when hunting down glitches in performance.

Another example of combinations can be seen in the panel headers of my KFM application (which is free, by the way – get it here). Notice that when the app loads, the panels have three buttons – ‘-’, ‘+’, ‘x’. Some of the buttons are greyed out, indicating that they are not usable. When you hover over an active one, it turns red.

There is actually only one image used for the entire header, including the effects.

CSS is used to shift the image up or down for each button, depending on what is needed.
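A cut-down version of the trick, with invented class names, offsets and image name (the real KFM stylesheet will differ):

  /* every button state lives in the one image; only the visible slice changes */
  .panel-button {
    display: block;
    width: 16px;
    height: 16px;
    background: url(header-buttons.gif) no-repeat;
  }
  /* shift the background up or down to show the right state */
  .panel-button.minimise        { background-position: 0 0; }
  .panel-button.minimise:hover  { background-position: 0 -16px; } /* red hover state */
  .panel-button.minimise.off    { background-position: 0 -32px; } /* greyed out */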

The same trick is used for the Directory panel – there is actually only one image in there.

What’s important about this is that, instead of 10 separate images for the panel (the background, plus three state images per button) and 7 separate images for the folders, a single image is used for each, and a few CSS tricks reduce the HTTP overhead.

Here are two demos to illustrate this in action. Please view the source to read how it works. demo 1, demo 2.

I could go on, but my fingers hurt. I hope some of this has been useful.

5 thoughts on “a few web optimisation tips”

  1. Pingback: who the hell is Kae - Page 2 - Irish SEO, Marketing & Webmaster Discussion

  2. Thanks for the tips. I understand the Ajax request batching idea and the “one big image” over many small ones.

    Sorry to be so dense but I don’t understand how shifting an image up or down changes some of the color within that image. And I would need more explanation on your directory panel example in order to understand it as well.

    Peace,

    Rob:-]

  3. Hi Rob – I’ve done up two demos to illustrate the idea. I came across it a long time back on alistapart.com.

    demo 1, demo 2.

    The demos show two different levels of complexity. I haven’t done the folders one, but I’m sure you can do that using the ideas presented in the second demo (I’ll leave that as an exercise 😉 )
