Tuesday, January 03, 2006

Improving webpage performance

There is a ton of research on user behavior as a function of task response time, and I'm sure if you search Jakob Nielsen's site you'll find plenty of it.

I remember an IBM study from more than 15 years ago which found that if it took 1 second for the computer to respond, it would take 1 second for the user to start the next step; but if it took 10 seconds for the computer to respond, it took 20 seconds for the user to start the next step of the task. The problem grows worse than linearly because users get more and more distracted the longer a command takes to respond.

Web sites are no different. Google proved with its super-fast pages (part real, part perception) that users will perform many more searches when they know they don't have to wait long. If each Google query took 20 seconds to respond, you would be more likely to spend time thinking about which search terms to use. But since it takes just 3 seconds, you simply keep adding and removing search terms.

So, during the past few weeks I've been collecting tips for web developers on how to improve their page speed. These tips affect latency, bandwidth, rendering, and/or the perception of when a page is ready. They are in no particular order.

Tip #1: Strip spaces, tabs, CR/LF from the HTML

I'm always surprised when I look at some large website's HTML and find a ton of unnecessary spaces, tabs, new-lines, HTML comments, etc. Just removing those elements can reduce the page size by 5-10%, which in turn decreases the download latency. I'll go one step further and suggest you not use quotes around attribute values unless they are necessary.
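
As a rough illustration (a made-up snippet, not taken from any real site), here is the same markup before and after stripping — roughly half the bytes:

    <!-- Before: -->
    <div class="menu">
        <!-- main navigation -->
        <a href="/home">Home</a>
    </div>

    <!-- After: -->
    <div class=menu><a href=/home>Home</a></div>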

Tip #2: Don't use XHTML

This is very controversial. A lot of people will call me crazy, but I see XHTML as a loser technology. It has its benefits, but they are far outweighed by the drawbacks, and the biggest drawback for me is that XHTML makes your page larger. Purists will always build their pages in XHTML, but if you are in doubt about whether to use it, don't!
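
To see where the extra bytes come from, compare these equivalent fragments (a contrived example):

    <!-- HTML allows: -->
    <br>
    <input type=checkbox checked>

    <!-- XHTML requires quoted attributes, no minimization, and closed tags: -->
    <br />
    <input type="checkbox" checked="checked" />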

Tip #3: Keep Cookies Small

Your cookies are sent back to your server with every single request the user makes for anything. Even for image, JS, and CSS requests, or XML-over-HTTP (AJAX), the cookies are sent. A typical HTTP request has between 500 and 1,000 bytes. Now, if you have 4 cookies, each with a name like "user_name" followed by a value of 30 characters, you are adding 15-30% more bytes to every request.
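
Compare a verbose cookie with a compact encoding of the same information (the names and values here are made up):

    Set-Cookie: user_name_preference=jonathan_goodwin_the_third
    Set-Cookie: u=jg3

The browser echoes those bytes back in the Cookie header of every request to your server, so every character counts.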

Tip #4: Keep JavaScript Small

Who cares if I call my JavaScript function "start_processing_page()" or "slp()"? The download speed cares, and the interpreter cares as well, so use tiny function and variable names, remove comments, and cut unnecessary code.
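
For example, before and after shrinking (all names here are invented):

    // Before: readable, but every character travels over the wire
    function start_processing_page() {
        var user_name = get_user_name();
        show_greeting(user_name);
    }

    // After: same logic, a fraction of the bytes
    function slp(){var n=gun();sg(n)}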

Tip #5: Use Public Caching

IMHO, this is one of the most under-used features of HTTP. Big websites use it (usually through a CDN like Akamai), but the vast majority (I dare say 99%) don't. All those icons, CSS, and JS files can, and should, be cached by the browser (private cache), but public caching also allows the proxies in between to cache them. This reduces the load on your server, freeing CPU and bandwidth for the important stuff. Now, a lot of people don't use public caching (or even private caching) because their CSS is changing, the JS has bugs that need to be fixed, etc. Well, you can do three things to deal with that:

  • Let content be cached for a short period of time (for example, 24 hours only).
  • Rename the files every time you make a change to them; this way you can let them be cached permanently.
  • Implement an HTTP filter that automatically renames the files when they have changed.
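
The first option, for instance, is just a matter of response headers. A file served with these can be kept for 24 hours by the browser and any proxy along the way (how you set them depends on your web server):

    Cache-Control: public, max-age=86400
    Expires: Wed, 04 Jan 2006 10:00:00 GMT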

Tip #6: Enable HTTP Compression

Your HTML couldn't be a better candidate for compression. It has a very limited character set and lots of repetition (count the number of "div"s on your page). That is why HTTP compression makes so much sense: it can reduce the download by 70% or more. So, instead of having to send 40KB of data, you might send just 15KB. The user will thank you.
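
The negotiation is just one header in each direction: the browser advertises what it can decompress, and the server answers with a compressed body:

    Accept-Encoding: gzip, deflate        (request)
    Content-Encoding: gzip                (response)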

Tip #7: Keep as much as possible in lower case

This actually works in conjunction with HTTP compression. Remember that this type of compression is lossless, that is, decompressing the content yields exactly the original, which means the compression algorithm has to treat "DIV", "Div", and "div" as different strings. So, always use lower case for tag names and attributes in the HTML and CSS, and try to be consistent in your JavaScript too. The more uniform the text, the more repetition there is for the compressor to exploit.

Tip #8: Avoid Tables

Rendering a table is probably a browser's worst nightmare. If the browser starts showing the table before all the content inside it has loaded, its rendering engine will have to re-render the table many times as each piece arrives. On the other hand, if the browser waits for everything to load, the user will see a blank (or partially blank) page for a few seconds. Browsers usually use a combination of both to reduce the number of re-renderings without leaving the user hanging. The point is: don't make your whole page one big table. It is preferable to have 3 tables (header, body, footer), so each one can render as soon as it is complete. Whenever possible, just avoid tables altogether.
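
In other words, something along these lines (a structural sketch only):

    <!-- One page-wide table: nothing shows until it has all arrived. -->
    <!-- Three independent tables: the header can render right away. -->
    <table id=header>...</table>
    <table id=body>...</table>
    <table id=footer>...</table>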

Tip #9: Set image size

This is very similar to the table rendering problem. If you add an IMG tag to the middle of your page and don't set "width" and "height", the browser has to wait for the image to load before it can decide on the final layout. Meanwhile, that costs the browser at least one re-rendering, because it will not wait for all the images to load before showing you the page.
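
Setting both attributes lets the browser reserve the space immediately (the file name and dimensions here are made up):

    <img src=/img/logo.gif width=120 height=40 alt=Logo>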

Tip #10: Compact your GIF/JPG

So, your page has several GIFs and/or JPGs? It is very likely that they could be compressed even further without any loss! GIF and PNG have a very compact data structure, but most applications, like Corel Photo-Paint and Adobe Photoshop, don't optimize it at all. Go to http://download.com and find yourself a good set of tools to compact your image files. You will be surprised when a GIF that was 900 bytes ends up at just 80 bytes after compacting.

Tip #11: Reduce the number of external elements

If you ever see a request graph from Keynote (a site performance monitoring service), you will be shocked at how long it takes to download the few extra files needed to render a page, like a few images, a CSS file, and a JS file. If you did a good job with Tip #5 (caching), the impact will be smaller. A browser can only request an image file after it has discovered it while parsing the HTML, so many of those file requests happen serially. Some browsers also limit the number of TCP connections to a single server (usually to 2), which lets your page download only 2 files at a time. If you have 1 page, 1 CSS, 1 JS, and 7 images (10 files), you can imagine how much has to happen before everything is loaded. The point here is: try to reduce the number of files (mostly images), and, if the CSS/JS files are small enough, embed them in the page.
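
Embedding is straightforward; instead of two extra requests, the content travels with the page (the file names and contents are just placeholders):

    <!-- Two extra requests: -->
    <link rel=stylesheet href=/site.css>
    <script src=/init.js></script>

    <!-- Zero extra requests: -->
    <style>body{margin:0;font-family:arial}</style>
    <script>function init(){window.status='ready'}</script>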

Tip #12: Use a single DNS Lookup

This one is really overlooked. How many web developers think about DNS lookups when they are building a site? Not many, I guarantee you. But before the browser can even open a connection to your server, it needs to do a DNS lookup to resolve the domain name to an IP address. Now, a DNS lookup is one of the fastest things on the Internet, because the protocol is tiny and it is cached everywhere, including on the user's computer. But sometimes you see sites inventing "creative" domain names for the same server: all the images come from "images.mysite.com", the page comes from "w3.mysite.com" (after a redirect from "www.mysite.com"), and the streaming video comes from "mms.mysite.com". That is 3 DNS lookups more than necessary.
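
The fix is simply consolidation. With the hostnames above (all hypothetical):

    Before: www.mysite.com + w3.mysite.com + images.mysite.com
            + mms.mysite.com                       = 4 lookups
    After:  everything served from www.mysite.com  = 1 lookup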

Tip #13: Delay Script Starts

If you have a process that renders 100 images per second using 100% CPU, and you add another process doing the same thing, the combined throughput will be less than 100 images per second (less than 50 per process), because now the OS also has to manage context switches. The same thing applies to the scripts on your page. If the browser is still loading and processing a few images or CSS files and you fire a script, that script will take longer to execute than if you had waited for the page to be completely loaded. Actually, it gets a little more complicated: the browser fires the page's "onload" event once it has all the elements necessary to render the page, not after the page has actually been rendered (there is no "onrendercomplete" event). This means that even after the onload event, the browser is still using CPU to render the page. What I usually do in this situation is add two indirections: attach a function to the onload event whose only job is to set a timer for a few seconds later, and let that timer's callback do the real initialization.
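
A minimal sketch of that double indirection (the delay and the function names are arbitrary):

    // 1st indirection: runs once the page's elements have loaded.
    window.onload = function () {
        // 2nd indirection: give the rendering engine a 2-second head start.
        window.setTimeout(realInit, 2000);
    };

    // Does the actual work, after rendering has (hopefully) finished.
    function realInit() {
        // ...expensive startup code goes here...
    }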

Tip #14: Watch for Memory Leaks

The biggest problem with a browser memory leak is that it doesn't affect only the page that created it; it affects every page from every site visited afterwards. Internet Explorer is notorious for its massive memory leaks (caused by poor JavaScript). There are a few tools on the Internet to find out whether your script is leaking memory, and where. The easiest test is to load your page 100 times and watch PerfMon to see whether the Working Set keeps growing. The simplest thing you can do is unbind every event that you bound dynamically and release every reference you can (this also helps the JavaScript garbage collector run faster).
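
In IE's event model, that cleanup looks something like this (a sketch; the element and function names are invented):

    var button = document.getElementById("saveButton");
    function onSaveClick() { /* handle the click */ }
    button.attachEvent("onclick", onSaveClick);

    // On unload, unbind the handler and drop the DOM reference so the
    // node and the script object don't keep each other alive.
    window.onunload = function () {
        button.detachEvent("onclick", onSaveClick);
        button = null;
    };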

If you have no clue what I was talking about in one of the topics above, either you really don't need to know about it, or you should immediately go buy some books. I recommend the ones from O'Reilly, like:

  • Dynamic HTML: The Definitive Reference - by Danny Goodman
  • JavaScript & DHTML Cookbook - by Danny Goodman
  • HTTP: The Definitive Guide - by David Gourley & Brian Totty
  • Web Caching - by Duane Wessels