
10 Ways to Analyze and Optimize a Site’s Performance

October 23, 2009

Performance is one of the most important aspects of a website, but it is often overlooked by both designers and developers. If a site takes too long to load, users will leave without giving the rest of the site a chance. Every segment of code and every piece of the design should be created with performance in mind.

1. Observe How People Use Your Site

For your site to achieve its best possible performance, you need to understand exactly how users use it and navigate around it. Some good performance-related questions to investigate are the following:

  • Where do most users land? This question is important because these pages need to be among the fastest to load. Once a user lands on a site and their experience begins, they become much less likely to leave. Landing pages should have minimal JavaScript; the other pages on the site can carry bigger JavaScript payloads.
  • Where do most users leave? Why are they leaving? Is it because that page is the slowest to load, or have they completed their goal? Try to understand what users are experiencing: this page is your last chance to leave a lasting impression.
  • How many pages are the users loading per visit? If most users are only loading 1 or 2 pages, it might not be worth sending the user an entire image sprite or JavaScript functions that are used on other pages. If the users are looking at many pages, it might be a good idea to use AJAX to minimize page loads.
  • How many users have been to the site before and still have the page in their cache? In other words, will your site have a very loyal fan base and not many new visitors? This situation is rare, but it lets you do more: it allows you to use bigger files, more Flash videos, and higher-quality images.

There are many tools out there for tracking your users. Google Analytics is the best free software I've come across, but Mint and Crazy Egg are great premium options. I find that Google Analytics and ClickHeat make a great duo for those on a small budget (in terms of both time and money). Please suggest more tools in the comments.

2. Take Advantage of Caching

The biggest bottleneck for websites is almost always the fact that it takes time to send files to the user’s computer and then wait for feedback. When this step can be skipped and the user’s browser does not need to request and complete a file transfer, the speed increases noticeably.
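
As a rough illustration of how to let the browser skip that round trip, a dynamic script can send Expires and Cache-Control headers telling the browser how long it may reuse the response. (For static files this is normally configured in the web server itself; the one-week lifetime below is just a placeholder value.)

    <?php
    // Sketch: tell the browser it may reuse this response from its cache
    // for one week (604800 seconds) without contacting the server again.
    // The lifetime is an arbitrary example; pick one that matches how
    // often the content really changes.
    $lifetime = 604800;
    header('Cache-Control: public, max-age=' . $lifetime);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');

    echo 'This response can now be served from the browser cache on repeat visits.';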

With modern sites that are more complex and more database-driven, caching becomes more difficult because the server cannot know when the page’s data has changed. While this problem can’t be avoided on all pages, some pages are static and can be easily cached.

That being said, even when a page does use a database for some of its content, that page can still be cached using certain tools. Whenever the database is updated with new values for the page, these tools will retrieve the newly generated page and offer that up for caching. While this may not sound like the optimal solution, it can decrease a site's load time dramatically for users.

I will explain this in more detail in point #7.

Splitting files appropriately is also important. If you have a page that receives a small percentage of your visitors but contains a large amount of the JavaScript, then that JavaScript should be split into two or more files. For example, say a site has two JavaScript files: validation.js and rollover.js.

In this example, the page that 90% of people arrive at will only use rollover.js which is 14kb and another page on the site that receives 10% of the visitors will use validation.js which is 26kb. If those JavaScript files are merged into one, the users will only need to make one request, but the file will be 40kb and only 10% of the users will end up using the whole thing.

This is why it’s very important to analyze what your users are doing and if it’s optimal to send them everything.

3. YSlow

YSlow is a great Firefox add-on that anyone who works with websites should use. YSlow will analyze a site and give it a letter grade based on its performance. YSlow can be pretty strict, so don't get too down if your site doesn't earn an A rating right away, but it's a great tool and it will raise your awareness of where your site fails and where it succeeds.

Firebug is required for YSlow, but you should really have it already. It’s insanely useful (unless you’re trying to debug an IE bug).

4. Minimize JavaScript and CSS

Even the smallest files matter when servicing millions of visitors a month.

xkcd, an incredibly popular web comic, was getting 70 million pageviews per month in October 2007. This number has likely grown significantly since then. xkcd’s CSS file is 3.89 KB.

If that file was sent out for every pageview (which it isn’t due to caching, but let’s just pretend for the sake of this example), that would be almost 260 gigabytes of bandwidth per month.

Quickly putting that code into CSS Optimizer creates a file that is 2.39 KB. The bandwidth per month would decrease to around 160 gigabytes.

Clean CSS shrinks CSS code, but still retains its readability. It optimized xkcd's CSS file to 3.272 KB.

That all being said, JavaScript files are usually much bigger than CSS files, especially when there are lots of jQuery plugins and YUI components. Luckily there are some great JavaScript “minifiers”.

YUI Compressor is definitely one of the best compressors, but it's not incredibly easy to use for people who aren't experienced with command-line programs. JSMin and Dean Edwards' Packer are also worth mentioning.

5. Decrease Image Sizes

Images usually make up the bulk of a website's size. The delicate balance between quality and file size has been discussed thoroughly elsewhere. In this article I will only be able to scratch the surface, but I will point you to other valuable resources at the end of the article. I strongly recommend reading them.

Designers will sometimes get into the habit of saving their images as 80%-quality JPEGs without looking at alternatives. Here are some quick tips for optimizing images without going too in-depth.

  • First of all, save your image both as a flattened PNG and as a JPEG at 80% quality, then compare them. The PNG file will usually be slightly larger, but it may look significantly nicer. From here you can decide which option is best and tweak the JPEG's quality percentage appropriately.
  • Now that you've decided to go with JPEG and tweaked your image to a quality setting you're happy with, let me ruin everything and tell you about a great little (command-line) program that shrinks PNG files. Pngcrush doesn't always make PNGs smaller than JPEGs, but it makes the competition much more interesting. After you run it on the PNG version of your image, you will be able to re-evaluate whether the JPEG's file size and quality are better than the PNG's.
  • With both images loaded on a server somewhere, run Yahoo's Smush.it app on both of them. Smush.it does not reduce image quality in any way; it removes unnecessary bytes and trims anywhere from 1% to 45% off the image size. At this point you should be able to pick which image format you want to use.
  • The final step in this process is to put some images into a sprite. This means putting several images into the same file and then using CSS to reference each image's location within that file wherever it is used. By doing this, the browser needs to make fewer requests, and the sprite is usually smaller than the combined size of all the individual images. A great tool for creating sprites is the CSS Sprites Generator. Be careful when creating sprites: as I mentioned before, the user only wants what they'll need. If they're only going to visit one page, sending them a sprite that contains every image on your whole site will slow things down and hurt their overall experience. Separate your images into sprites according to how the average user visits and navigates around your site.

6. Lori

Lori is a really simple Firefox add-on that shows some vital stats in the status bar. It displays:

  • Time for first byte: The number of seconds from when the browser sends the first request until when the first byte is returned.
  • Time to complete: The number of seconds from when the browser sends the first request until when the last byte is returned.
  • Page Size: How big the page is.
  • Number of Requests: How many requests the browser sent.

It’s a good idea to get into the habit of watching the numbers of various sites that you visit.

7. Dynamic Server-Side Scripting and Database Requests

Server-side dynamic webpages, such as those written in PHP, ASP, Perl, JSP, and so on, will first execute code to build a file that can be viewed in a web browser. While executing this code, the script will sometimes send queries to the database to retrieve the content for the page. This introduces an extra step that does not exist for static pages written in simple HTML.

It is unnecessary for the scripts to repeatedly execute the same code for each unique user when the exact same file is generated every time. The optimal case would be for the server to check if the needed HTML file has been generated yet, and if so, send that instead of generating a new one. Luckily, there are existing tools that accomplish this.
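
Conceptually, the check looks something like the minimal sketch below (the cache directory, lifetime, and page-building step are simplified assumptions, not production code or one of the tools mentioned next):

    <?php
    // Sketch: serve a previously generated copy of this page if one exists
    // and is still fresh; otherwise build the page, send it, and keep a copy.
    // The cache directory and ten-minute lifetime are arbitrary examples.
    $cacheFile = '/tmp/page-cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
    $lifetime  = 600; // seconds

    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $lifetime) {
        readfile($cacheFile);   // cached copy: no PHP work, no database queries
        exit;
    }

    ob_start();                 // capture everything the page prints

    // ... run the usual queries and build the page here ...
    echo '<html><body>Expensive, database-driven page</body></html>';

    $html = ob_get_contents();
    ob_end_flush();                          // send the page to the visitor
    @file_put_contents($cacheFile, $html);   // and save it for the next visitor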

I apologize that I’m only mentioning tools that use PHP, but they are the only decent and free tools available. If you know of any that I’m missing, please do not hesitate to write about them in the comments.

  • ADOdb: ADOdb is a database abstraction library for PHP. Database abstraction is an extra “layer” between the scripting language (in this case, PHP) and the database. This layer is responsible for checking whether the results of the current query have been retrieved recently; if so, it returns those results from its cache instead of generating new ones. This ends up being much quicker for queries that get executed often and only slightly slower for queries that are executed very rarely. A short sketch of how this looks follows this list.
  • APC and eAccelerator: These are two opcode caches that store PHP scripts in their compiled form. They will make your PHP code respond to your users' requests faster and lessen your server's execution time. If you have the time and the resources, I recommend that you try both of them and track how they each perform.
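
Here is a rough sketch of how ADOdb's query cache is used. The connection details, table name, and one-hour lifetime are placeholder values; check the ADOdb documentation for the exact setup on your server.

    <?php
    // Sketch of ADOdb's built-in query caching: CacheExecute() looks for a
    // recent cached result of this exact SQL before touching the database.
    include 'adodb/adodb.inc.php';          // path depends on where ADOdb lives

    $ADODB_CACHE_DIR = '/tmp/adodb-cache';  // where cached result sets are stored

    $db = ADONewConnection('mysqli');
    $db->Connect('localhost', 'user', 'password', 'blog');  // placeholder credentials

    // Reuse a cached result for up to an hour before re-running the query.
    $rs = $db->CacheExecute(3600, 'SELECT id, title FROM posts ORDER BY id DESC');

    while ($rs && !$rs->EOF) {
        echo $rs->fields['title'], "\n";
        $rs->MoveNext();
    }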

8. Show the Page While Other Parts Load

When a user loads a blog post or an article, they’ll probably read a chunk of the article before reading the comments or looking for secondary content. To take advantage of this, you can load the secondary content after the primary content has been loaded. You can add this functionality to your page with AJAX. Chances are that your users won’t even be able to scroll down to the comments section before it loads.

9. Take Advantage of AJAX

AJAX has more uses than just loading comments. It's great for validating form fields against the database as the user enters their values. It's really annoying when a form comes back because the username has already been taken and the user has to retype their password and figure out the CAPTCHA again. Instead, use AJAX to give the user feedback immediately so they can be certain they will only need to submit the form once.
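
The server side of such a check can be very small. This is only a sketch (the check_username.php name, table, column, and connection details are made up); the page's JavaScript would request it with XMLHttpRequest as the user types and show the result next to the field.

    <?php
    // check_username.php (hypothetical endpoint): replies with "taken" or
    // "free" so the page can warn the user before the form is submitted.
    // Table, column, and connection details are placeholders.
    $username = isset($_GET['username']) ? trim($_GET['username']) : '';

    if ($username === '') {
        echo 'free';
        exit;
    }

    $db   = new mysqli('localhost', 'user', 'password', 'blog');
    $stmt = $db->prepare('SELECT COUNT(*) FROM users WHERE username = ?');
    $stmt->bind_param('s', $username);
    $stmt->execute();
    $stmt->bind_result($count);
    $stmt->fetch();

    echo ($count > 0) ? 'taken' : 'free';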

Other great uses for AJAX include voting/rating systems, logging a user in, auto-complete for text inputs, updating RSS feeds, sorting lists, and many more.

10. Gzip Compression

When a file has been zipped, it has been compressed, and gzip is simply a free, open-source implementation of the same idea. Compressing a file can reduce its size significantly: the text of this blog post is roughly 16kb uncompressed and 7kb after gzipping. Turning on gzip compression can be a complicated task for people who are not experienced with changing Apache's settings, so I would recommend emailing your web host and asking if they can turn it on for you.
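
If you can't change the server configuration yourself, PHP can compress its own output as a stopgap. A minimal sketch is below; note that it only covers pages generated by that script, not static files like CSS and images, which is why the server-level setting is still preferable.

    <?php
    // Sketch: compress this script's own output with gzip when the browser
    // advertises support for it (ob_gzhandler checks Accept-Encoding itself).
    ob_start('ob_gzhandler');

    echo '<html><body>';
    echo str_repeat('<p>Text like this compresses extremely well with gzip.</p>', 100);
    echo '</body></html>';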

Further Reading
