The idea for this article came to mind while I was wondering why twitter.com is so slow sometimes. When I looked at the source code of my Twitter profile page I was really shocked, because I still cannot understand why such a reputable company with so many users does not use all of the methods and processes available to increase the performance of its website.
But first things first. Before you analyze your website to find out what you can optimize to increase its performance, you need to install the following addons (and Firefox, of course).
I have also compiled a comprehensive list of Firefox addons for developers but you only need the addons listed above to follow the steps described in this article series.
1. Finding The (Maybe Not So Usual) Suspects
It is important that you empty your browser cache first.
Now open the website you wish to speed up in Firefox.
At the bottom of the browser in your status bar you will see information collected by the lori addon.
Screenshot of "Life Of Request" info
Time To First Byte
The first value shows you how long it took for the first byte to arrive. On this blog’s homepage it took just about 0.157 seconds. If you have dynamic data on your page, and perhaps even database queries, the value may be much higher.
This value tells you how much there is to gain by optimizing your code on the server side. If you see something above roughly 0.5 seconds, or even greater than one second, you know that you have to optimize your server-side code (more on that later).
Total Load Time
The second value shows how long it took until all elements on the page had been loaded – this includes embedded images, external scripts and even embedded video players. If this value is higher than about two to three seconds, you know that you need to optimize the code on the client side.
Total Number Of Bytes
The third value shows the total number of bytes transferred, which also includes images, scripts and so on. Depending on your site and your target audience this value may be quite high, but in most cases it should not exceed 200K (if your site is a blog, up to 500K is acceptable in my opinion).
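If you want to double-check the first three values outside the browser, you can also measure them with a small script. Here is a rough sketch using PHP’s cURL extension (the URL is a placeholder; note that, unlike lori, this only measures the HTML document itself, not the embedded images and scripts):

<?php
// Measure time to first byte, total time and bytes transferred
// for a single request to the page.
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);

printf("Time to first byte: %.3f s\n", $info['starttransfer_time']);
printf("Total time:         %.3f s\n", $info['total_time']);
printf("Bytes transferred:  %d\n",     $info['size_download']);
?>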
Number Of HTTP Requests
The fourth value shows you how many separate HTTP requests to a server were required. Your goal should be to minimize this number. At the time of writing, 37 HTTP requests were required on this page (and 5 requests on a reload, which is what the screenshot shows), which is quite a lot. Every HTTP request (especially one that goes to a different server) delays your page being displayed.
There is no rule of thumb regarding this value – it depends on the nature and contents of your site but it should always be as low as possible.
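To give you an idea of what reducing this number can look like in practice: instead of linking several separate stylesheets, you can serve them through a single URL. The following PHP sketch simply concatenates them (the file names are placeholders, and a real setup should also send proper caching headers and minify the files):

<?php
// combined.css.php – serves several stylesheets in one HTTP request
// instead of one request per file.
header('Content-Type: text/css');

$files = array('reset.css', 'layout.css', 'typography.css');
foreach ($files as $file) {
    readfile(dirname(__FILE__) . '/css/' . $file);
    echo "\n";
}
?>

In your templates you would then reference combined.css.php once instead of using several separate link tags.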
2. Optimizing The Time To First Byte
If the time to first byte is quite high, you need to optimize your server-side code. As there is no general rule for all the different backends, applications and content management systems, I can only give general recommendations here. I will, however, also show you how to decrease this value if you are using WordPress on your own server (not a hosted blog at WordPress.com).
If the page you are trying to optimize is a completely static page (i.e. an actual HTML file) and the value is still relatively high, I would interpret this as your server being too slow or under high load. You should check this with your webhost, as a static page should be delivered by a webserver in a very short amount of time.
If your page is a dynamic page developed in PHP, Perl, Ruby-On-Rails, JSP or any other web development language or system there may be much to optimize depending on the nature of your page.
Do you have a lot of content on your website that is retrieved dynamically?
You may, for example, display the latest comments from your forum or blog, or the latest images or postings on your page. If you are not yet caching this content (or the whole page) on your server, ask yourself the following question:
What is the access-to-update ratio, i.e. how often is the specific page accessed by someone and how often do you change or update the contents?
If many people access your page but you only update it hourly, or even daily or weekly, you should implement caching on your server. You could, for example, simply refresh the contents of the page from the database every X minutes, or you could use what I call intelligent caching, where a cache file is only rebuilt when the content has actually changed. This is what I use on most of my own sites: when a new user joins, the cache file containing the HTML code for the “new users” snippet is deleted. Whenever someone then accesses the page, the PHP script on the server checks whether the cache file exists and, if so, loads it. If it does not exist, it is created just-in-time, cached, and used for as long as it exists.
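To make the idea a bit more concrete, here is a minimal sketch of that pattern in PHP (the cache file path and the build_new_users_html() function are placeholders, not actual code from my sites):

<?php
$cacheFile = '/path/to/cache/new-users.html';

if (is_readable($cacheFile)) {
    // Cache hit: serve the stored HTML, no database query needed.
    echo file_get_contents($cacheFile);
} else {
    // Cache miss: build the snippet just-in-time and store it.
    $html = build_new_users_html(); // runs the expensive database queries
    file_put_contents($cacheFile, $html);
    echo $html;
}

// When a new user joins, the cache file is simply deleted
// so that the next request regenerates it:
// unlink($cacheFile);
?>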
How you implement caching in practice, however, depends heavily on the language your code is written in and on your framework or application, so regretfully I cannot go into more detail here – there is no one-size-fits-all solution.
I am using WordPress on this blog as well (as you might already have guessed), so I can give you some specific advice on how to optimize your pages.
I have seen many WordPress plugins that have not been developed with lots of traffic in mind. Some retrieve data from an external source (like your Twitter feed) on every single page request, which not only increases the load time but also means that your blog may become unavailable if Twitter (or whatever other source the plugin uses) is down. It also generates a huge amount of traffic to those external sources. You should check whether the plugins you are using request dynamic content from external sources on every single page load.
Remember: whenever such a page is accessed, the data is retrieved from the external source before the page is displayed to the user (because the request happens on the server). If this content were cached on the server for a limited amount of time, the problem would be solved (for the most part).
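If you write (or adjust) such a plugin yourself, WordPress offers the Transients API for exactly this kind of time-limited caching. A rough sketch (the feed URL and the function name are placeholders):

<?php
// Fetch the external feed at most once every 15 minutes instead of
// on every single page view.
function get_cached_twitter_feed() {
    $feed = get_transient('my_twitter_feed');

    if (false === $feed) {
        $response = wp_remote_get('http://example.com/twitter-feed');
        $feed = is_wp_error($response) ? '' : wp_remote_retrieve_body($response);
        set_transient('my_twitter_feed', $feed, 15 * 60); // 15 minutes
    }

    return $feed;
}
?>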
There is actually a great caching plugin available for WordPress called WP Super Cache, which can be downloaded or installed directly from within WordPress. It decreases the load time of your WordPress pages dramatically: with a default installation, WordPress pulls all article data from the database on every request, so the database quickly becomes your bottleneck if your site gets a lot of traffic. WP Super Cache not only caches the pages, it is also clever enough to regenerate the cache files automatically when you update your posts.
More info on WP Super Cache is available on this page as well.
This concludes the first part of the series. Continue to read the second part of this series.