Part 1 – Defining the problem
Is optimising a website the same as decorating a rundown old house?
Obviously not, but there are more similarities than you might think, and I’m not just talking about the obvious pun on time to first paint!
When I bought my first house I stretched myself financially to cover the mortgage repayments, leaving me with very little money to spend on anything else, and a lot of time at home because I couldn’t afford to go out. That was great because the house I bought was a wreck, so the extra time at home allowed me to embark on decorating the place myself.
When I moved into my latest home I opted to get professionals in to do any work that needed doing. Why? I no longer had any time (two children) and I had learned from my previous experience that DIY wasn’t always as cheap as you’d think. I couldn’t even walk into a hardware shop without spending £100. To paint a room I needed brushes, dust sheets, white spirit and masking tape before I’d even paid for the paint.
So when tasked with the job of optimising your website, what approach will you take? The “cheap” but time consuming DIY option, or the professional automated version? Over the last couple of years I have seen many companies provided with advice on how to optimise their site, yet very few have actually done so. Is it because they took the DIY option and are still working through it, or because they simply chose not to, having neither the time nor the money?
Five of the most common issues with sites today are highlighted below.
1. Too many large uncompressed images
Looking at the HTTP Archive, images account for 65% of an average page’s weight, roughly 1,120 kB, so this is an obvious place to start. That figure has also grown by 27% in the last year alone. To be fair to website owners, it doesn’t include what was loaded after the onLoad event, nor does it break down what had already been optimised, but it does highlight how image heavy today’s sites are.
Making images as small as possible is even more important with the growth in mobile usage, so it is vital to have some strategy in place. There are lots of free tools available, such as jpegmini.com and smush.it, but if you take an e-commerce site selling 20,000 products, each with five images, then running 100,000 images through these tools by hand is not feasible. So the DIY approach is going to need some software installed somewhere.
You could of course change your business process or educate the people loading the content, but suddenly this simple fix that someone told you about is no longer as straightforward as you’d hoped. To add to the complexity, a single image may need three different versions: one for desktop, one for mobile and one for tablet. Then there is the overhead of selecting the right version of the image. If you decide to change the HTML code to pick the appropriate image to serve, your developers will need to spend precious time on this rather than concentrating on new innovations.
If you don’t have different versions of images, then for mobile experiences you will likely be hiding or shrinking them. In our decorating analogy, hiding is the equivalent of buying a painting for the wall and hanging it in your cupboard. If you take the shrinking option, it is the equivalent of trying to squeeze the Sistine Chapel ceiling onto your living room ceiling.
The final problem is that even if you optimise the images today, will there be new ones added tomorrow? Suddenly checking that all images, and their device specific versions, are optimised becomes yet another task distracting you from your day job. There are of course solutions to help, and it seems like an obvious job for an automated tool. Image servers are one solution, but there are obvious cost implications, and you would need to make sure the solution was scalable and secure; the last thing you want is to introduce a bottleneck or vulnerability into the delivery of your site.
Finally there are the professionals: automated image compression and resizing on the fly in a highly scalable environment with great caching. When you add up the time it would take you, and the headaches caused, it might be the best money you’ve ever spent.
2. Too many objects
Reducing the number of objects reduces the number of round trips to the server and will almost always bring a performance benefit: another simple rule that someone told you.
Let’s combine some files and start with CSS. Copying and pasting into a single file is easy, but does it end there?
You now have one massive CSS file, which first-time users need to download on the first page they visit; if it takes too long to load and process, they might not stick around long enough to see any benefit on the next few pages. This new file now holds the CSS for the whole site, so every time you change it you should regression test the whole site. In addition, only one developer can check it out at any given time, so development will slow down. You could work through your site and only combine the appropriate files for each page type, but then you have code duplicated across multiple files, and that duplicated code is downloaded again for each page.
Suddenly the cheap DIY option doesn’t seem so good!
As with images, you could change your processes: develop with multiple files and have a build step produce the combined package. This will help, but debugging code may now become a little harder. There is also a cost attached, either money to buy a tool to do this or time spent installing something and changing your process.
Exactly the same applies to JavaScript, with the additional complexities of when the code runs and any dependencies within it. You also don’t want to load pages down with unused JavaScript that blocks other resources and delays the loading of the page.
Now DIY really doesn’t seem too good!
3. Poor ordering of objects
Put your CSS at the top and your JavaScript at the bottom, so the theory goes. It sounds simple enough until you realise you have some JavaScript that makes the menu work, or inline calls that need their definitions present before they are parsed. But this rule is really important to get right: having things in the right order means the page renders sooner and your users start to feel engaged with it. In some instances you can even make your site feel faster than it actually is by getting some content in front of users straight away.
As with combining JavaScript, this may need to be carefully thought through for each major page type of your site, so the quick fix has gone out of the window again. Testing the best set-up will also be time consuming.
3rd party content is often the cause of website slowdowns and, frustratingly, is out of your control. Load it too high up the page and it could effectively take down your site if the 3rd party slows or goes down; load it too low and you may not get the benefit it was providing in the first place. Where possible, 3rd party content should be loaded asynchronously to decouple its delivery from yours and reduce the risk to you; after all, your users won’t know it was not your content that made your site slow. Changing the way these 3rd parties load, and the order in which they do so, is yet another time consuming change to your site.
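The asynchronous-loading pattern itself is only a few lines. This is a minimal sketch of injecting a 3rd party script so that a slow or unavailable provider cannot block your own page from rendering; the document object is passed in as a parameter purely so the helper can also be exercised outside a browser, and the script URL is an example, not a real endpoint.

```javascript
// Sketch: inject a 3rd party script asynchronously so a slow provider
// cannot block your page from parsing and rendering.
// `doc` is injected for testability; in a page you would pass `document`.
function loadThirdPartyAsync(doc, src) {
  const script = doc.createElement('script');
  script.src = src;
  script.async = true;              // download without blocking HTML parsing
  doc.head.appendChild(script);     // appending starts the download
  return script;
}
```

In a browser you would call it as `loadThirdPartyAsync(document, 'https://example.com/analytics.js');` — the page carries on rendering whether or not the 3rd party ever responds.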
4. Poor caching of content
Static content on websites very rarely changes, and cache values should reflect this; even in the DIY world this shouldn’t take long. But what about the CSS and JavaScript that change with every release, especially now you are combining them into one file that will change every time? Someone will tell you to continue caching with a far-future expiry and to include a version number in the file name. This is sound advice, but it is also another change you will need to make to the deployment process. Again, developers will need to spend their precious time making sure new version numbers are reflected in each release. It is not the biggest impact of all the issues mentioned here, but the time still needs to be assigned, and I am sure you would like your developers focusing on bigger and better things!
5. Everything served from the same domain
Serve content from multiple domains. Simple, or is it? How many domains do you need? The more domains you use, the more hostnames you need to resolve and the more connections you need to open and maintain, so there will be a trade-off. What content will you serve from each domain? The aim is to use as much of the available bandwidth as possible, so when an object is called may determine which domain you want to serve it from. You will need to experiment to answer these questions, and that will take time.
Once you’ve made the decisions above, you are ready to go back and change the way you reference content. If you were using relative paths before, you will now need to include the domain in the links: more changes for your developers to make.
Are you thinking about SPDY or the imminent arrival of HTTP 2.0? Under these protocols domain sharding is widely considered to decrease performance, so any decision you make and implement today should also consider the future, or the developers are going to have to go back and revisit it. Some browsers will un-shard for you, but not all, so you will need to consider the implications moving forward.
Summary
To summarise: improving the performance of a website can sound like a simple task, but no matter how well built the site is, doing it yourself will require code changes. Often a development road map will not include performance, so any decision to address it comes at the cost of new features on the site. Worse still is when the development of your site is outsourced and any changes incur direct costs. Looking back at the DIY analogy, one of the biggest reasons I chose professionals the second time around was that decorating myself would have to be squeezed around my day job, which would naturally stretch out the time to complete. This is true of any development team that has its own focus and targets. Choosing a professional may cost a little more in initial outlay, but you will save a whole lot of time doing so, and the end result is likely to be much better.
With the exception of image compression, the solutions above all centre on finding time from a development team. There are unlikely to be any fixed costs outside of this, so the question will be how much you value their time. Image compression could, and probably should, have a direct cost, but how much you choose to spend on the solution will be determined by your budget. And don’t forget: with many studies showing how increased performance can improve conversions, this optimisation may soon pay for itself multiple times over.
Over the next couple of months I will spend time with website owners gathering real-world solutions to these problems, as well as quantifying the time involved in solving them. Hopefully sharing how they have solved these common problems can help you too, as well as quantify the true cost of the DIY approach to optimising a website.