Last week I was watching State of the Index by Matt Cutts, and he mentioned that in the future Google may rank sites based on how quickly their pages load. So I decided to test my site.
I ran my site through Web Page Test and got some astonishing results: I failed on almost every count! Not having a lot of time, there were a few suggestions I was able to take and implement easily. The first was to enable gzip. I had always thought I would have to install some fancy script to do this, so I found excuses not to. Now I know that is not the case. It can be enabled quite easily if you have access to your php.ini file, or if you just want to include a nice little snippet at the top of your page!
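By "snippet" I mean something like PHP's built-in `ob_gzhandler` output callback — a one-liner sketch, assuming your PHP build has the zlib extension and `zlib.output_compression` is not already on:

```php
<?php
// Buffer all subsequent output and gzip it if the browser's
// Accept-Encoding header says it can handle compression.
// Requires the zlib extension; a no-op benefit-wise if
// zlib.output_compression is already enabled in php.ini.
ob_start('ob_gzhandler');
?>
```

Alternatively, with access to php.ini you can set `zlib.output_compression = On` and skip the snippet entirely — just don't do both at once.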
I was also told that I had too many CSS and JS files included, and it suggested combining them. Understandable, but how can one combine the core jQuery file with their main script file and still keep things maintainable? I compromised and combined only the JS files that I wrote myself.
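That compromise can be a tiny build step rather than anything fancy — here is a sketch (the filenames are made up) that concatenates your own scripts into one file while leaving jQuery as its own cacheable include:

```php
<?php
// combine.php — run once per deploy, not on every request.
// Only the scripts we maintain ourselves get merged; jQuery
// stays separate so its long browser-cache life is preserved.
$scripts = ['nav.js', 'gallery.js', 'contact.js']; // hypothetical names
$out = '';
foreach ($scripts as $file) {
    // The trailing ";\n" guards against a file that omits
    // its final semicolon breaking the file that follows it.
    $out .= file_get_contents($file) . ";\n";
}
file_put_contents('site.combined.js', $out);
?>
```

Then the pages reference one `site.combined.js` instead of three script tags, cutting two HTTP requests.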
I am not sure whether it is because of the simple changes I made or because I have been writing a little more frequently, but my ranking has improved when searching on my name. I usually do a search for ‘Levi Jackson’ to see where I pop up, and currently I am number 3 when I had been 7 or lower for the longest time. Regardless, it has made me more interested in optimizing my web pages than I have ever been before.
The current stats
I still fail many of the tests! What I plan on working on next is figuring out a method of combining all of my JS files, or perhaps getting rid of some of them. At the same time I am going to work on a method of caching content, so that a call to the db is only required when a file has changed since the last visit.
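One common shape for that caching idea (sketched here with hypothetical paths and a hypothetical `render_page()` helper standing in for the db work) is to compare modification times and serve the cached copy unless the source is newer:

```php
<?php
// Hypothetical layout: $source is the content file, $cache is
// its rendered HTML. We rebuild — and hit the db — only when
// the source has changed since the cache was last written.
$source = 'content/about.txt'; // assumption: file-backed content
$cache  = 'cache/about.html';

if (is_file($cache) && filemtime($cache) >= filemtime($source)) {
    // Cache is still fresh: stream it out, no db call needed.
    readfile($cache);
} else {
    $html = render_page($source); // hypothetical: queries the db
    file_put_contents($cache, $html);
    echo $html;
}
?>
```

The same mtime comparison generalizes: any edit to the source file bumps its timestamp past the cache's, which invalidates it on the next visit.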
If page speed isn’t factored into the ranking yet… it will be. It is better to fix it now, while it doesn’t play a role, than to find your site dropping in rank and have to rig something together to fix it then.