Concerns About Page Speed and Tips for Improving It
About a month ago, WebProNews interviewed Google’s Matt Cutts, who suggested that page speed may soon become a ranking factor in the world’s most popular search engine. Speed has been a consistent theme with the company over the past year or so, through the release of various tools and a series of announcements. It has become quite evident that Google places a great deal of importance on speeding up the web. With that in mind, it’s not hard to see why Cutts’ suggestion could soon become a reality. Google has always maintained that it is trying to deliver the best user experience, and delivering results that load quickly should give users just that.
While many webmasters welcome the notion of speed as a ranking factor, plenty of others take issue with it for a variety of reasons. We’ve had some interesting comments from readers on the subject. Here are some of them:
So, we all have to pay for the most expensive hosting now or we won’t get found in search engines. I won’t be able to host on my own servers at work now. It went from paying for backlinks with huge advertising corporations to get sites’ PageRank up; now we have to go with even bigger corporations that can afford a massive pipe connecting to the Internet. I don’t think Google means to, but they are squeezing the poor people of the world out of search results and glorifying huge corporations – be careful, Google!
Page speed is going to be a big political issue. Apart from concerns about net neutrality, what about countries whose internet infrastructure is vastly inferior to that of the technology-rich countries? Regions like Southeast Asia and central China have much better connections than East Africa. Even some parts of Scotland have poor internet links based on the ageing BT networks. Also, people who can afford dedicated servers and high-quality bandwidth have a big advantage over the common Joe who has to rely on shared hosting. Does this make Google less democratic? Or are they just following what they think people want, i.e. faster-loading sites?
What do you think will happen to the sites that are mainly using rich media like video blogs? Can they really accelerate their load time? If not, are they doomed to drop from the SERP?
The speed thing concerns me. Next to a tiered internet, it’s the biggest slam against the small-time net player. Corporations will take over fast and knock out anyone who can’t afford a lightning-fast server.
Regardless of how you feel about the possibility of Google using page speed as a ranking factor, it’s probably going to happen, and it’s something you’re more than likely going to have to deal with. Beyond its potential role in regular organic results, consider Google’s recently introduced real-time results: the quicker Google can crawl you, the quicker you can potentially appear in this section.
As far as speeding up your site in general, Bill Hartzer recently shared a few tips on the subject in an interview with WebProNews.
And of course, Google has its own tips. The company offered a few on improving site performance using its Webmaster Tools. Webmaster Tools has a Site Performance feature, which shows you a performance overview graph. This looks at the aggregated speed numbers for your site, based on the pages that were most frequently accessed by visitors who use the Google Toolbar and have the PageRank feature activated.
“By using data from Google Toolbar users, you don’t have to worry about us testing your site from a location that your users do not use,” explains John Mueller, Webmaster Trends Analyst, Google Zürich. “For example, if your site is in Germany and all your users are in Germany, the chart will reflect the load time as seen in Germany. Similarly, if your users mostly use dial-up connections (or high-speed broadband), that would be reflected in these numbers as well. If only a few visitors of your site use the Google Toolbar, we may not be able to show this data in Webmaster Tools.”
There is also a section that shows you some examples of pages and the average, aggregated load times that users observed while they were on your site. “These numbers may differ from what you see as they can come from a variety of different browsers, internet connections and locations. This list can help you to recognize pages which take longer than average to load: pages that slow your users down,” says Mueller. “As the page load times are based on actual accesses made by your users, it’s possible that it includes pages which are disallowed from crawling. While Googlebot will not be able to crawl disallowed pages, they may be a significant part of your site’s user experience.”
Google recommends watching the load times over a period of time to see what’s stable, because you may see occasional spikes. If you consistently see high load times, that is probably representative of what most users experience.
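To illustrate the idea of separating stable load times from one-off spikes, here is a minimal sketch using made-up daily figures (the numbers and the 2× threshold are assumptions for illustration, not anything Google publishes):

```python
from statistics import mean, median

# Hypothetical daily average load times (seconds) for one page over two weeks.
daily_load_times = [2.1, 2.3, 2.0, 9.8, 2.2, 2.4, 2.1, 2.0, 2.3, 2.2, 8.9, 2.1, 2.2, 2.3]

# The median resists one-off spikes, so it approximates the "stable" load
# time most visitors see; the mean gets dragged upward by outliers.
typical = median(daily_load_times)
average = mean(daily_load_times)

# Flag days far above the typical value as spikes rather than a real trend.
spikes = [t for t in daily_load_times if t > 2 * typical]

print(f"median: {typical:.1f}s, mean: {average:.1f}s, spikes: {spikes}")
```

If the median itself were high rather than just a few days, that would be the consistent slowness worth fixing.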
There is also a Page Speed Suggestions section, which shows some example pages from your site along with suggestions on how to optimize those specific pages. The suggestions are based on Google’s Page Speed Firefox/Firebug plug-in.
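One of the most common suggestions from tools like Page Speed is to enable compression for text resources. As a rough demonstration of why, here is a small sketch (the HTML snippet is invented for illustration) showing how much a text-heavy page shrinks under gzip:

```python
import gzip

# A hypothetical chunk of repetitive HTML, standing in for a real page.
html = ("<div class='item'><span>Example product listing</span></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)

# Markup is highly repetitive, so it compresses dramatically; that is why
# enabling gzip on the server is such a frequent recommendation.
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```

Fewer bytes on the wire translates directly into faster load times, especially for users on slow connections.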
Google gives more information on each of these features here.
Sites aren’t the only area where Google is emphasizing speed. Last week, Google launched a new extension for Chrome that lets developers identify performance problems with their web apps too. The tool is called Speed Tracer, and it uses a “sluggishness graph” combined with other metrics to help users pinpoint the problems that are slowing their web apps down. You can read more on that here.