
It’s Official: Google now counts Site Speed as a Ranking Factor


mike_stemberg


There is no real question from me here, just a bit of info for those who are interested in SEO rankings and the like.

First, a bit of analysis about it: It's Official: Google Now Counts Site Speed As A Ranking Factor (http://searchengineland.com/google-now-counts-site-speed-as-ranking-factor-39708)

And here is the official post from the folks at Google: Using site speed in web search ranking (http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html)

Makes one think again about some of those bulky, music-wielding, heavy Flash-based photo-gallery sites, no matter how attractive they might seem. Relevance is still important, but everything else now takes a slight back seat to this new factor.


Though I have no personal interest in one at the moment, the new iPad may do wonders for this. All of those hip, young, disposable-income-having brides and grooms will be sitting in the coffee shop with their iPads ... not being able to see the slow, Flash-powered websites of many wedding photographers. It matters, folks. It always has, and even more so now.

If you run Google's site speed tool, you will see that Flash doesn't appear to impact the site speed rating at all. Of course, discovering that requires doing some research, which isn't how most internet users do things, but it appears the tool looks at how fast the page itself loads, not how fast the content inside page elements loads. This page ranks as slower on Google's tool than numerous Flash sites I tried.
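To make the "page itself vs. content inside it" distinction concrete, here is a minimal sketch, not Google's tool; the function name, example URL, and timeout are made up for illustration. It times only the download of the raw HTML document, so Flash movies or other plugin content that the page embeds never contribute to the number, which matches the behavior described above.

    # Times only the raw HTML fetch; embedded objects (Flash, images, scripts)
    # referenced by the page are never downloaded, so they add nothing here.
    import time
    import urllib.request

    def html_fetch_seconds(url, timeout=10):
        """Seconds needed to download just the HTML, ignoring embedded objects."""
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=timeout) as response:
            response.read()  # reads the HTML document only; no sub-resources
        return time.monotonic() - start

    if __name__ == "__main__":
        print(html_fetch_seconds("http://example.com/"))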

Here's something else. You can completely game the system by treating requests from Googlebot differently than requests from anywhere else. Google itself says this on a page about measuring site speed:

    "if the servers return slightly modified content for Googlebot than they would for normal users, this may affect what is shown here."

What that means is that you can score high on site speed by returning a blank page when the HTTP request comes from Googlebot, while serving the usual page to everyone else. It won't take long for every SEO specialist on earth to figure this out.
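Purely for illustration, a minimal sketch of the trick being described, assuming a bare-bones standalone server; the page contents, the port, and the naive "Googlebot" substring check are placeholders, not anything a real site or Google prescribes.

    # Inspect the User-Agent header and serve a near-empty page to anything
    # identifying itself as Googlebot, while everyone else gets the real page.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    REAL_PAGE = b"<html><body><!-- heavy Flash gallery, scripts, images --></body></html>"
    BLANK_PAGE = b"<html><body></body></html>"

    class CloakingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            user_agent = self.headers.get("User-Agent", "")
            body = BLANK_PAGE if "Googlebot" in user_agent else REAL_PAGE
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()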


Indeed. Anything the SEO tweakers can think of, the Google crawler programmers can think of. Google also crawls random pages using user agents that mimic Internet Explorer, Firefox, Chrome, Opera, phones, etc. They are able to detect when people serve them pages meant to please them which aren't the same pages normal visitors get. Sometimes there are legitimate reasons for that, and sometimes it's just sleazy. They're pretty good at knowing the difference.

It's actually very difficult for Google to catch the blank-page trick. You can serve up tiny blank images and Google doesn't know. Alternatively, you can serve the Googlebot a page filled with text that loads quickly, and Google can't discount it. It's not that hard to outwit the Googlebot: it's a program, and someone has to change it for it to behave differently. It's not as though a thousand people are sitting there watching the pages as the bot goes through them. Google hasn't been able to detect when a different page is served to the bot, and it can't unless it changes its IP addresses, and people figure those out very quickly. Human ingenuity can outwit automation.

What I'm referring to, Jeff, is that while the Googlebot announces itself as the Googlebot when it requests a page (and you can, indeed, use server-side scripting to look at that request and serve back something tailored to make Google happy with you), Google also runs automated processes from different platforms, connecting from different IP addresses, which use simulated (embedded) standard browsers to fetch the same pages on occasion.

Unlike the normal Googlebot crawler, those hits are indistinguishable from normal human visitor traffic. The bot-operated browser (pretending to be a Mac running Safari one day, Opera on a smartphone the next visit, and IE7 on XP later) will, unlike the Googlebot, actually execute JavaScript, allow DHTML to fly objects around, and so on ... and Google's index does note when those pages take substantially longer to fetch and load than the bot-optimized pages the SEO people work up for the normal crawler.

Unless you know which IP addresses to watch for (and they rotate them, on different carriers), you, as a webmaster, have no way to distinguish those automated sim-browser hits from normal human traffic. No dummies, the Google guys. And if their automated comparison between the normal crawler output and the simmed browsers detects a too-aggressive attempt to appear fast to the bot while loading up all sorts of visit-killing crap for normal visitors, they will definitely spank your page ranking.
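As a rough, text-level illustration of that comparison idea (nothing like Google's actual pipeline, which as described above uses full simulated browsers), one could fetch the same URL under a Googlebot identity and under a browser identity and flag pages that look dramatically lighter for the bot. The user-agent strings and the 2x size threshold below are arbitrary placeholders for this sketch.

    # Compare the HTML a site serves to a "Googlebot" request versus a
    # "browser" request; a page that is far smaller for the bot is suspicious.
    import urllib.request

    GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    BROWSER_UA = "Mozilla/5.0 (Windows NT 5.1; rv:2.0) Gecko/20100101 Firefox/4.0"

    def page_size(url, user_agent):
        """Bytes of HTML returned when the request claims to be `user_agent`."""
        request = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(request, timeout=10) as response:
            return len(response.read())

    def looks_cloaked(url, ratio=2.0):
        """Flag pages dramatically lighter for the bot than for a browser."""
        bot_bytes = page_size(url, GOOGLEBOT_UA)
        browser_bytes = page_size(url, BROWSER_UA)
        return browser_bytes > max(bot_bytes, 1) * ratio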

People have been able to beat the Googlebot.

BTW, lots of sites change depending on IP, including Google's own. Google can't really determine which changes are done to accommodate different kinds of usage and which are done to spoof the Googlebot. Also, Google can't really test PHP scripts, only their output.

It's easy to see how weak Google's indexing really is when parked pages show up at the top of search results so often. It's obvious they aren't really checking anything most of the time, since parked pages have almost no content and normally wouldn't get any traffic if Google weren't being spoofed by the owners of parked domains.


Hey, I didn't say I liked how they index things, and they are definitely losing out to some badly transparent (to me, anyway) junk-site search-spamming tactics. I know they have to choose their battles, what with untold billions of pages to index ... but sometimes their laser-like focus in some areas seems to come at the expense of some really BS rankings. Obviously.

And I know that sites change output based on IP, because I handle the hosting and a lot of behind-the-scenes work on some very, very visible websites (some national brand names you'd know, and some high-profile non-profits). My point is that there's dynamically rendering content for all the right reasons, and there's doing it while trying to be slippery to boost your Google rank. I'd rather not take the chance of being suppressed just to game their system by a hair's worth; it's too risky. Legitimate changes to page rendering based on agent, IP address, time of day, cookies, and any number of other factors: that's the way things are done. But serving blank pages to "fool" Google into thinking your site is fast? Not a good strategy. It will come back to bite you very quickly.
