Should I disallow Googlebot from crawling slower pages?

Today, we have a question from London. Tommo wants to know: "You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user's request, giving a slow page load time. Should we disallow Googlebot from indexing these pages to improve our overall site speed?"

Really interesting question. In general, I would let Googlebot crawl the same pages that users see. The rule of thumb is this: only something like 1 out of 100 searches is affected by our page speed mechanism, which says that things that are too slow rank lower. And if it's 1 out of 100 searches, that's roughly 1 out of 1,000 websites. So if you really think you might be in that 1 out of 1,000, that you're among the slowest, then maybe that's something to consider. But in general, most of the time, as long as your pages aren't timing out and aren't starting to get flaky, you should be in relatively good shape.

You might, however, think about the user experience. If users have to wait 8, 9, 10, 20 seconds to get a page back, a lot of people won't stick around that long. So a lot of sites do things like serve cached results, compute the fresh results on the fly later, and fold the new results back in. But if it's at all possible to pre-compute the results, cache them, or find some other way to speed things up, that's great for users.
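
To make that caching idea concrete, here is a minimal sketch in Python. The run_complex_query function, the cache key, and the 10-minute freshness window are all assumptions for illustration, not anything from the video; the point is simply that the expensive query runs at most once per interval, and every later request, from users or Googlebot alike, gets the pre-computed result quickly.

    import time

    CACHE_TTL_SECONDS = 600   # assumed 10-minute freshness window
    _cache = {}               # maps a query key to (timestamp, result)

    def run_complex_query(key):
        # Stand-in for the slow, complex database query described above.
        time.sleep(5)
        return "result for %s" % key

    def get_result(key):
        # Serve the cached result while it is still fresh; otherwise
        # recompute it and fold the new result back into the cache.
        now = time.time()
        entry = _cache.get(key)
        if entry is not None and now - entry[0] < CACHE_TTL_SECONDS:
            return entry[1]
        result = run_complex_query(key)
        _cache[key] = (now, result)
        return result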
Typically, as long as only a small number of pages are very slow, or the site overall is fast, it's not the kind of thing you need to worry about. You might still want to pay attention to making those pages faster just for the user experience. But in short, I wouldn't necessarily block those slower pages from Googlebot unless you're worried that you're in that 1 out of 1,000, where you're really, really the outlier in terms of not being the fastest possible site.
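
If you do conclude that you are in that 1-in-1,000 outlier case and want to keep Googlebot away from the slow pages, the usual mechanism is a robots.txt disallow rule rather than anything index-related. The paths below are hypothetical placeholders; you would substitute whatever directory actually serves the slow, query-heavy pages. This blocks crawling of those URLs only, and the rest of the site remains crawlable.

    User-agent: Googlebot
    Disallow: /slow-reports/
    Disallow: /live-search/

As the video stresses, though, most sites never need this; speeding the pages up for users is the better investment.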

6 Comments

  • Dave

    Hey Matt,

    What exactly is "slow" from Google's perspective? 1s? 5s? At what point might one worry?

    From a user-experience perspective we're reasonably happy, since only very specific queries are slow and they contain real-time data that can't be precomputed. But we're considering keeping Googlebot away from that output entirely, in case one extremely slow script hurts the ranking of the entire site.

  • Wouter Blom

    @hireahitCA Did you watch the video? Matt does answer your question. From a user perspective you can test it in Google Analytics, and the threshold is about 3.5 seconds for the US and Europe.
    If you are 400 milliseconds slower than your competitor, the user is already prone to leave your website.

  • Manuel Cheta

    Matt mentioned "not timing out". Anyways, you might want to keep it under 3-4s. More than that and I won't come back to your site.

  • Spook SEO

    A good rule of thumb is to think about the user experience. If it's at the point where users find the wait long enough to be annoying, then you've got to do something about it. If not, then you should be OK.
