Is a crawl-delay rule ignored by Googlebot?

Today’s question comes from Noida in India. Amit asks: “Is the crawl-delay rule ignored by Googlebot? I’m getting a warning message in Search Console.” Search Console has a great tool for testing robots.txt files, which is where this warning shows up. What does it mean, and what do you need to do?

The crawl-delay directive for robots.txt files was introduced by other search engines in the early days. The idea was that webmasters could specify how many seconds a crawler would wait between requests, to help limit the load on a web server. That’s not a bad idea overall.
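For instance, a robots.txt file using this directive might look like the following sketch, which asks crawlers that honor the rule to wait ten seconds between requests (the ten-second value is just an example, and Googlebot does not support the rule):

```
User-agent: *
Crawl-delay: 10
```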
However, it turns out that servers are really quite dynamic, and sticking to a single fixed period between requests doesn’t really make sense. The value given there is a number of seconds between requests, which is not that useful now that most servers are able to handle so much more traffic per second.

Instead of the crawl-delay directive, we decided to automatically adjust our crawling based on how your server reacts. If we see server errors, or we see that the server is getting slower, we’ll back off on our crawling.
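To make the idea concrete, a server-responsive crawler can work roughly along these lines. This is a minimal Python sketch of the general technique, not Googlebot’s actual implementation; the function name, thresholds, and backoff factors are all illustrative assumptions:

```python
import time

import requests


def crawl(urls, delay=1.0, min_delay=0.5, max_delay=60.0, slow_threshold=5.0):
    """Fetch each URL in turn, adapting the pause between requests
    to how the server responds (illustrative sketch only)."""
    for url in urls:
        start = time.monotonic()
        response = requests.get(url, timeout=30)
        elapsed = time.monotonic() - start

        if response.status_code >= 500 or elapsed > slow_threshold:
            # Server errors or slow responses: back off exponentially.
            delay = min(delay * 2, max_delay)
        else:
            # Healthy, fast responses: gradually speed back up.
            delay = max(delay * 0.9, min_delay)

        yield url, response.status_code
        time.sleep(delay)
```

The point of this pattern is that the pause adapts continuously to what the server can actually handle, rather than being pinned to one fixed number of seconds.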
Additionally, Search Console gives site owners a way to send us feedback on our crawling directly, so they can let us know about their preferred changes in crawling. With that in mind, if we see this directive in your robots.txt file, we’ll try to let you know that it’s something we don’t support.
Of course, if there are parts of your website that you don’t want to have crawled at all, letting us know about that in the robots.txt file is fine.
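As a sketch, blocking a section of a site from crawling entirely looks like this, where /private/ is just a placeholder path:

```
User-agent: *
Disallow: /private/
```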
Check out the documentation in the links below.

We love sharing these videos with you, so please like and comment if you’re enjoying them, and don’t forget you can subscribe to our channel for the latest video updates.
