Google couldn’t crawl your site because we were unable to access your site’s robots.txt file

How many of you are getting this warning:

Google couldn’t crawl your site because we were unable to access your site’s robots.txt file

But our robots.txt IS accessible, and when we check the access logs, every request from Googlebot has returned a 200. Other crawlers have no issues accessing the file or crawling the site. What are we supposed to do? This is becoming a major problem for us with no way to troubleshoot. We have not received a single complaint from customers about accessing the site. Everything is working, and analytics shows that traffic is okay.
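
For anyone who wants to double-check what their logs actually show, here is a rough sketch of the kind of check we run. It assumes a combined-format access log at a made-up path; the path and format are placeholders, so adjust them for your own server. It just prints every Googlebot request for /robots.txt with its status code, so anything other than a 200 stands out.

```python
# Rough sketch: scan an access log (combined log format assumed) for Googlebot
# requests to /robots.txt and flag anything that was not a 200.
# The log path and the exact format are assumptions; adjust for your own setup.
import re

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

# Combined log format:
# IP - - [time] "METHOD /path HTTP/x.x" status size "referrer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.match(line)
        if not m:
            continue
        if "Googlebot" in m["agent"] and m["path"].startswith("/robots.txt"):
            status = m["status"]
            marker = "" if status == "200" else "  <-- not a 200"
            print(f'{m["time"]}  {status}  {m["agent"]}{marker}')
```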

I get the warnings too. I dug and dug. What I came up with is that if Googlebot is crawling the site and, for whatever reason, the server switches over to a backup server for maintenance or to share the load, you will get the message. Yeah, I know it sounds stupid, but after 4 months of trying to figure out why I was getting that message, and testing, I found the warning stopped if I left the site on one server and did not fail it over to the backup. As soon as I let it switch over to the other server for maintenance mode, I got the message again. So now I just ignore the warnings. I look at analytics and make sure my traffic is not taking a nosedive.
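
If you want something more direct than watching analytics, one option is a small poller that requests robots.txt with a Googlebot-style User-Agent and logs any response that isn't a 200. That should catch the short windows when the backup or maintenance server is the one answering. This is only a sketch; the URL, the interval, and the User-Agent string are placeholders, not what Google actually sends from its own IPs.

```python
# Rough sketch: poll robots.txt once a minute with a Googlebot-style User-Agent
# and report anything that isn't a 200, to catch brief failover/maintenance
# windows. The URL and interval are assumptions; change them for your site.
import time
import urllib.request
import urllib.error

ROBOTS_URL = "https://www.example.com/robots.txt"  # hypothetical URL
CHECK_INTERVAL = 60  # seconds between checks

HEADERS = {
    # Googlebot-style UA string for testing only; not a real Googlebot request
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}

def check_once() -> None:
    req = urllib.request.Request(ROBOTS_URL, headers=HEADERS)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            status = resp.status
    except urllib.error.HTTPError as exc:          # 4xx/5xx responses
        status = exc.code
    except (urllib.error.URLError, OSError) as exc:  # DNS/connection/timeout
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  FAILED: {exc}")
        return
    note = "OK" if status == 200 else "NOT a 200 -- likely what Google hit"
    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {status}  {note}")

if __name__ == "__main__":
    while True:
        check_once()
        time.sleep(CHECK_INTERVAL)
```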

So while we should be concerned about this, I have given up and I am ignoring it. If my analytics show that I am still getting organic traffic, then I am okay and I am not going to pay attention to this message.
