
Google: we won’t index your site if we cannot access your robots.txt file

January 2, 2014 (updated January 9th, 2014) | SEO

In an online discussion, Google’s Eric Kuan said that Google won’t index a website if it cannot access the site’s robots.txt file:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file.

If this isn’t happening frequently, then it’s probably a one-off issue that you won’t need to worry about. If it’s happening frequently, or if you’re worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
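If you want to confirm for yourself that your robots.txt file is reachable, a quick check is to request it directly and look at the HTTP status code returned. The sketch below uses Python's standard library; the domain shown is a placeholder, so substitute your own site.

import urllib.request
import urllib.error

def check_robots_txt(domain: str) -> None:
    """Fetch a site's robots.txt and report the HTTP status it returns."""
    url = f"{domain.rstrip('/')}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            # A 200 response means crawlers can read the file normally.
            print(f"{url} returned HTTP {response.status}")
    except urllib.error.HTTPError as e:
        # The server responded, but with an error status (e.g. 403 or 503),
        # which is the kind of crawl error described above.
        print(f"{url} returned HTTP {e.code} -- crawlers may pause crawling")
    except urllib.error.URLError as e:
        # No response at all (DNS failure, timeout, connection refused).
        print(f"{url} could not be fetched: {e.reason}")

if __name__ == "__main__":
    check_robots_txt("https://www.example.com")  # placeholder domain

If the file repeatedly returns a server error or times out, that matches the situation Eric Kuan describes, and it is worth raising with your hosting provider.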

If you’re unsure about the technical details of your website, ask us to check your web pages or, better still, ask us to do a full SEO audit of the site. Call us today on 01273 328877.