
Google: we won’t index your site if we cannot access your robots.txt file

By SEO

In an online discussion, Google’s Eric Kuan said that Google won’t index a website if it cannot access the site’s robots.txt file:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file.

If this isn’t happening frequently, then it’s probably a one-off issue you won’t need to worry about. If it’s happening frequently, or if you’re worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
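One way to avoid the problem is to keep the file simple and reliably served. A minimal robots.txt that allows all crawlers might look like this (the sitemap URL is a placeholder):

```text
# Allow all crawlers to access the whole site
User-agent: *
Disallow:

# Optional: point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that a robots.txt which simply does not exist (a 404 response) is not a problem: Google treats that as permission to crawl everything. It is server errors and timeouts when fetching the file that cause crawling to stop.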

If you’re unsure about the technical details of your website, ask us to check your web pages or, better still, ask us to carry out a full SEO audit of the site. Call us today on 01273 328877.

SEO Case Study – Recovering from a Google Manual Action

By SEO

Situation:

Towards the end of 2013, we received notification via our Google Webmaster Tools account that one of our long-standing web development and traditional marketing clients had suffered a manual action by Google, which effectively delisted their website pages from Google’s index – in other words, their website was no longer appearing in the Google search results.

Task:

We ran a full links audit, and it quickly became apparent that the manual action was a direct result of the ‘Penguin’ updates, which look closely at the relevancy and quality of all links pointing to a website. Google introduced the first Penguin update in April 2012 to try to stop websites spamming their way to the top of its results. Penguin penalises websites with low-value, poor-quality links from non-relevant sites, especially sites that have bought links in the past or are suspected of using ‘black-hat’ SEO link-building techniques.

Our links audit revealed many links pointing to our client’s site that had clearly been fabricated purely to gain a search advantage. These were mainly historic links created between 2011 and 2012, when link building was all about link volumes.

The client had previously enjoyed high search engine rankings for almost all of their keyword phrases, and the manual action was having a real effect on new business enquiry levels. Our priority was to have the poor-quality links removed and the site re-listed by Google.

Action Taken:

Using professional-standard link identification software, we produced a definitive list of over 3,000 links pointing to our client’s website. We then analysed each of these links in terms of quality and relevancy and categorised every link into one of three segments:

a) Good Quality Links
b) Suspicious Links
c) Toxic or Bad links

We further analysed the suspicious links, erring on the side of caution. Once we had a definitive list of poor or bad links, we wrote to each website owner (where possible) and asked for the links to be removed. After 7 days we sent a follow-up letter or email repeating the request.
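The three-way triage above can be sketched in code. This is a simplified illustration only: the scoring heuristic, thresholds, and domain names below are all hypothetical, and a real audit relies on link-intelligence metrics from professional tools rather than two hand-picked signals.

```python
def categorise_link(domain_relevant: bool, quality_score: int) -> str:
    """Place a backlink into one of the three audit segments.

    quality_score is an assumed 0-100 metric (higher = better);
    the thresholds are illustrative, not a published standard.
    """
    if domain_relevant and quality_score >= 60:
        return "good"        # a) Good Quality Links
    if quality_score < 30:
        return "toxic"       # c) Toxic or Bad links
    return "suspicious"      # b) Suspicious Links - needs manual review

# Hypothetical audit data (placeholder domains)
backlinks = [
    {"url": "http://industry-news.example/article", "relevant": True,  "score": 75},
    {"url": "http://link-farm.example/page1",       "relevant": False, "score": 10},
    {"url": "http://directory.example/listing",     "relevant": False, "score": 45},
]

segments = {"good": [], "suspicious": [], "toxic": []}
for link in backlinks:
    segments[categorise_link(link["relevant"], link["score"])].append(link["url"])
```

The “suspicious” bucket deliberately catches everything in the middle, matching the manual review step described above.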

After a further 7 days had elapsed, we submitted a ‘disavow’ request to Google for all the poor links that had not been removed. We had to be able to show that we had made a genuine effort to have the links removed; otherwise our disavow request would fail.
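For reference, the disavow file Google accepts is a plain text file with one entry per line: a `domain:` prefix disavows every link from a whole domain, a bare URL disavows a single page, and lines beginning with `#` are comments. A short sketch with placeholder domains:

```text
# Outreach sent twice, no response - disavow the whole domain
domain:link-farm.example

# Single paid-link page on an otherwise acceptable site
http://directory.example/paid-links-page.html
```

Disavowing at the domain level is generally safer for clearly toxic sources, since the same site often links from many pages.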

Result:

After about 10 days Google advised that the manual penalty had been removed!

In December 2013 we started a new SEO programme for this client – they had used other SEO agencies before us – and after one month of white-hat, ethical SEO we saw them back in the top two pages of the Google results for 4 of their 6 primary keyword phrases.

If you have suffered a Google penalty, or would like to know more about our SEO services for Brighton businesses, please call us today on 01273 328877.