SEO – Effective Link Building Strategies for 2014

By SEO

The various ‘Penguin Updates’ introduced by Google since April 2012 have permanently changed SEO.

Link building involving the creation of spammy links does not work anymore and that’s a good thing. To succeed with your website in 2014, you have to do the right things in the right way.

1. Link building is still relevant in 2014

You’ve probably seen a few “link building is dead” articles on the Internet. People who say this have usually only experienced poor quality link building in the past. To some extent, however, they are correct:

  • Automatically creating backlinks in bulk does not work anymore.
  • Creating bogus social network profiles to get backlinks does not work anymore.
  • Spamming forums and article websites with fake or low quality content to get backlinks does not work anymore.
  • Automated link networks do not work anymore.

In short: spamming does not work anymore – but good quality link building does!  Google’s Matt Cutts made that quite clear in an interview last July:

“Links are still the best way that we’ve found to discover [how relevant or important somebody is], and maybe over time social or authorship or other types of markup will give us a lot more information about that.”

2. Link Relevance is now essential

“Things, not strings” is one of the most important concepts that Google introduced last year. The context (relevance) of a link has become even more important.

The links that point to your website should come from pages that are related to the topic of the linked page on your own website. Google has been saying this for years, but now they are penalising links that do not meet these criteria.

If all of the links that point to your website use exactly the same keyword phrase as anchor text, you can be fairly sure that this will trigger an ‘unnatural links’ warning in your Google Webmaster Tools account. Conversely, if the links to your website contain keywords that are related to the topic of your web page, Google will find your website relevant, and in some cases an authority for that topic.
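As an illustration of that anchor-text point, a simple diversity check can flag a backlink profile dominated by one exact phrase. This is only a minimal sketch: the `looks_unnatural` function, the 60% threshold, and the sample anchor texts are all hypothetical, not anything Google publishes.

```python
from collections import Counter

def anchor_text_profile(anchors):
    """Return each anchor phrase's share of the total backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {phrase: count / total for phrase, count in counts.items()}

def looks_unnatural(anchors, threshold=0.6):
    """Flag a profile where a single exact phrase dominates the links.

    The 60% threshold is an illustrative assumption, not a known
    Google cut-off.
    """
    profile = anchor_text_profile(anchors)
    return any(share >= threshold for share in profile.values())

# Hypothetical backlink anchor texts, for illustration only
spammy = ["cheap widgets"] * 9 + ["Acme Widgets Ltd"]
varied = ["cheap widgets", "Acme Widgets Ltd", "widget supplier",
          "www.acme-widgets.example", "buy widgets online"]

print(looks_unnatural(spammy))   # one phrase is 90% of the profile
print(looks_unnatural(varied))   # anchors are diverse
```

A varied profile, with branded, naked-URL, and topical anchors mixed in, is exactly what the paragraph above describes as looking natural to Google.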

3. Once again: spamming is risky

Some people still think that they can trick Google’s algorithm with the ‘brand new secret method that will get your site on Google’s first result page with just a few mouse clicks’. We get daily emails offering a quick solution to SEO. The fact is there is no quick solution! The spam tools used by these ‘black hat’ SEO merchants do not work anymore.

Google has consistently said that it does not like link schemes and you should avoid the following types of links:

  • Buying or selling links that pass PageRank.
  • Excessive link exchanges.
  • Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links.
  • Using automated programs or services to create links to your site.
  • Text advertisements that pass PageRank.
  • Advertorials or native advertising where payment is received for articles that include links that pass PageRank.
  • Links with optimised anchor text in articles or press releases distributed on other sites.
  • Low-quality (free) directory or bookmark site links.
  • Links embedded in widgets that are distributed across various sites.
  • Widely distributed links in the footers of various sites.
  • Forum comments with optimised links in the post or signature.

If you use these methods to build backlinks, you run a high risk that your website will be penalised by Google.

4. Make it personal

Links that require human intervention are the links that Google considers in the ranking algorithm. Quality is much more important than quantity.

A handful of high quality links are much better than hundreds of automatically created backlinks.

For more information on our professional SEO services please call us today on 01273 328877

Google: we won’t index your site if we cannot access your robots.txt file

By SEO

In an online discussion, Google’s Eric Kuan said that Google won’t index a website if they cannot access the robots.txt file of the site:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file.

If this isn’t happening frequently, then it’s probably a one off issue you won’t need to worry about. If it’s happening frequently or if you’re worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
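While the crawl error Google describes above is usually a hosting problem, it is worth verifying separately that your robots.txt rules say what you think they say. Python’s standard library includes `urllib.robotparser` for exactly this; the sketch below parses a hypothetical robots.txt from a string, so no network access is needed (in production you would point the parser at your live file with `set_url()` and `read()`).

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a string for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check what a given crawler may fetch (the domain is hypothetical).
print(parser.can_fetch("Googlebot", "https://your-site.example/about/"))
print(parser.can_fetch("Googlebot", "https://your-site.example/private/x"))
```

Running checks like this before publishing changes helps you avoid the opposite problem: a reachable robots.txt that accidentally blocks pages you want indexed.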

If you’re unsure about the technical details of your website, ask us to check your web pages or better still ask us to do a full SEO Audit of the site. Call us today on 01273 328877

SEO Case Study – Recovering from a Google Manual Action

By SEO


Towards the end of 2013 we received notification via our Google Webmaster Tools account that one of our long-standing web development and traditional marketing clients had suffered a manual action by Google. This effectively delisted their website pages from Google’s index – in other words, their website was not appearing in the Google search results.


We ran a full links audit and it quickly became apparent that the manual action was a direct result of the ‘Penguin’ updates, which look closely at the relevancy and quality of all links pointing to a website. Google introduced the first Penguin update in April 2012 to try and stop websites spamming their way to the top of its results. Penguin penalises websites with low value, poor quality links from non-relevant sites, especially sites that have bought links in the past or are suspected of using ‘black-hat’ SEO link building techniques.

Our links audit revealed many links pointing to our client’s site that were clearly ‘fabricated’ and had been created purely to try and gain a search advantage. These were mainly historic links created between 2011 and 2012, when link building was all about link volumes.

The client had previously enjoyed high search engine rankings for almost all of his keyword phrases, and the manual action was having a real effect on new business enquiry levels. Our priority was to have the poor quality links removed and the site re-listed by Google.

Action Taken:

Using professional-standard link identification software we produced a definitive list of over 3,000 links pointing to our client’s website. We then analysed each of these links in terms of quality and relevancy, and categorised every link into one of three segments:

a) Good Quality Links
b) Suspicious Links
c) Toxic or Bad links

We further analysed the suspicious links, erring on the side of caution. Once we had a definitive list of poor or bad links, we wrote to each website (where possible) and asked for the links to be removed. After 7 days we sent a follow-up letter or email repeating the request.
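The three-way triage above can be sketched in code. This is a heavily simplified illustration only: the real audit used professional link-intelligence tools and human review, and the domain patterns, rules, and example links below are all hypothetical.

```python
# Hypothetical substrings that mark an obviously toxic source domain.
TOXIC_SIGNALS = ("link-farm", "article-directory", "free-links")

def classify_link(source_domain, anchor_text, target_keyword):
    """Sort a backlink into one of the three audit segments."""
    domain = source_domain.lower()
    # Segment (c): toxic or bad links from known spam-type domains.
    if any(signal in domain for signal in TOXIC_SIGNALS):
        return "toxic"
    # Segment (b): exact-match commercial anchors are treated as
    # suspicious and held back for closer manual review.
    if anchor_text.strip().lower() == target_keyword.strip().lower():
        return "suspicious"
    # Segment (a): everything else passes this first automated sift.
    return "good"

# Illustrative backlinks: (source domain, anchor text, target keyword)
links = [
    ("industry-news.example", "a useful guide to widgets", "cheap widgets"),
    ("blog.example", "cheap widgets", "cheap widgets"),
    ("free-links-4u.example", "cheap widgets", "cheap widgets"),
]

for domain, anchor, keyword in links:
    print(domain, "->", classify_link(domain, anchor, keyword))
```

In practice the “suspicious” bucket is the one that needs human judgement, which is why we re-analysed it by hand before deciding which links to pursue for removal.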

After a further 7 days had elapsed, we submitted a ‘disavow’ request to Google for all the poor links that had not been removed. We had to be able to show that we had made an effort to have the links removed, otherwise our disavow request would have failed.
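For reference, the disavow file itself is a plain text file uploaded through the Disavow Links tool in Webmaster Tools: one URL or `domain:` entry per line, with `#` lines treated as comments. A minimal example (the domains shown are hypothetical):

```
# Links we asked site owners to remove but received no response.
# Disavow a single page:
http://spam-site.example/bad-page.html
# Disavow every link from an entire domain:
domain:link-farm.example
```

Using `domain:` entries is the safer choice when a whole site is low quality, since it covers any further links that site might create later.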


After about 10 days Google advised that the manual penalty had been removed!

In December 2013 we started a new SEO programme for this client – they had used other SEO agencies before us – and after one month of white-hat, ethical SEO we have seen them back in the top two pages of the Google results for four of their six primary keyword phrases.

If you have suffered a Google penalty or would like to know more about our SEO services for Brighton Businesses please call us today on 01273 328877