A Guide To Google Manual Penalty Removal & Recovery [Case Study]


By Dave Ashworth
on 4/2/14

In March of this year we took on a client who had previously worked with an SEO agency and enjoyed good rankings, organic traffic and revenue.

However, on May 20th, 2012, shortly after the first Penguin update, they received a manual link warning from Google:



Over the course of 2012 and early 2013 no action was taken, and rankings, traffic and revenue subsequently dropped, with the site losing around 8,500 visits per month from non-paid search:



The first step was to review the links pointing to the site and decide which ones we wanted to remove and which to keep. The client had not only an English-language site but also French, German, Italian, Portuguese and Spanish versions. This meant we not only had to review sites that weren't in our native language, but also compose link removal emails with only a B in GCSE French to get us started. A daunting task lay ahead.

The rest of this post documents each stage of the process we followed, details of each re-inclusion request and issues faced along the way.

Bear with me, it’s a long post, but so is the process of getting a manual penalty revoked.


Re-inclusion Request #1

The first thing we wanted to do was get as broad a view of the link profile as possible.  We did this by downloading all of the links from the following sources: Google Webmaster Tools, Open Site Explorer, Link Research Tools and Majestic.

This gave us 23,092 links in total.
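Pulling these exports together amounts to deduplicating the union of several CSV files. A minimal sketch of that first pass, assuming each tool's export has a "URL" column (the actual column names vary from tool to tool):

```python
import csv

def load_links(path, url_column="URL"):
    """Read one tool's CSV export and return the set of linking URLs it reports."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip() for row in csv.DictReader(f) if row.get(url_column)}

def merge_sources(paths):
    """Union the link sets from every export so duplicate links collapse to one entry."""
    all_links = set()
    for path in paths:
        all_links |= load_links(path)
    return all_links

# Hypothetical export filenames -- each tool lets you download its link data as CSV:
# merge_sources(["gwt.csv", "ose.csv", "lrt.csv", "majestic.csv"])
```

Working with sets from the start means the same link reported by two or three tools only has to be reviewed once.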

We then used Link Research Tools' Detox Tool to filter out duplicate and site-wide links and to obtain contact details for webmasters where possible. The tool also breaks links down into one of three categories: Toxic, Suspicious and Healthy. We didn't find this categorisation particularly useful, as many suspicious or toxic links were classed as healthy and vice versa. This isn't a criticism of the product; for the price, it provided exactly the service we wanted: to organise our links from multiple sources and give us contact details.

Once the detox was completed we were left with 2,524 links to review manually. The criterion was quite simple: if it looked crap, it more than likely was. Exact match anchor text, directories and low quality articles are generally easy to spot, even if they are in a foreign language (for more on this, see this great article by Ian Lurie).

We took the CSV output from the detox and added columns to flag if a link was to be removed, if they had been contacted and if/when the link was removed.

Once this was complete (a task best saved for when you're feeling at your least productive – hours of looking at directories and spun articles can be soul-destroying and very 2009), we then had the task of contacting the webmasters, 1,981 in total.

We then split the spreadsheet into two lists: those with contact email addresses and those without. For those without, a manual email was put together as follows:

Dear webmaster,

My name is [insert your name here] and I am contacting you on behalf of [our client] as we are trying to recover from a Google penalty recently imposed upon their site. After a closer look we have discovered certain links that are unsuitable for our brand, which is why I am contacting you today.

Would you please be able to remove the following link from your site? I have provided details of the URL page, the external link to our site and the anchor text being used:

External link 1:
Anchor text 1: keyword rich phrase 1
External link 2:
Anchor text 2: another keyword rich phrase

Please let me know at your earliest convenience if this would be possible.
Kind Regards

We then ran this through Google Translate so it would make sense to French, German, Italian, etc. webmasters (incidentally, the word "webmaster" is universal, so there you go). We then went on each site to look for contact details or a form we could submit.

For those domains in the list where an email address was present, I wrote a script that took the CSV as an upload and automated the sending of emails on a domain-by-domain basis. This worked a treat and saved hours of man time that would otherwise have been spent laboriously creating and sending emails (almost as soul-destroying as reviewing the links in the first place). If you want a copy of this script, let me know in the comments below.
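The original script isn't reproduced in this post, but the approach can be sketched roughly as below. This is an illustrative outline only, assuming hypothetical CSV column names ("Contact Email", "Linking Page", "Anchor Text"), a placeholder sender address and a local SMTP host; the real script would differ:

```python
import csv
import smtplib
from collections import defaultdict
from email.message import EmailMessage

def build_requests(csv_path):
    """Group flagged links by webmaster email so each contact gets one message."""
    by_contact = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            email = row.get("Contact Email", "").strip()
            if email:
                by_contact[email].append((row["Linking Page"], row["Anchor Text"]))
    return by_contact

def compose(recipient, links):
    """Fill the removal-request template with each flagged link and its anchor text."""
    msg = EmailMessage()
    msg["To"] = recipient
    msg["From"] = "seo@example.com"  # placeholder sender address
    msg["Subject"] = "Link removal request"
    details = [f"External link: {url}\nAnchor text: {anchor}" for url, anchor in links]
    msg.set_content(
        "Dear webmaster,\n\nWould you please remove the following links from your site?\n\n"
        + "\n\n".join(details) + "\n\nKind regards"
    )
    return msg

def send_all(csv_path, host="localhost"):
    """Send one removal request per contact via a (hypothetical) local SMTP relay."""
    with smtplib.SMTP(host) as smtp:
        for recipient, links in build_requests(csv_path).items():
            smtp.send_message(compose(recipient, links))
```

Grouping by contact first, rather than emailing once per link, is what keeps the outreach to one message per domain.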

The responses were mixed. Some webmasters removed the links right away; some demanded cash (as you would in this post-Penguin world), with prices ranging from $1 per link to one guy who didn't know how to delete a link but would sell us the domain for $200 so we could do what we wanted with the site. Most didn't reply at all.

For the record, we didn't pay for any links to be removed.

After a couple of days, to give the webmasters a chance to reply, we documented all of the above: we took copies of emails sent and received, added everything to a Google Spreadsheet, and went about writing our first re-inclusion request. From reading many "how to" guides on writing a request and watching various resources here and here, it became apparent that you couldn't skirt over the work done and needed to be thorough. And apologetic. And name names if you have to (I'm not into outing, unless there's a paying client involved, sorry).

You can take a look at request #1 here.

We submitted this on May 17th and got a reply on May 22nd. It was rejected.



This came as no surprise, really, as to this day I have not read of a penalty being lifted on the first attempt. I'd even read on a forum somewhere that the first two reviews were automated and that it wasn't until the third that your request was manually reviewed! (I don't know if this is true, but I've not heard of a second-time removal either.)


Re-inclusion Request #2

We pretty much repeated the same process as first time round, but with stricter criteria applied to determine whether a link would be perceived as bad or not.  Some links had been missed first time around; some had keyword rich links but looked to be on “respectable” sites/blogs as opposed to content networks; and there were links that, whilst natural, didn't look great.

Request #2 can be viewed here.

We submitted this on June 6th.

However, after a couple of weeks we had not heard anything back, which was a surprise given how quickly the first request was responded to. We put this down to Google staff being inundated with requests and sat tight. After three or so weeks, I began asking around various forums to get an idea of the general turnaround time – the official guidelines say it could be several weeks – and I also asked whether anyone had ever re-submitted a request. Some advised against this, suggesting it could actually work against you!

This advice was misguided: having come across the following video, in which Matt Cutts said a two-month wait was not normal, I decided to resubmit the request.

This was resubmitted on July 12th and then on July 18th it was rejected.




Re-inclusion Request #3

Despite asking for examples of bad links affecting the site, the latest rejection simply informed us that we still had links which violated the guidelines.

By now we were left wondering just how far we had to go in removing links pointing to the site. Without any indication from Google as to what exactly the issue was, we now had to be really strict about what could be perceived as a bad link – and we were aware that we might be removing potentially good links in the process.

What is also important to note here is that we again downloaded links from all sources and uploaded these to the Detox Tool along with our disavow file. Because two months had now passed since we first downloaded all the links being reported, we checked the initial list against the new one and found an additional 437 links that had not previously been reported. So, if your requests span an extended period of time, bear in mind that new links are being picked up and reported all the time.
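Checking the initial download against the new one is just a set difference; a minimal sketch with made-up placeholder URLs:

```python
def new_links(initial, latest):
    """Return links present in the latest download that were absent from the first."""
    return set(latest) - set(initial)

# Placeholder URLs for illustration only:
first_pull = ["http://dir.example/a", "http://blog.example/b"]
second_pull = ["http://dir.example/a", "http://blog.example/b", "http://spam.example/c"]
assert new_links(first_pull, second_pull) == {"http://spam.example/c"}
```

Anything in that difference needs the same review, removal-request and disavow treatment as the original list before the next re-inclusion request goes in.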

You can take a look at request #3 here.

This was submitted on July 29th and was swiftly rejected on August 1st.




Re-inclusion Request #4

By now, all the low-quality automated spammy links had been dealt with and, like before, we were starting to second-guess which links were violating the guidelines; again, at the risk of removing perfectly good links.

We looked for site-wide links, links duplicated across multiple domains, nearby text that might look suspicious (e.g. "partner" or "advertise"), and the numerous SEO data sites that show domain stats.

The client had a third-party payment and booking site with footer links pointing back to the main site, so those were our first port of call. A site-wide link from an offline business partner (unpaid) was also removed, as was a press release from a German company that our client had used – this PR had been published across several duplicate-content domains. These and similar links were removed, disavowed, and the fourth request was sent off.

Like before, we documented everything and, in case of rejection, requested examples of bad links to stop us from tearing our hair out.

You can take a look at request #4 here.

This was submitted on August 14th and rejected on August 19th. However, on this occasion we made a breakthrough, discovering that our efforts thus far had been thwarted by the way Google reports links to a domain.




Re-inclusion Request #5 – A Breakthrough

Note the different wording in the reply above.

Before it would say: “Site violates Google’s quality guidelines”

Now it says “Links to your site violate Google’s quality guidelines”

This time Google gave us two examples of links. Both were blatantly manipulative, low-quality articles on content networks, but we couldn't fathom how they had been missed in our manual reviews.

We reviewed all our previous activity and found that these links WERE being reported in Google Webmaster Tools and WERE being uploaded into the Detox Tool, BUT were not coming out in the resulting filtered CSV.

Unsure why this would be, we contacted Link Research Tools and were able to correspond with Christoph C. Cemper directly, who, I have to say, was very helpful in sorting out the issue.

The issue arose from the fact that our client had more than one domain registered, all of which were set up in Webmaster Tools as separate sites – the main domain was a plural and another a singular. The penalty had been applied to both domains, but when we downloaded the links for the plural domain from GWT, the list included links to the singular domain because they were being 301'd. GWT also shows only the linking page, not where it links to, so unless you check each link manually you wouldn't know which domain it actually points to.

When we performed a detox on the plural domain, 301'd links to the singular domain were filtered out, as it was essentially not the same domain the detox had been set up on.

Therefore, from the 2nd request onwards we were chasing shadows looking for links pointing to the main domain that were potentially causing an issue.

With this in mind, it was very easy to replicate the steps taken previously in order to identify low quality links pointing to the singular domain and submit request #5.

You can take a look at request #5 here.

We submitted this request on August 20th and 6 days later… the manual penalty was revoked!





To double-check that the singular domain was okay, we submitted a request for it asking whether it also had a penalty, and within 6 days we were informed that a manual action on that domain had been revoked too!



What Happened Next

Obviously, we were delighted that after 3-4 months of painstaking link removal we had got the manual penalty revoked, though we were a tad frustrated that it took so long due to the way our links were being reported.

There were a lot of lessons to learn from this and I hope readers will find this useful if they are facing a penalty or similar issues.

Ultimately, the work was worth it: since the penalty was lifted on August 26th, the site has seen a 17.68% increase in organic traffic and a 22.91% increase in revenue compared with the equivalent preceding period. The stats below compare August 26th - October 1st against July 6th - August 25th.
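For anyone reproducing this kind of period-over-period comparison, the arithmetic is straightforward; a minimal sketch using made-up visit counts, not the client's actual figures:

```python
def pct_change(before, after):
    """Percentage change from one period's total to the next."""
    return (after - before) / before * 100

# Made-up example figures for illustration only:
visits_before, visits_after = 20000, 23536
print(f"{pct_change(visits_before, visits_after):+.2f}%")  # +17.68%
```

The key is comparing two windows of equal length, as the post does with August 26th - October 1st against July 6th - August 25th.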






I hope that this case study has given help/hope to those struggling with a manual penalty – the advice that I would give is to be as meticulous as possible and to leave no stone unturned when cleaning up your link profile.  Document everything and be prepared to show everything you have done to Google.  I would also recommend a large degree of patience.

Good luck.

We can offer help and advice for anybody suffering from a penalty so feel free to get in touch for more details, or leave a comment below.