Everyone reacts differently to the news that their site has taken an unexpected and sudden slide down the search engine rankings. Some get mad, some get sad, and some simply go into denial, reassuring themselves that they will “bounce back” given enough time.
The truth is that whatever you feel about your drop in rankings (and whether you think it was justified or not) actually matters very little. The only thing that will get your site back into a profitable position is proper diagnosis of the problem combined with a well-thought-out and properly implemented recovery plan.
Diagnosing the problem correctly is very important. This stage of the recovery process will inform any decisions going forward. Getting it wrong could mean your “fix” damages the positive elements of your SEO while leaving the negative elements, the parts that got you penalised in the first place, untouched.
With this in mind, here we take a detailed look at ten steps to diagnose the issues behind a sudden or slow decline in search rankings.
1. Check for sudden drops
The first thing you should determine is whether or not your drop in rankings happened suddenly or if it was a slow decline.
A sudden drop – and we mean literally overnight – is typically indicative of a manual penalty or an algorithm change. Google Webmaster Tools should be your first port of call to differentiate between the two.
Navigate to your site’s account in Webmaster Tools and check the “Manual Actions” page in the “Search Traffic” section. If there is a new message then, unfortunately, you have been hit with a manual penalty. Don’t despair, though: while this is still bad news for your site, it usually makes diagnosing the issue much easier, as the message should detail exactly what Google has found a problem with, giving you a head start on getting it fixed.
Another possible explanation for a sudden drop in rankings would be an algorithm change. Determining when your site started to lose rank can provide clues as to why it might have been hit and how you can start to recover.
- In May 2014 Google rolled out Panda 4.0, an algorithm update designed to phase out “thin content” from the top search results. If your traffic and rankings dropped during this time then providing more substantial and unique content should be your first priority.
- May 2013 saw the launch of Penguin 2.0. This update targeted web-spam and bad link building practices, so recovery would have required cleaning up your backlink profile rather than your content.
To find out if the drop in traffic on your site coincided with an algorithm change, check the Algorithm Change History on Moz or visit Google’s support forums.
A steady drop in rankings is much harder to diagnose as it could be caused by any number of issues. If you have ruled out any of the above causes, work through the list below until you discover the issue.
2. Check the quality of your backlinks
The quality of your links pointing to your site could be the cause of your problems. Take a look at your inbound link profile using Webmaster Tools, Majestic SEO or Open Site Explorer.
There are two things to look out for:
- Quality of linking sites
- Relevance of the page linking to your site
The quality of the site refers to the perceived authority of the site in Google’s eyes. For the task at hand, this means looking for sites with a very low domain authority, sites that cover irrelevant topics, and sites that contain thin content. Google takes inbound link validity and quality very seriously, so be sure you haven’t been caught out by any dodgy “black hat” link building techniques.
3. Analyse the anchor text in your links
If you don’t believe that you have any backlinks from low quality or “spammy” sites, it still doesn’t mean that your backlinks aren’t the cause of your search ranking woes.
Over-optimised links that use keyword-rich anchor text were targeted by the Penguin update last year. The main link profiles targeted by this update were:
- Too many deep links to a specific page
- Too many links using the exact same anchor text
- Too high a percentage of keyword-rich links
- Too few branded links
An example of a link that would have typically been targeted by the Penguin update would be one using anchor text such as “used cars in Manchester” or “wholesale electronic cigarettes” and pointing to a specific page deep within the site.
A small percentage of links like this (below 10%) can be beneficial, but the majority of your links should use your brand name or generic terms like “click here” or “read more”.
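If you have exported your backlinks from one of the tools mentioned above, a short script can estimate how much of your anchor text is keyword-rich. This is a minimal sketch in Python; the list of anchors and the set of exact-match phrases are both invented for illustration, so substitute your own export and keyword targets:

```python
from collections import Counter

# Hypothetical sample of anchor texts exported from a backlink tool
# such as Majestic SEO or Open Site Explorer.
anchors = [
    "Acme Widgets", "Acme Widgets", "acmewidgets.com", "click here",
    "read more", "used cars in Manchester", "used cars in Manchester",
    "click here", "Acme Widgets", "wholesale electronic cigarettes",
]

# Exact-match commercial phrases you suspect are over-optimised
# (an assumption -- build this set from your own keyword targets).
exact_match = {"used cars in manchester", "wholesale electronic cigarettes"}

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())
exact_share = sum(n for a, n in counts.items() if a in exact_match) / total

print(f"Exact-match anchors: {exact_share:.0%} of {total} links")
if exact_share > 0.10:  # the ~10% rule of thumb mentioned above
    print("Warning: keyword-rich anchor text may look over-optimised")
```

The 10% threshold here is only the rough guideline from this article, not a figure published by Google, so treat the warning as a prompt for a closer look rather than a verdict.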
4. Check your link growth profile
Although a solid foundation of backlinks can keep you atop the search engine pile for months or even years, in highly crowded industries it will take a consistent flow of quality inbound links to sustain a competitive advantage.
Check your competitors’ link profiles and see if they are receiving links at a steady pace. If they are, but you aren’t, then this could be a big part of your problem.
5. Check for duplicate content
While you might not have been posting identical content on multiple pages of your site on purpose, blogs and eCommerce stores have a tendency to repeat content because of the way they are structured.
For example, an eCommerce store could have a category page with a list of products and multiple ways for customers to sort and filter that list. If each filter results in its own separate page URL (as it commonly does) then Google will index this content multiple times.
A quick way to identify a content duplication issue is to log onto Webmaster Tools and look at the HTML Improvements report. Warning signs include:
- Duplicate title tags
- Duplicate meta descriptions
If your site is serving multiple identical versions of the same content to Google, pages on your site can end up competing with each other, links to your main page are devalued, and your rankings can eventually fall as a result.
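One way to spot this pattern before Google does is to collapse your crawled URLs down to a canonical form with the sort and filter parameters stripped out, then look for groups with more than one variant. A rough sketch, assuming a hypothetical crawl and a hypothetical set of parameter names (adjust `FILTER_PARAMS` to match your own store's URL scheme):

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that only sort or filter a listing -- an assumption;
# list the ones your own site actually uses.
FILTER_PARAMS = {"sort", "order", "colour", "size", "page"}

def canonical_form(url):
    """Strip filter/sort parameters so duplicate views collapse to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Hypothetical crawl of a category page and its filtered variants.
crawled = [
    "https://example.com/shoes?sort=price",
    "https://example.com/shoes?sort=name&colour=red",
    "https://example.com/shoes",
    "https://example.com/hats?size=m",
]

groups = defaultdict(list)
for url in crawled:
    groups[canonical_form(url)].append(url)

for canonical, variants in groups.items():
    if len(variants) > 1:
        print(f"{canonical} has {len(variants)} indexable variants")
```

Once you have identified the duplicate groups, the usual fix is a rel="canonical" tag on each filtered variant pointing at the unfiltered URL.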
6. Ensure you’re not over-optimising
In the last few years, Google has been cracking down on sites that are overly optimised. Over-optimisation refers to the excessive use of historical SEO techniques to boost organic search rankings. This can include:
- Overuse of keywords within the content
- Exact match domains
- Keyword-rich heading tags
- Overuse of keyword-rich internal links
- Unnecessary use of the H1 tag
All of these and a number of other similar factors could contribute to a steady fall in search rankings.
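There is no official “safe” keyword density, but an unusually high figure is a reasonable red flag for the first item on that list. Here is a hedged sketch that measures what share of a page's words a single phrase occupies; the sample text and target phrase are invented:

```python
import re

def keyword_density(text, keyword):
    """Share of the words in `text` taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return hits * len(kw) / len(words) if words else 0.0

# Invented example of over-optimised copy.
page = ("Cheap widgets for sale. Our cheap widgets are the best "
        "cheap widgets you can buy.")
density = keyword_density(page, "cheap widgets")
print(f"Keyword density: {density:.0%}")
```

A phrase taking up 40% of the copy, as here, would read as spam to humans and algorithms alike; run the check across your main landing pages and rewrite anything that stands out.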
7. Determine whether your content is “thin”
While you might think that the phrase “thin content” refers exclusively to sites with very little indexable content per page (100 words of text, for example), Google also treats sites that don’t offer value to the user as purveyors of thin content.
What this means is that if your site is full of well-written and informative articles, but the articles are on topics which have been covered a thousand times before (“The best way to clean your car”) then you run the risk of losing your positions in Google’s search results pages.
8. Verify that all your content is indexed
While this might seem obvious, it can be a simple fix that is often overlooked. If Google’s crawler suddenly can’t find or access parts of your website that it previously had no trouble reaching, it can take as little as 48 hours for those pages to be dropped from its index.
Content might no longer be indexed if:
- A robots.txt file has been wrongly coded
- Server settings are incorrectly redirecting a certain page
- The page has been mistakenly deleted
- URLs are being wrongly rewritten
- Pages have been moved without appropriate redirection
Try to think back over any changes you have made to the site recently; you or someone else working on it could have inadvertently modified a file and hindered proper indexing. If you use CMS plugins such as Yoast’s WordPress SEO, it is possible to close your site to search engine crawlers with one simple click, so be wary when making changes.
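For the first item on that list, Python’s standard library can tell you whether a given robots.txt file would block Googlebot from a URL, which makes it easy to test a suspect file before (or after) it goes live. A minimal sketch, using an invented robots.txt and example.com URLs:

```python
from urllib.robotparser import RobotFileParser

# Paste your live robots.txt content here; this sample (an invented one)
# shows a rule that silently blocks an entire blog section.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /blog/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for url in ["https://example.com/blog/my-post", "https://example.com/products"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED'}")
```

Run this over a representative sample of your important URLs; any unexpected “BLOCKED” line points straight at the offending Disallow rule.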
9. Check whether Google’s ranking your pages appropriately
It is important to check that Google is ranking pages within your site as you expect it to. In other words, if Google considers page X to be more relevant to a keyword than page Y, while you expect page Y to be the one that’s optimised for that keyword, then you have a problem.
If Google is getting the wrong end of the stick with the way it comprehends your content then it will more than likely have a negative effect on your search visibility.
The simplest way to check is to use the “site:” function with Google search itself. Enter your website’s URL in the following format:
- site:www.yourdomain.com keyword
After the ‘site:’ command and your domain, add a single space and then one of your targeted keywords. Is the search returning relevant content from your site as the top result? If not, pages on your site may be cannibalising each other’s rankings, and it isn’t clear to search engines exactly what the content on each page is about.
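If you record the top result of each `site:` search by hand, a few lines of code can keep track of which keywords resolve to the wrong page. This sketch uses entirely hypothetical keywords and URLs; the right-hand values are what you would note down from the searches described above:

```python
# For each target keyword: the page you intend to rank (expected) and the
# page Google actually returned first for a `site:` search (observed).
expected = {
    "blue widgets": "/products/blue-widgets",
    "widget repair": "/services/repair",
    "widget history": "/blog/widget-history",
}
observed = {
    "blue widgets": "/products/blue-widgets",
    "widget repair": "/blog/fixing-widgets",  # a blog post outranks the service page
    "widget history": "/blog/widget-history",
}

mismatches = {
    kw: (expected[kw], observed[kw])
    for kw in expected
    if observed.get(kw) != expected[kw]
}

for kw, (want, got) in mismatches.items():
    print(f"'{kw}': wanted {want}, Google prefers {got} -- possible cannibalisation")
```

Each mismatch is a candidate for consolidating content, adjusting internal links, or de-optimising the page that is winning by accident.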
10. Analyse the competition
Looking at your competitors’ websites, particularly those that rank well, will help you to spot elements your own site might be missing. Ask yourself these questions:
- Is their content short, or is it a lot of long-form guides and essays?
- Do they use Schema mark-up or rich snippets?
- What does their “above the fold” content look like? Is it loaded with banners and images, or is it mostly text-based?
- Is their website updated more regularly than yours?
- Do they have a lot of social signals on their page?
- How is the site structured? Do they have a separate set of pages for each product or service offered, or is it all on a single page?
- How are their URLs formatted? Do they give a better indication of what is on each page than your own do?
Hopefully you now have a clearer picture of what has caused your business to lose its valuable position on Google’s search result pages. If you are still struggling to work out what went wrong, or want help with recovering as soon as possible, be sure to get in touch.