A major update to how Google ranks sites has affected 12% of search results and halved many sites’ visitor numbers. Named the Farmer or Panda update, it only affects US Google results as I write, but if you’re outside the US it is coming to you soon. Here’s how to find out if you have already been hit, whether you are going to be, and what to do about it.
Can your business handle a 50% drop in organic (non-paid) visits from Google? That’s what might be coming your way courtesy of Google’s Panda algorithm update.
Before we get into the whys and whats, find out if you’ve been hit by Panda …
Has Panda hit your site?
At the time of writing, Panda is only hitting US results. Here’s how to use Google Analytics (GA) to find out if your site is affected.
If your site gets most of its search engine traffic from the US then you probably already know whether you’ve been affected by Panda. With this guide you can see the details of the damage and learn how to analyse where the problems might be.
If your site is not US-centric then follow the steps below to see if you will be affected when Panda rolls out across the world.
First go to your GA dashboard.
If your site is not US-centered then you might see something like the graph below and think all is well.
But dig deeper. Go to the Search Engines report in the Traffic Sources menu (and choose ‘non-paid’).
Then click ‘google’ to see Google-only traffic (see below).
Click the ‘Source’ column heading (which reveals a large submenu) and then choose ‘Country/Territory’.
Enter ‘United States’ into the Filter at the bottom of the list of countries.
Press ‘Go’ and hope you don’t see this:
That’s more than a 50% drop in organic (non-paid) visits from Google US. Scared yet?
Alternatively, use advanced segments to see organic US Google visits
Using Google Analytics advanced segments will give you more power to analyse what’s happening. Here’s how …
Choose ‘Advanced Segments’ from the left-hand menu and then ‘Create new segment'.
Configure with:
• ‘Medium’ Matches exactly ‘organic’
and
• ‘Country/Territory’ Matches exactly ‘United States’
and
• ‘Source’ Contains ‘google’
That looks like this:
Perhaps name that segment ‘G US organic’.
Apply this segment to your GA reports and all the data you see will now be for this segment of visitors only. As you’ll see below, this allows you to look at which of your pages have fared best and worst from Panda.
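If you also want to sanity-check the same numbers outside the GA interface, here is a minimal sketch that applies the same three segment conditions to a CSV export of visit data. The file name and column names (medium, country, source, visits) are assumptions for illustration; match them to whatever your export actually contains.

```python
import csv

def is_g_us_organic(row):
    """Mirror the 'G US organic' segment: organic medium, US visitors, Google source."""
    return (
        row["medium"] == "organic"
        and row["country"] == "United States"
        and "google" in row["source"].lower()
    )

# Hypothetical export with columns: date, source, medium, country, visits
with open("ga_export.csv", newline="") as f:
    segment = [row for row in csv.DictReader(f) if is_g_us_organic(row)]

print(f"Google US organic visits in export: {sum(int(r['visits']) for r in segment)}")
```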
Before we do that, let’s explore what Google are trying to do.
What in the name of Google is going on?
The aims of Panda are noble: to remove poor quality sites from the top of Google’s results pages. Or as Matt Cutts, Google’s head of spam, puts it in a blog post announcing Panda:
"This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on."
The last thing Google wants is searchers being unhappy with what they find. They might try another search engine if that happens.
Few people other than the low-quality sites’ owners and their investors will have a problem with that.
But all major Google updates leave ‘collateral damage’ behind them: sites that just don’t match the target or deserve to be penalised. Google are aware of this and so have asked those with “a high quality site that has been negatively affected by this change” to let them know about it here.
So if you have a high quality site that’s been adversely affected by Panda Farmer then let Google know.
The site used as an example on this page is a high quality site hurt by Panda. Its core content is hundreds of long, in-depth specialist articles plus a Q&A-based forum for readers’ problems.
Perhaps the Q&A pages are the problem (those pages could look like thin content to Google's robots). But then I know of two similar sites in different markets that have also been hit but don’t have a Q&A-based forum. No, it won’t be that easy to work out why an innocent site has suffered.
What factors make a site vulnerable to Panda?
Google like to keep these things secret but the two engineers at the heart of Panda, Matt Cutts and Amit Singhal, gave us some strong clues in an interview with Wired.
Cutts and Singhal revealed their process, which I’ll summarize as:
• Conduct qualitative research (speaking with individuals rather than sending out a big questionnaire) to find out which sites in a sample people considered low quality, and why.
• Use the results to define low-quality sites in terms of factors that Google can measure. This gives Google a mathematical definition of low quality.
If we start here, we can think of a number of factors that Google might be able to measure to define low quality, including:
• A high % of duplicate content. This might apply to a page, a site or both. If it’s a site measure then that might contribute to each page’s evaluation.
• A low amount of original content on a page or site.
• A high % (or number) of pages with a low amount of original content.
• A high number of inappropriate adverts (ones that don’t match the search queries a page does well for), especially high on the page.
• Page content (and page title tag) not matching the search queries a page does well for.
• Unnatural language on a page including heavy-handed on-page SEO (‘over-optimization’ to use a common oxymoron). Eg, unnatural overuse of a word on a page.
• High bounce rate on page or site.
• Low visit times on page or site.
• Low % of users returning to a site.
• Low clickthrough % from Google’s results pages (for page or site).
• High % of boilerplate content (the same on every page).
• Few or no quality inbound links to a page or site (by count or %).
• Few or no mentions of, or links to, a page or site in social media and from other sites.
If any of these factors are relevant to Panda, it is unlikely that they matter on their own.
Combinations of factors will be required to get ‘Panda points’ (and points do not mean prizes in this game). Panda points will be added up. Cross a threshold (Panda’s redline) and you are ‘blocked’.
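To make that idea concrete, here is a purely speculative sketch of how a combined ‘Panda points’ score might work. The signal names, weights and threshold are invented for illustration; they are not Google’s actual factors or values.

```python
# Purely illustrative: a weighted combination of quality signals crossing a threshold.
# The signal names, weights and threshold below are invented, not Google's real ones.
PANDA_WEIGHTS = {
    "duplicate_content_ratio": 3.0,   # share of the page duplicated elsewhere
    "ad_share_above_fold": 2.0,       # share of the first screen taken up by adverts
    "bounce_rate": 1.5,
    "boilerplate_ratio": 1.0,
}
PANDA_THRESHOLD = 4.0  # cross this and the page (or site) is 'blocked'

def panda_points(signals):
    """Sum weighted signals (each normalised to 0..1) into a single score."""
    return sum(PANDA_WEIGHTS[name] * value for name, value in signals.items())

page = {"duplicate_content_ratio": 0.7, "ad_share_above_fold": 0.5,
        "bounce_rate": 0.8, "boilerplate_ratio": 0.4}
score = panda_points(page)
print(f"score={score:.1f}, blocked={score > PANDA_THRESHOLD}")
```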
‘Blocked’ is Matt Cutts’ word, used in that Wired interview: “Whenever we look at the most blocked sites, it did match our intuition and experience”. This suggests that …
… if a site gets defined as low quality then a penalty is applied (it is ‘blocked’).
Google have since said that “low quality content on part of a site can impact a site’s ranking as a whole.”
But ‘low quality’ sites are not always ‘blocked’ (Matt’s use of ‘most’ tells us this). So there must be exceptions to this site-wide penalty.
Is a Panda penalty applied site-wide or at the page level?
If a penalty is site-wide then all pages should experience a similar drop in Google organic traffic. On our example site, let’s use the ‘G US organic’ advanced segment to see if that is so …
Go to Content > Top Landing Pages. See below.
(Remember, in this segment we are only looking at visits from organic searches on Google in the US, so we have no need to restrict the GA report beyond ‘Landing pages’.)
This report lists all 4,272 landing pages. To test if all pages are equally affected by Panda we can filter the report to show:
• Individual pages. Select a sample and look for exceptions to the visits drop shown above.
• Types of pages that can be identified by shared strings in their URLs. Eg, forum pages might all have /forum/ in their URLs.
Use the filter at the bottom of the report to do this, eg by filtering for /forum/.
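If you prefer to work from an export rather than the GA filter box, here is a minimal sketch that groups landing pages by URL pattern and compares visits before and after the update. The URL fragments, file name and columns (landing_page, period, visits) are assumptions; adjust them to your own site and export.

```python
import csv
from collections import defaultdict

# Hypothetical page-type patterns; adjust to the URL structure of your own site.
PAGE_TYPES = {"forum": "/forum/", "articles": "/articles/", "blog": "/blog/"}

def page_type(url):
    for name, fragment in PAGE_TYPES.items():
        if fragment in url:
            return name
    return "other"

# Assumed columns: landing_page, period ('pre' or 'post' Panda), visits
before = defaultdict(int)
after = defaultdict(int)
with open("ga_landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        bucket = before if row["period"] == "pre" else after
        bucket[page_type(row["landing_page"])] += int(row["visits"])

for name in sorted(set(before) | set(after)):
    b, a = before[name], after[name]
    change = (a - b) / b * 100 if b else float("nan")
    print(f"{name:10s} pre={b:8d} post={a:8d} change={change:+.1f}%")
```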
I’ve done this on a few sites hit by Panda and I can say that some pages were hit more than others and a few did well out of Panda.
So Panda Farmer is, at least to some degree, working at the page level.
Find out what types of page have been hit on your site
If your site has been hit then use the filter in GA (as shown above) to find out which pages got hit hardest by Panda.
I found lots of pages with high-quality, unique, in-depth articles (sometimes thousands of words long) that were hit much harder than average. So again, there are no simple answers here. But these pages did have more advertising on them than average for the sites concerned.
Some forum pages had significant increases in visits. These had long threads, a fair amount of advertising on them (including a pop-up) but less than some other pages.
On this site, I would try changing some of the advertising. In particular, there is a big ‘block’ of advertising that doesn’t feature on the forum pages.
That might not be enough or have any effect at all. For example, on another site I’ve seen hit by Panda, all marketing was removed and no change has followed (though more time might be needed).
Is a Panda penalty applied at the keyword level?
To find out if Panda is applied at the keyword level and not just to pages, you can:
• Find a page that gets results for several different keywords.
• See if Panda has had different effects on traffic for those different keywords (but to the same page).
If it has then Panda is operating at the keyword level.
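Here is a minimal sketch of that comparison, assuming you have a keyword-level export with hypothetical columns landing_page, keyword, period and visits; the target URL and file name are placeholders.

```python
import csv
from collections import defaultdict

TARGET_PAGE = "/articles/example-page/"  # hypothetical page to test

# Assumed columns: landing_page, keyword, period ('pre' or 'post' Panda), visits
visits = defaultdict(lambda: {"pre": 0, "post": 0})
with open("ga_keywords.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["landing_page"] == TARGET_PAGE:
            visits[row["keyword"]][row["period"]] += int(row["visits"])

# If some keywords drop sharply while others to the same page hold steady,
# that points to a keyword-level effect rather than a page-level one.
for keyword, v in sorted(visits.items()):
    pre, post = v["pre"], v["post"]
    change = (post - pre) / pre * 100 if pre else float("nan")
    print(f"{keyword:30s} pre={pre:6d} post={post:6d} change={change:+.1f}%")
```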
I’ve only seen a few examples of Panda reducing visits to the same page for some keywords but not others. They are the exception.
The suggestion that Panda operates at the page and site level was supported when I searched on Google US with a unique text string (in quotes) from an in-depth, original specialist article that has been indexed for nearly 10 years and had dominated a niche in Google’s results for most of that time. I saw:
• 36 scraped versions of the article.
• 2 of these showing above the page with the original.
• 1 of those two being a low-quality scrape on a low-quality site.
• The other being a partial scrape that credits and links back to the original.
• The original page has lost 75% of its organic US Google traffic since Panda.
• That traffic came from over 1,000 different keywords and, of those I tested, none had been spared.
What to do if you’ve been hit by a Panda
Google suggest:
"If you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content."
Let’s add a bit more to that, put it into practical actions and make a process …
• Find the pages and page types hit worst on your site.
• Isolate differences between those hit and those not.
• Test changing those factors on hit pages, but use this method of analysis with caution because the pages hit most might not be the pages earning you the penalty.
• Make a list of your different types of pages. Eg, forum, quality article, low quality article, light category, quality category, product, blog post, etc. Put the list in a column in a spreadsheet and start building a table.
• Add columns for relevant factors like ‘lots of ads’, ‘little content’, ‘some dupe’, ‘all dupe’, etc, plus the number of pages and the % drop in Google US organic visits. Fill in the values for each type of page (a sketch of building this table programmatically follows this list).
• Look at how much of your site (% of pages) is taken up by your lowest quality pages and improve that.
• If you are scraping or otherwise copying other sites’ content, replace it with quality original content or test removing some (or even all) of those pages (and adding 301s from them to relevant pages higher up your site’s hierarchy).
• If you have a large number of pages with duplicate (of your own copy), weak or almost no content, improve them, remove (and 301) them, or block them from Google with robots.txt.
• If you have lots of pages that dupe your own copy (eg, as happens with some content management systems and on a lot of ecommerce sites that build new URLs for ‘faceted’ pages) then add rel=canonical tags to the ‘duped’ pages. This stops Google seeing those pages as dupes.
• Edit any ‘over-optimized’ pages.
• Improve anything that might make the user’s experience better.
• Offer users more when they first enter a page. Eg, images, videos, attractive text and pages linking to your best, related editorial content.
• If possible, make your content’s language more accessible and more real.
• Promote your content on social media, including Twitter and Facebook.
• Build your brand awareness across the web wherever you can.
• If you’re sure your site is ‘Google clean’ and worthy, let Google know about it, but don’t expect this to have much effect.
• Make as many of these changes as you can at once in the hope of shaking off the penalty quickly. With editorial content improving, you can then add back any marketing you are missing, in steps, checking to see you don’t get slapped again.
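For the page-type table suggested above, here is a minimal sketch that aggregates a hand-built page inventory into one row per page type, with the share of pages showing each factor and the change in visits. The file name and columns (page_type, lots_of_ads, little_content, dupe, visits_pre, visits_post) are assumptions; use whatever factors you have actually recorded.

```python
import csv
from collections import defaultdict

# Assumed hand-built inventory, one row per page, with hypothetical columns:
# url, page_type, lots_of_ads (0/1), little_content (0/1), dupe (0/1),
# visits_pre, visits_post
stats = defaultdict(lambda: {"pages": 0, "ads": 0, "thin": 0, "dupe": 0,
                             "pre": 0, "post": 0})
with open("page_inventory.csv", newline="") as f:
    for row in csv.DictReader(f):
        s = stats[row["page_type"]]
        s["pages"] += 1
        s["ads"] += int(row["lots_of_ads"])
        s["thin"] += int(row["little_content"])
        s["dupe"] += int(row["dupe"])
        s["pre"] += int(row["visits_pre"])
        s["post"] += int(row["visits_post"])

print("page_type, pages, %ads, %thin, %dupe, %visit_change")
for ptype, s in sorted(stats.items()):
    n = s["pages"]
    change = (s["post"] - s["pre"]) / s["pre"] * 100 if s["pre"] else float("nan")
    print(f"{ptype}, {n}, {s['ads']/n:.0%}, {s['thin']/n:.0%}, "
          f"{s['dupe']/n:.0%}, {change:+.1f}%")
```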
This article is taken from the Google Panda update survival guide at Wordtracker.com. Posted by Mark Nunney.
More reading
Searching Google for Big Panda and Finding Decision Trees (SEO by the Sea).