Welcome back to AIMCLEAR's coverage of #SESNY 2013! Alert! Alert! Sudden drop in site traffic! What's the cause? An algorithm update – raging Panda, ruthless Penguin? Or is it a manual penalty, handed down by a human because something on your pages seemed fishy? It's absolutely essential to tell the two apart, because your path toward recovery depends largely on the type of penalty you're facing.
Host with the most Eric Enge, CEO of Stone Temple Consulting, was ready and eager on Day 2 of #SESNY to share the skinny on algo vs. manual penalties, and best practices for recovering from both. He was joined by fellow Stone Templers Andrea Shoemaker, who spoke specifically to Penguin and link penalties, and Charley Spektor, who wrapped up the session with an insightful case study. Fabulous tips were aplenty. Read on for the full recap, and Eric's prediction on what the next Google algo update will entail…
Algo Penalties vs. Manual Penalties
There are two kinds of penalties to worry about: algorithmic (Panda, Penguin) and manual (a human at Google looks at the site for some reason – possibly because a competitor reported it – and assesses the penalty). It's easy to tell if you have a penalty: one of the telltale signs is a sudden, sharp decrease in site traffic. When you're hit by a penalty, it's time to act. Start by assessing the possible cause – in other words, discern which of the two types of penalties you're facing, algo or manual. This will shape how you move forward with a recovery strategy.
Signs It’s Algo-Related vs. Manual
One sign you're the victim of an algo update: your sudden drop in traffic coincides with one of Google's algo update roll-outs. Check out SEOmoz's list of all Google algo updates. Does your drop line up with one of them? If so, it's likely not a manual penalty. This isn't a 100% accurate test – a manual spank could have happened on the same day as an algo update – but the chances of that are rather low.
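If you want to sanity-check that overlap quickly, here's a minimal sketch – the update names and dates below are illustrative placeholders, so pull the real dates from SEOmoz's change history before drawing any conclusions:

```python
from datetime import date, timedelta

# Illustrative update dates only -- replace with the real list from
# SEOmoz's Google Algorithm Change History.
ALGO_UPDATES = {
    "Panda (first release)": date(2011, 2, 24),
    "Penguin (first release)": date(2012, 4, 24),
}

def coinciding_updates(drop_date, window_days=3):
    """Return any known updates within +/- window_days of the traffic drop."""
    window = timedelta(days=window_days)
    return [name for name, d in ALGO_UPDATES.items() if abs(drop_date - d) <= window]

print(coinciding_updates(date(2012, 4, 25)))  # ['Penguin (first release)']
```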
Another tip: Check your Messages inbox in Webmaster Tools. Google is really good about sending out messages to site owners before (but in some cases, after…) they hand down a manual penalty. If you have a message from Google, you can be fairly sure it’s a manual penalty you’re facing, as Google does not send out messages to sites affected by algo updates. Response time is important, especially if Google sends you a message prior to the penalty. If you don’t respond in a timely manner and address the concern, Google will go ahead and implement the penalty. Try to make it right before the rank-spank!
One more tip: Google's messages are usually specific. Don't wonder why you're getting the penalty – Google will probably tell you. For example, if your links are the problem, the message will say specifically that your links appear to violate guidelines, rather than pointing at your content.
Tracking & Trashing Spam: Google’s Getting Better!
If you look back at the timeline of algo updates, Google only started rolling them out with regular frequency a couple of years ago, with the first release of Panda in February 2011. Eric believes this reflects a major shift in Google's ability to chase down SEO practices it does not like, and spank accordingly. Simply put, this capability didn't exist before.
Panda: What It Is, And How To Recover
Primary causes of a Panda penalty include:
- Poor quality content
- Thin content
- Large-scale, non-differentiated content
- Massive-scale duplicate content
- Content with poor user interaction metrics
Panda does NOT involve links. It’s about content. Here are some other factors Panda considers in the big picture of content:
Sameness! A Google search for “frogs” brings back a SERP with content on frogs. Fair enough. User clicks on the first link, which has content about frogs being green, water-dwellers, and not to be confused with toads. User doesn’t get the info he/she wants, clicks back to SERP, clicks to second link. More or less the same information is there, laid out differently. Not duplicate content, but the same kind of content. User clicks back to SERP and clicks on third link. Same story.
By now, the user is frustrated! This is where Google's perspective comes from, and it's what you have to understand for the entire content strategy of your site – well beyond Panda: your site must be worthwhile… to Google and to users. What unique value are you providing? How are you different? Are you different enough? Your page has to be different enough – in a way Google can measure. If you have a Panda issue, it may be that your pages simply aren't different enough.
Search Results Interaction. If someone clicks on the first result, clicks back to the SERP, then clicks the second result – and this happens a lot – that's not good for the first site. Google will take notice. They're measuring user engagement with content.
Chrome Blocklist Extension. Google also has a Chrome extension that was a big factor in the second release of Panda (April 2011) – it started taking input from people using the extension to say, "I don't want to see this site anymore!" If a lot of people indicate they don't want to see your content, they're essentially voting your content down.
Site Previews. If a lot of people pull up the in-SERP preview of your page but don't click through, they're saying, "You showed me the page and I chose not to go there." Wahhnn wahhnn.
Social Metrics? Currently, there aren't concrete ones… but Google has confirmed that it measures the time users spend on your site. People click on articles in the search results with author snippets – if they spend 2-3 minutes on the site and then click back, Google adjusts their SERPs accordingly.
Metrics you can use…
- Google Analytics – pages per visit, average visit duration, bounce rate, % of new vs. returning visits. These are signals of whether your content is earning engagement; use them, and look at them from a user-engagement perspective.
- Competitive data services like Compete.com offer insightful goods. Eric shows page views per visit and average stay on a line graph for his site and a competitor's site. How does the engagement of your content measure up against the engagement of competitors' content? Think about how Google sees this data and uses it.
- Visits per person is also an interesting metric. How many times does the average person come back?
Eric's advice: "If you're not doing well, do the right things to your site. Do the hard work!"
Find Poor Quality Pages
Look at pages and assess their quality. Think back to the frog example. Did you say anything new? Is there anything useful on the page that’s not being done on other pages?
Find Pages with Bad Engagement
Check out audience overview and look at bounce rate vs. unique visits. High bounce rate and little traffic indicate there’s a problem. 100% bounce rate and 21 unique visitors? Ouch. Kill the pages. If you have a Panda situation going on, these are probably part of your problem. Take them out. Or NoIndex them. Not a lot of traffic anyway? Good chance they’re hurting you.
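If you'd rather not eyeball every row, here's a minimal sketch for flagging candidates from a Google Analytics pages export – the column names and thresholds are assumptions to adjust to your own export and risk tolerance:

```python
import csv

# Flag pages with low traffic AND a high bounce rate as Panda cleanup candidates.
# Assumes a CSV with "page", "unique_visitors", and "bounce_rate" (0-1) columns.
def find_weak_pages(path, max_visitors=25, min_bounce=0.90):
    weak = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visitors = int(row["unique_visitors"])
            bounce = float(row["bounce_rate"])
            if visitors <= max_visitors and bounce >= min_bounce:
                weak.append((row["page"], visitors, bounce))
    return weak

for page, visitors, bounce in find_weak_pages("pages.csv"):
    print(f"{page}: {visitors} uniques, {bounce:.0%} bounce -- improve, noindex, or 301")
```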
Solution 1: 301 redirect. Great to do with a poor quality page. Redirect to the closest match. This is another form of "killing" the page.
Solution 2: NoIndex. Think users need the page, but Google isn’t valuing it? Maybe traffic isn’t high and the bounce rate is high? NoIndex it so it stays on your site, but is taken out of the Panda calculation for Google.
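For the mechanics of Solutions 1 and 2, here's a minimal sketch assuming a small Flask app with hypothetical routes; on most sites you'd do the same thing in your server config or CMS, and a <meta name="robots" content="noindex"> tag works just as well as the header:

```python
from flask import Flask, make_response, redirect

app = Flask(__name__)

# Solution 1: permanently redirect a retired thin page to its closest match.
@app.route("/frogs-overview")          # hypothetical URL
def frogs_overview():
    return redirect("/frogs", code=301)

# Solution 2: keep the page live for users, but ask Google not to index it.
# X-Robots-Tag is the HTTP-header equivalent of a robots "noindex" meta tag.
@app.route("/frogs-printable")         # hypothetical URL
def frogs_printable():
    resp = make_response("Printable frog guide for the users who still need it.")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```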
Solution 3: Improve the page. Spend a ton of money and get a great content team together to improve the pages. This is the best solution in many cases; it just depends on how important the pages are to you. Make sure they do something really unique and different.
Then…. Be patient. Panda now does rolling updates. You may need to wait up to 3 months for Google to find your changes and index them. It depends on your site’s crawl rate.
Key lessons: Be very harsh! Be aggressive. Cut to the bone. Get out from under the problem and work your way up. The cycle to see results is very long. There’s no win in trying to cut corners.
Why do certain pages get away with thin content? (Like Amazon for shopping results, etc.)
- Overall site authority. Brand matters to Google. It’s a good signal of authority and that users like the site.
- High user engagement and links.
Andrea is up next to discuss Penguin.
Do you have inbound links from any of these shady sites? If you do, get rid of them:
- Article directories. Most are poor quality. Some are okay, like Yahoo Directory, dmoz, business.com, and so on – maybe one or two very industry-specific, high-quality ones… otherwise, they're all bunk.
- Country Domains Where You Don’t Sell
- Foreign Sites With Links in Different Languages
- Comment Spam. Comes in all different forms. Be wary.
- Guest Post Spam. Guest posting is a good way to get links to your site if it's done correctly, but there's a lot of spammy junk out there. Links in the attribution (author byline) are okay… but links within the article body read more like an advertisement for the site – a signal to Google that they may be paid for. If they're NoFollowed, they won't hurt you. Also keep in mind the quality of the guest post itself.
- Guest Posts Not Directly Related To Your Site. Also very common. Feedback from Google suggests those links and that content are gross.
- Links Within the Guest Post. As previously mentioned… suspicious!
- Non-relevant Infographics
Sidebar: Does your anchor text profile look phony or natural? Phony distribution: a large, even distribution across key phrases, with only a small percentage of domain or brand-name anchor text. Natural distribution: the majority is brand names, then domains, then some generic keywords, with targeted keywords representing a smaller share of the overall profile.
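Here's a rough sketch of how you might eyeball that distribution from an anchor text export – the brand, domain, and generic lists are placeholder assumptions to swap for your own:

```python
from collections import Counter

# Bucket anchor text into brand / domain / generic / targeted-keyword groups.
DOMAIN = "yourbrand.com"   # placeholder
BRAND = "yourbrand"        # placeholder
GENERIC = {"click here", "here", "website", "read more", "this site"}

def classify(anchor):
    a = anchor.lower().strip()
    if DOMAIN in a:
        return "domain"
    if BRAND in a:
        return "brand"
    if a in GENERIC:
        return "generic"
    return "targeted keyword"

def anchor_profile(anchors):
    counts = Counter(classify(a) for a in anchors)
    total = sum(counts.values())
    return {bucket: round(n / total, 2) for bucket, n in counts.items()}

# A natural profile skews toward brand and domain anchors; a phony one skews
# toward an even spread of targeted keywords.
print(anchor_profile(["YourBrand", "yourbrand.com", "click here", "best cheap widgets"]))
```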
- Misleading Anchor Text. Not as obvious… but still a no-no! Think "guide to paying for college" pointing to the homepage of a finance site rather than to an actual guide. Booo.
- Sites with Malware. No, no!
- “Sponsored” Links. Even on a quality site, even on an .edu! Not good. Remove those links.
- Footer Links. Gross paid links in the footer along with the basics: Home, About, Contact, etc.
- In a List of Non-Related Links. Sometimes these lists are framed as "resources," but the links are actually quite unrelated. If you see your link on a list like that, remove it.
- Poor Quality Sites. Thrown together paid links / link farm page. Gross.
Simplifying The Clean-up Process
Eric took the stage again to run through a sample link profile and how to simplify the clean-up process for this massive project.
Sample Link Profile:
- Links: 1,273,452
- Linking Domains: 11,897 (Out of the 11,897 linking domains, 231 were good. The rest needed to be removed. Ouch!)
OMG almost 1.3 million links?!?! I cannot go through all of these! How do I get started?
Simplify.
It’s the number of linking domains that matters.
So first things first: 11,897 is way less than 1,273,452. That said, you still have to look at every single linking domain.
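Collapsing the raw link export down to unique linking domains is the first simplification. A tiny sketch, assuming a backlink export with one full linking URL per line (filename hypothetical):

```python
from urllib.parse import urlparse

# Reduce a list of linking URLs to the set of linking domains.
def linking_domains(path):
    domains = set()
    with open(path) as f:
        for line in f:
            host = urlparse(line.strip()).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return domains

domains = linking_domains("backlinks.txt")
print(f"{len(domains):,} linking domains to review")
```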
"Your task isn't to do a lot of work to clean up your link profile. Your task is to do ALL the work!" Eric points out that's a hard thing for people to accept, especially because Google was so bad at detecting bad links for so long that people started to believe the practice was permitted. That's not true; Google just wasn't good at catching it – but now it is. So you really do have to do ALL the work.
Don’t worry. Just keep on simplifying your task:
Continue by Categorizing Those Linking Domains
For example:
- Blogs
- Pages with multiple links
- Sites with more than one linking page
- Rich anchor text links (not forbidden, just be wary of patterns suggesting there’s too much)
- Links from comments
- Links from the same C block
- Nofollow’ed links (highlight these, because you don’t even have to look at them – not the source of a link penalty)
This speeds analysis and cleanup! Keep on making the analysis process easier. Compartmentalize the link groups and assign them out to people specialized in that kind of link. For example, Eric takes the blog links and gives them to people trained to deal with blog links (there are even cheap solutions – college students, for example).
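A first-pass bucketer can do the rote sorting before the humans take over – a sketch, assuming a domain-level CSV export whose column names and crude rules are placeholder assumptions, not a finished tool:

```python
import csv
from collections import defaultdict

# Sort linking domains into review buckets. Assumes columns "domain",
# "anchor_text", "nofollow", and "source_type" in the export.
def bucket_links(path):
    buckets = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["nofollow"].lower() == "true":
                buckets["nofollowed (safe to skip)"].append(row["domain"])
            elif row["source_type"] == "comment":
                buckets["comment links"].append(row["domain"])
            elif row["source_type"] == "blog":
                buckets["blogs"].append(row["domain"])
            elif len(row["anchor_text"].split()) >= 3:
                buckets["rich anchor text (watch for patterns)"].append(row["domain"])
            else:
                buckets["everything else"].append(row["domain"])
    return buckets

for bucket, domains in bucket_links("linking_domains.csv").items():
    print(f"{bucket}: {len(domains)} domains")
```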
Automated Tools: A Good Solution?
Fully automated tools can't do the entire job. There are great tools out there intended to help you clean up links to your site: you run the tool, it tells you what to do, you do it… but a tool alone will NEVER get you back in. They're not thorough enough, and they can't learn the nuances of what you're trying to do.
Criteria for Link Evaluation
As you move forward assessing each link, consider…
- Would you have invested the time to get this link if Google did not exist?
- Would you proudly show this to a prospective customer, pre-sale?
- Would you show it to your children?
- Did the person giving you the link mean it as a genuine endorsement? (Google might even start to discount infographic links in the future, if the infographic is irrelevant to your site.)
- If you have to argue that it’s a good link, it’s not. A good link requires no argument! If you have to think about it, it’s gotta go!
Final Thoughts
- Google wants links to be like an academic citation. You can’t buy that sh*t. It’s genuine! The whole ranking algo is built on this foundation.
- Google doesn’t want it to be bought, bartered, or stolen.
- Google reviewers are only human, AND that human may be in a bad mood on the day he/she is reviewing your site. Keep in mind: you started out as a spammer in their eyes. Their living is built around improving search quality, and you violated it. You have to set the table so they feel good about what you've done. That's why comprehensiveness is important. If you get lazy and just shove your links in the disavow tool, that's what they'll see you as: lazy. They won't believe you have reformed – show them you have! Be respectful. Don't whine about what you lost. Be short and to the point.
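If, after genuine removal outreach, a handful of domains simply won't come down, the disavow file itself is just plain text: one URL or domain: line per entry, with # lines as comments. A minimal sketch with hypothetical domains and notes:

```python
# Write a disavow file for the domains that could not be removed after outreach.
leftover_domains = ["spammy-directory.example", "linkfarm.example"]  # hypothetical

with open("disavow.txt", "w") as f:
    f.write("# Contacted both site owners twice; no response.\n")
    for domain in leftover_domains:
        f.write(f"domain:{domain}\n")
```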
Finally, Charley was up to share a case study of one site's experience with penalties and reconsideration requests. My hands are cramping, so let's blaze through it at a high level!
- Jan 6 – site received a low quality page message
- Traffic dropped off by 75% starting that day
- Actions taken: eliminated 1k+ thin-content, template-based "city" pages, which had been added in summer 2012; added strong new content to the 48 "state"-based pages remaining on the site; submitted a reconsideration request through Webmaster Tools.
- Jan 25 – Google’s Written Response: “You didn’t do enough. We believe some or all of the pages still violate our quality guidelines.”
- Meanwhile, Google partially lifted the penalty. There was a slight increase in traffic.
- Actions taken: further improvements to onsite content, combined several top-level category pages to reduce possible duplicate content issues. Submitted second reconsideration request.
- March 12 – another response from Google: now it's links that violate the quality guidelines. What? This started with low-quality pages, and now the third response from Google makes a 180-degree shift from content to links.
- Takeaway: you can clean up your site's low-quality pages, but if a Google engineer is looking at your site and thinks it's ill, they're looking at the whole patient. Once the patient is open on the table, Google examines everything.
- Luckily, the site now has an indication that it also has low-quality link issues.
- Actions: accelerated efforts to clean up bad links. Fortunately, the team had anticipated this and had already begun the work. A third reconsideration request is planned for this week!
FINAL GOODIE from Eric: The next version of Penguin will be a bigger shock – it will include an attack on low-relevance guest posts.
Phew! That’s all she wrote! Big thanks to the crew at Stone Temple Consulting for sharing their tip-top advice with us today. Stay tuned for more #SESNY coverage right here on AIMCLEAR blog!