This year I was working on a website project I called the Summer of 210,000 Words. It was an aggressive experiment that focused on content and a mix of mostly white hat and a touch of gray hat SEO tactics.
It started taking form around February 2011.
It hit its stride in June 2011.
When Panda 2.3 was released on June 16/17th, the site’s traffic doubled.
Then at midnight EST on June 24th, Google turned the tap off.
That morning, SEOMoz released their weekly whiteboard session. In it, SEOMoz CEO Rand Fishkin prattled on aimlessly with his trademark giddy-assed smile about the miraculous wonders of Google Panda and the algorithm’s machine computing ability, as if Navneet Panda had discovered the cure for AIDS.
My computer screen almost got a pen through it. I was a wreck because I wasn’t sure what had happened, so for the next two days I read everything I could find to see what I could do to Panda-proof my site.
I read Google Fellow Amit Singhal’s list of signals that Google considers for a high quality site aka – the Google Panda Quality Guidelines. I about threw up. They were so superficial and condescending…yes, I’ve written a high school report before, thanks, a**h***. My favorites were as follows:
- Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
- Does the article describe both sides of a story?
- Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
As if in a horrific moment, all the diverse cultures, topics and interests that reflect the many nations, communities and ideas were silenced and glazed over in a shuddering shade of gravestone gray. Because now, video, comic, picture and forum sites would be rendered obsolete if you took their explanations verbatim.
Let’s not even talk about giddy-ass’s suggestions:
“If I get to a page about a motorcycle part and I am like, “God, not only is this well written, it’s kind of funny. It’s humorous. It includes some anecdotes. It’s got some history of this part. It has great photos. Man, I don’t care at all about motorcycle parts, and yet, this is just a darn good page.”
Once upon a time, Google was about relevance. You used to go to it for information. Now, they want you to go there for entertainment, or so it feels. So, if you’re in a dry, blue collar market you’re no longer supposed to provide pertinent information, you’re supposed to sing for your supper because search engines are gearing towards the addle-minded attention span of a teenager. If it doesn’t get Facebook likes or Twitter RTs – and of course Google +1s – it can’t be cool. Right?
Thanks, asshole.
Oh look, a bird…
But these are the rules we have to abide by now.
How to Optimize for Google Panda
After I got over his on-screen orgasm, I discerned some of Rand’s finer points for my first round of Google Panda changes:
1) “Optimize around your user / usage metrics” – all those things in Google Analytics – time on site, pages per visit, bounce rate…they all come into play. Once again, Google & SEOMoz’s KFC / Colonel’s Secret Recipe strategy assumes you know what your metrics are. For a blog or news site, anything over one page per visit is what I consider good. Shopping sites…figure about 4 or more. Time on site? Do cartwheels if you can get over 50 seconds.
Other changes to improve on usage metrics:
- add related posts / articles feature
- add most read / featured column
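To show how I triaged pages against those engagement thresholds, here’s a minimal sketch. The page names and metrics are hypothetical, and the cutoffs are just my rules of thumb from above, not anything Google publishes:

```python
# Flag pages that fall below rough engagement floors.
# Thresholds are my own rules of thumb for a blog/news site.
PAGES_PER_VISIT_FLOOR = 1.0
TIME_ON_SITE_FLOOR = 50  # seconds

def flag_weak_pages(metrics):
    """metrics: {url: {"pages_per_visit": float, "avg_time": int}} -> sorted list of weak URLs."""
    weak = []
    for url, m in metrics.items():
        if m["pages_per_visit"] <= PAGES_PER_VISIT_FLOOR or m["avg_time"] < TIME_ON_SITE_FLOOR:
            weak.append(url)
    return sorted(weak)

# Hypothetical numbers, not a real Analytics export:
sample = {
    "/good-post/": {"pages_per_visit": 2.3, "avg_time": 72},
    "/thin-tag-page/": {"pages_per_visit": 1.0, "avg_time": 14},
}
print(flag_weak_pages(sample))  # ['/thin-tag-page/']
```

Anything this flags is a candidate for the related-posts treatment above, or for the pruning in point 3.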
2) Quality Content vs. Optimization – Rand’s really got a “That Was Easy” button up his a** about being more intrinsic with your content. Go for the great content as opposed to obsessing over where the keywords are placed. Anecdotes, as if you’re giving a Rotary club speech, and humorous -isms. My advice: do both. SEO copywriting is an art & a science. I’ll still use my onsite SEO checklist, though I won’t worry as much about where the keywords are spaced throughout the content.
3) Cut down on superfluous content – When Fishkin said, “if you have a bunch of pages that are low quality on that site, they can drag down the rankings of the rest of the site,” I think that’s when I had my emotional crash. For example, my site had 300 posts and 950 tags. Those tag pages didn’t even have snippets. Neither did the category pages. So, after I cleaned up the category URL structure, removing /category/ from every category and 301ing the old links to the new, I removed almost all the tags. I only kept the ones I could attach to at least 10 posts. Then I set it so every article in the tag pages had snippets, and did the same for categories. Finally, I went back, wrote at least 200 words of content for each category and tag page, and filled out the metadata.
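For the /category/ cleanup, the 301s can be done in a couple of lines of rewrite rules. This is a hedged sketch assuming an Apache server with mod_rewrite, not my actual config – adjust to your own URL structure:

```apache
# Hypothetical .htaccess sketch: permanently redirect
# /category/whatever/ to /whatever/ so old links keep their value.
RewriteEngine On
RewriteRule ^category/(.+)$ /$1 [R=301,L]
```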
4) No-Followed all my advertising links – I manually put rel=”nofollow” on all tracking and advertising links. I also reduced the number of their appearances from 5 to 3. While I was at it, I read mixed-reviews on the internal advertising link management system so I removed it.
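The nofollow itself is just an attribute on the anchor tag. A minimal example (the advertiser URL is made up):

```html
<!-- ad / tracking link marked so it passes no link equity -->
<a href="http://ads.example.com/click?id=123" rel="nofollow">Sponsor</a>
```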
5) Removed anything that could appear to be blackhat offsite SEO – I was working with a consultant who went above and beyond the “engagement strategy,” chasing quantity of links instead of quality, and after the fourth argument, I stopped it all. A week later, the site’s traffic stopped. Figuring offsite SEO was a factor, I deleted all the pages that had offsite SEO running to them, repurposed the content, and waited 8 weeks for Google to filter the pages out.
6) Updated the Hell out of my Robots.txt file – With help from Douglas Karr & Allyn Hane, I cranked up what my robots.txt file filtered out, going after an assortment of comment links, and filtered out the blog roll from the site…the .com/page/2/ and so on. I also nofollowed all the .com/feed links that got indexed, as well as the internal ad links that also started showing up in Google’s index grrr*(&(*@#&@. Why? Even though I had removed the system, the links appeared on other sites thanks to some scraper a*******.
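For reference, a robots.txt along those lines might look like this – a sketch assuming WordPress-style paths, not my actual file:

```
User-agent: *
# block the paginated blog roll (/page/2/ and so on)
Disallow: /page/
# block feed URLs from being crawled
Disallow: /feed/
Disallow: /*/feed/
```

Note that robots.txt only controls crawling; for links already sitting in the index, the nofollow and removal steps above still have to do the heavy lifting.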
Then I Submitted My Google Reconsideration Request
Much to the facepalm of Allyn Hane.
“[Finn] You’ve got balls.”
“I’ve done it before.”
The result….
“Reconsideration request for http://www.uRL.com/: Site violates Google’s quality guidelines”
Wait, What Does That Mean?
It means it wasn’t Panda.
How’s that Possible?
We double-checked our server logs and discovered that on June 23rd, Googlebot had deep-crawled the site. What we realized is that when Google does a Panda update, they also deep crawl everyone’s site – thus the need for all the computing power, and the reason they can only run it manually and not let it run 24/7. So when Googlebot deep-crawled my site, they found stuff they didn’t like and gave me a manual penalty.
What I Did Next
I talked to a couple of old friends and found one who specialized in this sort of thing (for fear of giving away her actual covert job, I’ll leave it at that). She gave me a nice little hint,
“I won’t tell you the secret, but I will say that you’d probably be surprised at what would happen if you actually finished building out your website pages.”
Then I Realized What I Needed to Do
Working at the old agency, we’d seen the build of at least 1,000 websites. All of them had template Contact Us, Privacy Policy, How It Works & Terms of Use pages. I was so used to seeing them that I took them to be givens. I hadn’t yet published those pages. Why? I’m a f***-up who figured I’d finish building out the site once it got to a certain number of hits a day. So I did the following:
1) Published the unpublished Privacy Policy page
2) Published the unpublished Contact Us page – including physical address
3) Added an Industry Resource & Work with Us page
4) Moved the blogroll over to .com/blog
5) Put those new pages in the header navigation
6) Put the new pages as well as the categories into the footer for a Fat Footer
7) With the category links now in the footer, removed them from the sidebar, cleaning up the look
8) Put the site social buttons (Google, FB & Twitter) in the header along with the site search bar
9) Made the Home Page – including a JavaScript slider, 400 words of content describing the site and its purpose, recommendations to other feature pages (blog, contact, resources), and a featured Top 25 posts – per a Matt Cutts Google Webmaster video I saw.
10) Resubmitted the Reconsideration Request to Google on Halloween night after the kids trick-or-treated.
The result?
“Reconsideration request for http://www.url.com/: Manual spam action revoked”
Got my Google traffic back 36 hours later. It’s still crippled – the pages I removed out of fear of rogue offsite SEO used to be most of my traffic – but it’s running again. Up to 1,200% of what it was a month ago. And I’ve got a lot of catching up to do.
How to Tell if the Loss of Traffic Is Google Panda or Google Spam Penalty
Depending on your industry, the season, and whether you write for stuff that’s promoted or purely for SEO, the reasons could be several. As Dr. House would say,
“When in doubt, plan broad.”
With that in mind, try these rules of thumb:
1) If you lost 30% of your traffic ceteris paribus, it was probably Panda. If you lost 90%, it was probably a Google spam penalty.
2) If your ranks dip only a few spots or to page 3, it’s probably Panda. If all your pages sit no better than page 5, including for your domain, it’s probably a Google spam penalty.
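If you want those two rules of thumb as a cheat sheet, here’s a toy sketch. The thresholds are mine from above, not anything Google publishes, so treat it as a starting guess, not a diagnosis:

```python
def diagnose(traffic_loss_pct, worst_serp_page):
    """Toy heuristic: % of traffic lost, and the SERP page your rankings fell to."""
    if traffic_loss_pct >= 90 or worst_serp_page >= 5:
        return "likely Google spam penalty"
    if traffic_loss_pct >= 30 or worst_serp_page >= 3:
        return "likely Panda"
    return "probably neither"

print(diagnose(30, 3))   # likely Panda
print(diagnose(95, 6))   # likely Google spam penalty
```

Either way, checking your server logs for a Googlebot deep crawl (as in my story above) is the real tell.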
End Game
It’s been 15 days and things are getting better. Some keywords I used to do offsite SEO for are ranking better without it. The site’s also older. If research and experience mean anything, I’ll get over Google’s next big trust hurdle around February 2012 and get another uptick, and if things are still good when I reach May 2, 2012, I should be back to all systems go and in Google’s full trust. This means the possibilities could be endless.
I don’t even hate Rand Fishkin as much anymore, but things stick with you a certain way, depending on the mood and situation.
It will be a long year, but it will be worth it in the end.
But this is my Google Panda Story, For Now
I hope this helps.
Image Credit: Wikimedia.org under creative commons for use with modification – at least, according to Google.