Speaking at the SES San Francisco conference on Wednesday 15th August, Matt Cutts, the head of Google’s webspam team, confirmed that another large update to the Penguin algorithm is on its way and that Panda will continue to receive monthly data refreshes. Website owners should therefore brace themselves and start undertaking any necessary remedial work immediately, in order to avoid being a ‘loser’ when the update rolls out.

If you own a website, it is likely that you have heard of Google’s Panda and Penguin algorithm updates, and you may even be aware that they have already affected your organic search engine rankings in some way; most likely they have caused a fall in your positions. But do you understand what each update is concerned with and, most importantly, what steps you need to take to get your website ranking again? If you’re about to stop reading because your web team or SEO agency ‘has it in hand’, I would recommend you don’t. Unless you understand the pitfalls and nuances of SEO, how do you know that those working for you are acting swiftly and correctly enough to reverse any drop in positions?

Similarly, you may be thinking you don’t need to read on because your rankings have remained stable, but as each algorithm update builds upon the last, it could just be a matter of time before your site is hit. The next Penguin update could turn out to be small and be dubbed version 1.2, but as the algorithm is still in its infancy it is more likely to be a significant version 2.0 (as Panda’s early updates were); Google’s Cutts said to “expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact”. On 24 July, Panda version 3.9 was released, only one month after 3.8; Panda data refreshes are now rolling out monthly and, whilst the signals are stable and the impact relatively small, who knows whether version 4.0 will drastically affect your rankings, visitor numbers and, ultimately, your sales.

First, let’s tackle Panda, which was originally rolled out in February 2011 in the US, followed by an update in April 2011 for all English-language Google users. Panda continues to work its way around the world and was only launched in Japan and Korea last month. The algorithmic change was said to impact 11.8 per cent of search queries and was predominantly designed to stop low-quality sites ranking highly in the results pages. Websites with poorly written and duplicate content were affected, as were those with substandard information or unpleasant design or navigation; as a result, sites that allow others to upload content with little or no editorial approval were amongst the hardest hit. Google’s Panda was essentially asking, “do I totally and completely trust the site, its owners and the information provided?”.

As I said, Google has been continually updating Panda since then, and one of its Webmaster Central Blog posts from May 2011, entitled ‘More guidance on building high-quality sites’, is an invaluable resource for any site owner. The advice is not to fixate on Panda but to focus on providing an all-round quality site for visitors; nevertheless, because Panda’s attention is on quality, following it essentially aligns a site with Google’s mindset. Do read this blog post if you haven’t already, as it contains one of the most helpful sets of pointers the search giant has ever provided, but you must be honest with yourself about the answers. Here are two key questions from the post:

“Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?”

“Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?”

Sites with topics and content driven by genuine interest are obviously the ones that Google favours, not those padded with similar pages designed to capture slight keyword variations. Whilst it may be hard work and a drain on resources to audit, remove, repurpose and create new content for your site, nobody said optimising a website should be easy. Google and the other search engines hold the power in this respect and, quite rightly, they are in the business of providing the most relevant answers to queries as quickly as possible. Taking off my SEO hat for a moment, I too want to be offered helpful results, not a list of sites that have ‘gamed’ their way into the rankings with poor content that is of no use to me. Consider it this way: being positioned high in the rankings and obtaining visitors is a waste of time if they reach straight for the back button after landing on a mediocre page.
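
If you want a starting point for that content audit, near-duplicate pages can be flagged programmatically. The short Python sketch below compares pages by their overlapping word triples; it assumes your page copy has been exported to plain-text files in a ‘pages’ folder, and the 0.8 similarity threshold is purely illustrative, so treat it as a rough first pass rather than a definitive tool.

```python
# Minimal sketch: flag near-duplicate pages by comparing word shingles.
# Assumes page text has been exported to .txt files in a 'pages' folder;
# the folder name and the 0.8 threshold are illustrative, not prescriptive.
import itertools
import pathlib

def shingles(text, size=3):
    """Return the set of overlapping word triples in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    """Jaccard similarity of two shingle sets (0 = distinct, 1 = identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

pages = {p.name: shingles(p.read_text(encoding="utf-8"))
         for p in pathlib.Path("pages").glob("*.txt")}

for (name_a, sh_a), (name_b, sh_b) in itertools.combinations(pages.items(), 2):
    score = similarity(sh_a, sh_b)
    if score > 0.8:  # high scores suggest keyword-variation rewrites
        print(f"{name_a} <-> {name_b}: {score:.0%} overlap - consider merging")
```

Any pair scoring close to 100 per cent is a strong candidate for merging or rewriting, though human judgement should always make the final call.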

Next, let’s move on to Penguin. If quality is the label attached to Panda, then webspam is the watchword for Penguin, which burst into the world on 24 April this year. Generally speaking, it targets attempts to gain better rankings that violate Google’s quality guidelines for publishers. These include cloaking, underhand redirects, participating in link schemes and obtaining links from low-quality sources; the deliberate creation of duplicate content and keyword stuffing, which clearly overlap with Panda’s remit, are also on the list of dodgy SEO techniques. Technically, Penguin doesn’t penalise websites involved in these ‘black-hat’ activities, but if your site has been ‘adjusted’ downwards, it will certainly feel like a punishment. Google predicted that about 3 per cent of search queries would be affected when the algorithm launched, with a further 0.1 per cent affected on 26 May, when version 1.1 was released.
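
To make one of those violations concrete: keyword stuffing is simply a page repeating a target phrase far beyond what natural writing would produce. The rough Python sketch below counts word frequencies on a single page; the URL and the five per cent warning level are illustrative assumptions of mine, as Google publishes no official threshold.

```python
# Minimal sketch: a crude keyword-density check for one page, to spot
# obvious keyword stuffing. The URL and the 5% warning level are
# illustrative assumptions; there is no official Google threshold.
import re
from collections import Counter
from urllib.request import urlopen

url = "https://www.example.com/"  # hypothetical page to check
html = urlopen(url).read().decode("utf-8", errors="ignore")
# Strip scripts, styles and tags very roughly to leave visible text.
text = re.sub(r"<script.*?</script>|<style.*?</style>|<[^>]+>", " ",
              html, flags=re.S)
words = re.findall(r"[a-z']+", text.lower())

counts = Counter(words)
total = len(words)
for word, n in counts.most_common(10):
    share = n / total
    marker = "  <- suspiciously dense?" if share > 0.05 and len(word) > 3 else ""
    print(f"{word:15s} {n:5d}  {share:.1%}{marker}")
```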

Despite Google’s ongoing attempts to fight such spammy SEO techniques over the last ten years, many of them have continued to work to some extent, so webmasters, somewhat understandably, carried on with these activities or left previous undertakings live. Penguin therefore came as a huge shock to some, and many site managers found they had inherited a whole host of issues thanks to their predecessors’ activities. Again, a site audit is a good place to start, and I would strongly recommend you undertake a review of your backlink profile, that is, which domains are linking to your site; ‘bad’ links could quite easily be why you’ve suffered a ranking drop, and if this is the case, you must act quickly. Voluminous links from poor-quality and irrelevant online directory listings, forums and link farms are the most common culprits, and it’s very likely Google has sent a warning message to your Webmaster Tools account. A post from Google’s Webmaster Central Blog on 27 July is a great resource for gaining greater context about these messages and what to do about them, but the long and short of it is that you should remove the links identified as webspam and then submit a reconsideration request.
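
As one hedged illustration of what that backlink review might look like in practice: most link-research tools will export your backlinks to a CSV file, and a few lines of Python can then summarise them by linking domain so the noisiest sources stand out. The file name, the ‘source_url’ column and the 50-link threshold below are all assumptions on my part; adjust them to match whatever export you have.

```python
# Minimal sketch: summarise a backlink export by linking domain so the
# noisiest sources stand out. Assumes a CSV with a 'source_url' column,
# such as the exports most link-research tools provide; the file name,
# column name and 50-link threshold are all illustrative assumptions.
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["source_url"]).netloc.lower()
        if host:
            domains[host] += 1

print("Linking domains by volume (review the top entries by hand):")
for host, count in domains.most_common():
    flag = "  <- heavy linker, check for directory/link-farm patterns" if count >= 50 else ""
    print(f"{count:6d}  {host}{flag}")
```

A domain sending you hundreds of near-identical links is not automatically bad, but it is exactly the sort of pattern worth checking against the directory and link-farm culprits mentioned above.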

With a basic understanding of the major algorithm updates, you can not only be better prepared for future changes, such as the impending Penguin update, but also ensure that all past activity conforms to Google’s wishes. Although it is likely that a webmaster, web team or agency takes care of your SEO efforts, as a business owner, having this grasp of events will give you peace of mind and allow you to make informed decisions about activity and resources. Undertakings that may once have propelled your website up the search engine rankings could now be the barrier to regaining such a high position. Ultimately, it is your responsibility to review and investigate all on- and off-site elements, make the necessary changes and play by the rules.

Keredy Andrews

Contributor