Panda 4.2 in Review – The ‘Bear’ Naked Truths You Need to Know

It has been almost two months now since the official introduction of Panda 4.2, and many in the industry are even more perplexed by its impact – or lack thereof – than in previous Panda rollouts. Where some Webmasters are starting to develop an air of bitterness towards the Search giant, others seem to be taking everything in their stride and accepting that Google is, for want of a better word, gloriously erratic.

Let’s take it from the top: the eagerly awaited Panda 4.2 was greeted with apprehension when it was officially introduced nine months after its predecessor – an unusual move on Google’s behalf. Gary Illyes, who initially broke the news via a tweet to Search Engine Roundtable’s Barry Schwartz, added that 4.2 would ‘take a few months to complete’ – unlike any previous Panda rollout. Stranger still was the revelation from Webmaster Trends Analyst John Mueller that the extended rollout was caused by technical problems at Google’s end.

This news has not inspired confidence among business owners and SEOs alike, as it complicates the ability to attribute changes in traffic to a particular algorithm update. In previous rollouts, Webmasters have been able to watch – in real time – how updates affected their website’s performance and make changes accordingly. Without the ability to do this, mass hysteria has begun to unfold; naturally. What do we think? It’s time to dig in, because this, guys, is a war of attrition!


OK, so, as you can imagine, there is a lot of speculation circling the search-sphere, with many Webmasters finding it hard to determine whether or not the infamous Panda 4.2 has actually affected their website. There are plenty of case studies which attempt to resolve this, but with so many conflicting opinions, it is very difficult to draw concrete conclusions about what is happening. After wading through the amassed data, it is our opinion that Glenn Gabe (of G-Squared Interactive) has developed the most astute theory. Here is the summary:

Many people have been planning ahead for the introduction of Panda 4.2 and hoping for a revised Panda score across their website (basically, a positive change in organic traffic). However, what many have received – and what is causing further commotion – is a small increase in traffic which has then been retracted. In some cases, no change has been observed at all.

We know what you’re thinking… what is Google playing at, right?

Well, in most recorded cases, these websites have not gradually improved with the introduction of 4.2 but have actually seen a jump or surge in traffic. These abrupt spikes have then been swiftly reversed and have evened out. Webmasters have essentially been delighted with the initial reward for their clean-up efforts, only to have it taken away a couple of weeks later. It is understandable why this clawback of traffic may have encouraged them to cry foul; however, it does not change anything.

One of the more public cases comes from Barry Schwartz and relates to his site, Search Engine Roundtable. Below is a screenshot provided by Barry, as featured on Search Engine Land.


Rather than labelling these increases as confirmed ‘recoveries’, it is more rational to view them as tests or tremors – something we all know Google are prone to. Test surges show Google how the new algorithm changes will actually impact contentious websites and how this will be reflected in SERPs. Once the data from the test phase has been collated and analysed, a more comprehensive refresh can be rolled out. It is important to remember that Google have already announced that Panda 4.2 will be an extended update due to technical issues, which, if anything, only supports the theory that the update may still be in beta.

In principle, it is quite feasible that websites which have seen an increase may later be hit by Panda, and that those which have seen no impact may recognise a recovery at a later date.

To quote Justin Bieber: “what do you mean?”

It is simple really – don’t be rash; consider all influences; context is key!

The common opinion during any major update is that when organic traffic changes, well, it must be the algorithm, right? Wrong. It is not that simple, and even if it ever was, that is certainly not the case now. Attributing traffic signals to an update, unless it is glaringly obvious, should be the last port of call. Why? The playing field is too ambiguous. In order to fully comprehend how a website is being interpreted, all contributing factors must be exhausted before changes are associated with an update.

Although this seems long-winded and very taxing, it should provide a more stable foundation from which to draw a conclusion. Gone are the days of customary Panda-pointing; you’re better than that and we know you are! Dig in, dig deep, and we assure you that you will be justly rewarded (eventually).

If you want to view your traffic alongside known updates, check out Barracuda Digital’s Panguin Tool which provides another visual for you.
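For a rough, do-it-yourself version of the same check, the before-and-after comparison can be sketched in a few lines of Python. Everything here is illustrative – the daily session counts are invented, and in practice you would export them from your analytics platform – though Panda 4.2’s rollout did begin around 18 July 2015. A single-digit shift like this proves nothing on its own; as above, context is key.

```python
from datetime import date, timedelta
from statistics import mean

# Hypothetical daily organic sessions (date -> sessions). In reality,
# export these figures from your analytics platform.
start = date(2015, 7, 1)
sessions = [1000 + (40 if i >= 17 else 0) for i in range(35)]
traffic = {start + timedelta(days=i): s for i, s in enumerate(sessions)}

# Panda 4.2 began rolling out on the weekend of 18 July 2015.
update = date(2015, 7, 18)

def window_mean(traffic, centre, days, after=False):
    """Mean sessions in the `days`-day window before or after `centre`
    (the centre date itself is excluded)."""
    sign = 1 if after else -1
    vals = [traffic[centre + timedelta(days=sign * d)]
            for d in range(1, days + 1)
            if centre + timedelta(days=sign * d) in traffic]
    return mean(vals)

before = window_mean(traffic, update, 14)
after = window_mean(traffic, update, 14, after=True)
change = (after - before) / before * 100
print(f"Organic sessions moved {change:+.1f}% across the update window")
```

With the made-up numbers above, the two-week averages show a modest lift after the rollout date – exactly the sort of small surge-then-plateau pattern discussed earlier, which should prompt a wider investigation rather than a verdict.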


Prior to Panda 4.2, it was already an arduous task to accurately determine what caused a hit, but this has only become – and will continue to become – even more difficult. Google updates its algorithm hundreds of times a year – with 2014 seeing a confirmed 1,000 updates – making it significantly harder to figure out exactly why a website has been penalised and what caused it.

As Google strives for impeccable SERPs, rolling out Pandas and Penguins is becoming progressively more difficult. This is why it is well known that Google fully intends to incorporate elements of Panda and Penguin into its core algorithm. When this occurs, there will be no timeframe to work from; the algorithm will be running all of the time, and Webmasters will not be able to pinpoint an update.

*sighs, takes deep breath and continues reading*

Based on all of the above, it is quite plausible that this will be the very last Panda refresh, ever.

So, what’s next?


Not that we weren’t advocating this anyway; we are now insisting on it.

There are elements of SEO which still seem to be a grey area, and many Webmasters operate a risk-based SEO strategy, however slight the risk may be. Our point is this: it is not worth taking any more risks with grey techniques, regardless of whether or not they seem to be affecting your website positively (for the time being). Although you may not get any brownie points for being the teacher’s pet, you can rest assured that no horrible surprises await you on your journey to SERP supremacy.

Your new to-do list:

  1. Constantly aim to improve the quality of your content
  2. Address all ‘risky’ issues both on and off site immediately
  3. Fix technical SEO problems as soon as possible
  4. Try to drive strong and relevant traffic to your website

That’s a wrap; nothing more and nothing less.

In the meantime, if you have any doubts about whether or not your website is Panda-proof or if your website has recently experienced a change in traffic and you don’t know why, don’t hesitate to contact us – we would be more than happy to help.
