Now that Google have officially unleashed their latest ravenous, spam-destroying, “animal beginning with a P” update (“Penguin 2.0” in this case) and webmasters have had the best part of a week to comb through the wreckage of their rankings, is there anything that the latest algorithm shake-up missed? What do webmasters think it could’ve done better?

Well, firstly it should be noted that this particular breed of penguin is not quite as vicious as its older brother, Penguin 1.0. Popular online marketing site Moz.com track the average disturbance in search engine rankings with a tool that they refer to as “MozCast”, which simplifies the complex data indicating widespread changes into easily digestible “temperatures”. The higher the temperature on any given day, the more algorithmic change is indicated.

On May 22nd, when Penguin 2.0 dropped, MozCast recorded a temperature of 80.7°F which, whilst above the average temperature of approximately 70°F, is nowhere near the SEO heat wave of Penguin 1.0 on April 24th 2012, which registered over 90°F.
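
For anyone curious about what such a “temperature” boils down to, the sketch below shows one way day-to-day ranking churn could be turned into a single number. To be clear, MozCast’s actual formula is Moz’s own; the flux measure, baseline and sample rankings here are purely illustrative assumptions.

```python
# A rough illustration of the idea described above: turning day-over-day
# ranking churn into a single "temperature". This is NOT MozCast's actual
# formula; the flux measure, baseline and sample data are assumptions made
# purely for the sake of the example.

def serp_flux(yesterday, today):
    """Average absolute rank change for URLs that appear on both days."""
    changes = [abs(today[url] - pos) for url, pos in yesterday.items() if url in today]
    return sum(changes) / len(changes) if changes else 0.0

def temperature(flux, baseline_flux, baseline_temp=70.0):
    """Scale today's flux so that an average day reads as roughly 70."""
    return baseline_temp * flux / baseline_flux

# Hypothetical top-5 rankings for one keyword on two consecutive days.
yesterday = {"a.com": 1, "b.com": 2, "c.com": 3, "d.com": 4, "e.com": 5}
today = {"b.com": 1, "a.com": 2, "c.com": 3, "e.com": 4, "d.com": 5}

flux = serp_flux(yesterday, today)                      # 0.8 positions on average
print(round(temperature(flux, baseline_flux=0.7), 1))   # 80.0 on this made-up scale
```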

Matt Cutts, head of Google’s Web Spam Team, estimated that around 2.3% of English language search queries would be affected, and added that this number could rise in other languages where spam is perhaps more widespread.

But did the Penguin bite in the right places? Across the industry there are conflicting reports of substantial ranking gains and crippling plummets, all of which is par for the course following an update of this magnitude, but amongst all the usual bemoaning/worshipping of almighty Google there are some people raising legitimate concerns about things that the latest update failed to address.
Given those shortcomings, then, what would the industry like to see addressed by future iterations of Google’s algorithm updates (let’s imagine it’s called Platypus 1.0 – there aren’t many animals beginning with P left…)?

Domain Diversity/Domain Crowding

Domain Diversity (or lack thereof) has been a persistent problem at the top end of SERPs for months now and a number of prominent industry members have been vocal in their calls to address the problem (including our very own Yousaf Sekander!), yet this is an issue that is still worryingly prevalent.
Imagine you were at a restaurant, perusing their lengthy, varied menu. Everything looks great, right? It’s nice to have a wide selection to choose from. Now imagine every item on the menu is exactly the same. Not so great, huh? That’s exactly what domain crowding at the top of SERPs is like. If a user looks at result #1 and decides, for whatever reason, that it’s not for them, what are the chances that they’ll want to see another page from the same domain at #2, #3 and maybe even #4?

Moz handily keep track of this sort of thing over at MozCast Metrics, so we can point at charts with huge scary drops that prove our point! The graph below shows Domain Diversity, which is described as “the percentage of unique sub-domains across the URLs in the data set. The less diversity there is, the more domain ‘crowding’ we observed”, and as you can see, its already unhealthy figure has dropped even more sharply following Penguin 2.0’s launch day (highlighted in the chart as 5/22).

Domain diversity has dropped sharply since Penguin 2.0 dropped on May 22nd.
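
To make that metric a little more concrete, here’s a minimal sketch of how a figure like this could be calculated from a page of results. The sample URLs are hypothetical and this isn’t Moz’s exact methodology – just the “percentage of unique sub-domains” idea quoted above.

```python
# A minimal sketch of a domain diversity calculation: the percentage of
# unique sub-domains across the URLs in a result set. The SERP below is
# hypothetical and exists purely to illustrate domain crowding.
from urllib.parse import urlparse

def domain_diversity(result_urls):
    """Return the percentage of unique sub-domains across a list of result URLs."""
    if not result_urls:
        return 0.0
    subdomains = {urlparse(url).netloc.lower() for url in result_urls}
    return 100.0 * len(subdomains) / len(result_urls)

# A hypothetical page-one SERP where one domain holds four of the ten spots.
serp = [
    "http://www.bigbrand.com/widgets",
    "http://www.bigbrand.com/widgets/blue",
    "http://www.bigbrand.com/widgets/red",
    "http://www.bigbrand.com/about",
    "http://www.smallshop.co.uk/widgets",
    "http://en.wikipedia.org/wiki/Widget",
    "http://www.retailer.com/widgets",
    "http://blog.retailer.com/widget-guide",
    "http://www.forum-example.com/threads/widgets",
    "http://www.news-example.com/widget-review",
]

print(domain_diversity(serp))  # 70.0 - four of the ten results share www.bigbrand.com
```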

Spam Spam Spam

Whilst the herd of “P” themed animal updates has done an admirable job of thinning the spam that plagues our SERPs, spam still remains an issue. Even following the latest update there are a significant number of complaints citing dishonest affiliate and thoroughly black-hat websites capturing the top ranking spots for certain search terms. This seems to be a particularly pressing issue in the niches that have typically been worst affected by the influx of spammers drawn by their lucrative nature (pharmaceuticals, real estate, short term finance etc.). Whilst these sectors are typically avoided and regarded as less than savoury by most online marketers, there are good, honest people in each of them trying to achieve good rankings in an honest manner, who are still being prevented from doing so by manipulative spammers.

Matt Cutts himself all but acknowledged the continued existence of high-ranking spammy sites by creating a page specifically for reporting them, and the 14,000+ clicks it has received since show that the battle against spammers is still ongoing.

Poor Matt Cutts

Social Profiles

Another problem being reported post-Penguin 2.0 is the emergence of social profiles and micro-blogging sites at the peak of search results. Whilst these sites benefit from the influence of a strong, reputable domain behind them, they’re often not what searchers want to see recommended as the most helpful result, particularly if they’re in fact looking for some kind of resource.

More troubling perhaps is the fact that these offending social profiles seem to travel in packs and have been reported to appear as the top 3 or 4 results for certain search queries.

More Relaxed Penalties for Technical “On-Site” Transgressions

Google has always stated that websites should do their best to be as technically correct and efficient as possible, eliminating issues such as duplicate content through canonical tags and other such measures. But only recently has Google started actively punishing websites for falling short of this standard of technical proficiency.
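
As a small, concrete example of the kind of housekeeping being talked about here, the sketch below checks whether a page declares a rel="canonical" link – one common way of telling search engines which of several duplicate URLs is the preferred one. It uses only the Python standard library, and the sample HTML is hypothetical.

```python
# A minimal sketch of a duplicate-content sanity check: does this page
# declare a canonical URL? The sample HTML is hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag, if any."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

page = """<html><head>
<link rel="canonical" href="http://www.example.com/widgets/" />
</head><body>A print-friendly duplicate of the widgets page.</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # http://www.example.com/widgets/ (None would flag a potential issue)
```
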
Google has always championed the experience of the end user as their paramount concern when ranking websites, aiming to reward websites that prioritise this value and make a concerted effort to produce a site with great content that users will enjoy visiting. Typically, technical issues with a website fly directly in the face of this high quality experience and can often interrupt or confuse a site’s users.

However, consider a site that contains some technical faults that, whilst significant enough to earn a penalisation from Google, do not affect the experience of the site’s visitors. Is Google justified in punishing a site that is still providing a great experience to its users, just because there’s a problem or two in the code that 99.9% of users would never even notice?

Some webmasters feel that, after building their site specifically with the experience of the end user in mind, they’ve been unfairly punished for a technical issue which has no bearing on the quality of the site itself. This raises the risk that webmasters with little to no technical SEO knowledge will build great, user-focused sites that might not even register in Google’s index.

These are just a few of the problems that we’ve seen discussed by webmasters across the industry in the wake of Penguin 2.0, and it’s worth noting that the problems outlined aren’t necessarily opinions which I or the rest of the RocketMill team agree with (with the exception of domain crowding, which is a particularly sore spot for me!) – merely issues we’ve seen raised in the wider community.

Have you experienced any of these issues since Penguin 2.0? What other issues would you like Google to address in future updates?