With some big algorithm updates around the corner, Google has been busy sending unnatural inbound link notifications to a large number of sites over the last few weeks. Some of these notifications look scary on the surface, but if you read the message carefully, you will realize that Google has taken granular action against specific links and pages rather than applying a blanket penalty across an entire site.

A representative of the BBC took to the Google Product Forums to say that the site had received an unnatural inbound link notification:

“I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’.

Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’.”

A few hours later, Google’s John Mueller replied to the BBC representative:

“Looking into the details here, what happened was that we found unnatural links to an individual article, and took a granular action based on that. This is not negatively affecting the rest of your website on a whole.”

This makes for an interesting case given the nature of the BBC’s site. The BBC is a huge site that probably attracts thousands of links on a daily basis. How can Google take manual action against such a site? And how can it target specific pages? But the most difficult question to answer is this: how can the BBC figure out which page has been hit, given how vague and generic Google’s notification messages are?

I will be looking into this topic a bit further and will update this post…