
How Google’s Algorithms Identify (and Punish) Websites that Violate Guidelines

4 Nov

Ah, Google. The global technology giant that has grown from a search engine to an actual verb we use in our day-to-day vocabulary (‘Who’s that actor?’ – ‘Hang on. I’ll Google it’).

They provide us with not only the biggest and smartest search engine in the world, but also additional online services like advertising, cloud computing, and software.

Over the years, they have never stopped improving their technology in order to fulfil their core mission, which is to provide users with the best information in the best way possible.

And how do they do that? With a little help from their algorithms, that’s for sure.

Google doesn’t typically announce algorithm changes in advance; rather, they will confirm an update after it’s occurred, but sometimes, they won’t even officially confirm it.

It’s this ambiguous communication that sends SEO agencies digging deep: we research how each update affects websites, and what we can do to improve our rankings – while avoiding punishment at the same time.

How many algorithm changes have there been?

Hundreds that we know of… plus hundreds more that we probably haven’t even heard about!

You’ve likely heard of the big ones, including Panda, Hummingbird, Penguin, Pigeon and, more recently, Mobilegeddon. These are the ones you’ll read about the most, and whose effect we see on ranking.

They affected the rankings of websites using common black hat techniques, like content farms, low quality content, keyword stuffing, and doorway pages.

There are hundreds of further algorithm updates so slight that they affected only small amounts of traffic, didn’t warrant announcements, or went undetected by webmasters.

As Moz tells us, Google updates their algorithm hundreds of times a year, but most are continuous tweaks and therefore, individually, have rather minor impact.

Moz’s ever-growing resource lists the changes that have had the biggest impact on search and websites, going back as far as 2000.

It’s a list that’s constantly evolving and, to date, includes around 130 changes and updates.

How are websites identified, and subsequently punished, by Google’s algorithms?

There are three common ways in which websites are identified and consequently punished by Google.

1. Using black hat techniques

They’re tempting, yes, but bound to get you punished.

These techniques focus on search engines rather than readers, compromising usability and experience – something directly at odds with Google’s aim.

There are a handful of popular black hat techniques that have, over time, been hit hard by Google.

You can find out more in our blog post, which looks at common black hat techniques applied by cheap SEO companies.

If an SEO company guarantees you’ll get to the first position for a certain keyword, run. No one can guarantee that, because no one outside Google actually knows the 200 or so unique signals that make up its algorithm.

2. Through manual penalty

If your website violates Google’s Webmaster Guidelines, you might be found by Google’s own web spam team and punished accordingly. You can find out any manual action taken on your site through Search Console (formerly Webmaster Tools).

Just click ‘Search Traffic’ and select ‘Manual Actions’.

Some common practices that are identified by Google and that inflict manual action include:

  • Spam
  • Hidden text
  • A hacked site
  • Keyword stuffing
  • Thin, low-quality content
  • Unnatural links to your site

3. Impacted by an algorithm change

Those animal-themed algorithm updates we glossed over above provide great context for instances where an update has negatively influenced a website’s search ranking.

  • Panda, for example, aimed to return high-quality sites to the top of the search results, thus pushing ‘thin’ sites further down.
  • Penguin sought out websites that were using black-hat techniques to advance their position in the search results. Penguin affected about 3% of search queries.
  • Hummingbird took an advanced look at the ways users entered keywords and the context behind the actual words used in search queries. Websites with natural content that read well were the winners here!

Google has even had to punish itself!

In a bizarre twist, Google itself employed thin content and paid links – two of many factors they fight against – to promote their web browser Chrome back in 2012.

Chrome’s marketing team purchased sponsored blog posts and embedded a video promoting their browser.

Not only did they pay for links, but the blog posts promoting their product were also low quality, thin, and hardly relevant to Chrome.

Google’s webspam team issued a manual punishment. Chrome’s ranking (for terms like ‘browser’ and ‘web browser’) dropped to the fifth, sixth, and seventh pages of Google.

Chrome was punished for 60 days, and in that time, suffered an estimated 1% drop in market share.

This isn’t the first time Google has practiced what it preaches and punished itself. It has done so in 2011, 2010, and 2005. You can read more about those incidents here.

Google’s never-ending algorithm updates keep us on our toes

Consistent updates from Google are not a means to annoy us, punish us, or shame a website.

Rather, they push us to keep improving our websites, improving our rankings and, above all, improving the user experience for our valued website visitors.

Remember, Google’s mission is to provide the most useful information in the best way possible to its users – the people, not the machines.

If the information on your website that you provide to readers doesn’t help fulfil that purpose, Google will prioritise another website whose information does!

On the flipside, if you can provide users with high-quality, consistent, valuable, well-written content, then Google wants to reward you!

The reward? Higher positioning in a search engine that processes over 40,000 queries per second.

That’s about 3.5 billion searches per day, and… wait for it… 1.2 trillion searches every year. Wow.
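Those headline numbers all follow from one rate. A quick back-of-the-envelope check, using the commonly cited figure of roughly 40,000 queries per second (the exact rate varies and Google doesn’t publish it):

```python
# Rough check of the search-volume figures quoted above.
# 40,000 queries/second is the widely cited estimate, not an official number.
QUERIES_PER_SECOND = 40_000

per_day = QUERIES_PER_SECOND * 60 * 60 * 24  # 86,400 seconds in a day
per_year = per_day * 365                     # ignoring leap years

print(f"{per_day:,} searches per day")    # ~3.5 billion
print(f"{per_year:,} searches per year")  # ~1.26 trillion
```

Multiplying out gives about 3.5 billion searches per day and roughly 1.26 trillion per year – consistent with the figures quoted above.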