What were the main Google updates in 2015 and how did they affect your blog?


If you practice Search Engine Optimization (SEO) on your website, you need to pay attention to changes in Google’s algorithm.

And how could you not? After all, a lot of effort goes into earning good positions in the search engine: long hours of planning, executing, and measuring results.

So when Google updates arrive, professionals need to adjust quickly to the new ranking standards.

The community moves. New techniques are created. Sites are penalized. And, of course, many adaptations are made.

These updates are a way to improve the user experience and to deny black hat practitioners any room to maneuver. And, by keeping up with new SEO trends, new opportunities arise to reach the highly coveted first positions in the search engine.

Therefore, understanding what has changed can help you measure the effects on your strategy. After all, these updates remain active.

If you care about Google updates and want to know more about them, don’t worry. In this post, you will get to know each of them, along with their effects, causes, and objectives.

Shall we?

What are the main Google updates?

Panda

Launched in February 2011, Panda’s main objective was to prioritize content quality. Of course, keyword choice continued to matter in searches, but this update aimed to highlight the texts that best answered users’ questions.

This was one of the biggest revolutions in the SEO world. In the United States, 12% of searches were affected by version 1.0.

Thanks to it, even exact-match domains can rank below pages that are less closely tied to the keyword but have richer content.

The update also penalized sites with low-quality, spammy, or very shallow content, and practices aimed solely at ranking well, such as the massive repetition of terms (known as keyword stuffing).

In addition, duplicate content and plagiarism became easy to identify. From then on, there would be no more Ctrl + C, Ctrl + V: sites that kept copying would see significant drops in the main search engine, or even be removed from it.
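To make these signals concrete, here is a minimal Python sketch, purely illustrative and in no way Google’s actual algorithm, of two things Panda is believed to have penalized: excessive keyword density and near-duplicate text (approximated below with word shingles and Jaccard similarity). All data and thresholds are made up.

```python
from typing import Set

def keyword_density(text: str, keyword: str) -> float:
    """Share of the text's words that are the keyword (toy stuffing signal)."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

def shingles(text: str, k: int = 3) -> Set[str]:
    """Set of k-word shingles, a common building block for duplicate detection."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

stuffed = "buy shoes online buy shoes cheap buy shoes now buy shoes"
print(f"density: {keyword_density(stuffed, 'shoes'):.0%}")   # suspiciously high

original = "our guide explains how to choose running shoes for beginners step by step"
copied = "our guide explains how to pick running shoes for beginners step by step"
print(f"similarity: {similarity(original, copied):.0%}")     # heavy overlap
```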

In its first version, however, the main goal was to put an end to content farms.

It is simple to understand the concept:

Sites selected relevant keywords and hired multiple copywriters, often with no knowledge of the topic, just to reach the top of the SERPs. The result was usually shallow texts or irrelevant information.

But don’t worry:

Sites with multiple copywriters can still be well positioned (as is the case with marketingdeconteudo.com), provided they have valuable content.

In short, what happened was that content quality became the differentiator for good positioning. The biggest beneficiary, of course, was the internet user, who started to see fewer low-quality pages.

Initially, Panda was just one of Google’s roughly 200 ranking factors. Since then, however, the filter has been updated regularly and, in 2016, it was definitively incorporated into Google’s core algorithm.

In its later versions, Panda:

  • encouraged keeping content up to date and citing good research sources;
  • improved filters for spelling and reading fluency; and
  • included audience engagement among the ranking factors.

Of course, all of this is also influenced by elements of the user experience.

Penguin

In April 2012, Google launched Penguin.

Penguin’s purpose was different from Panda’s: to evaluate the quality of the links pointing to a page when positioning it in search results.

Before that, manipulation was constant. Strategies that are now considered black hat, and are obviously punished by the search engine, were common.

A good example is the practice known as the link farm. Basically, it consists of creating several pages that all link, with the same anchor text, to a target page, artificially passing authority to it.

This brings little benefit to the user, since references should occur naturally and send the reader to content that is genuinely relevant.

Another recurring tactic was the creation of private blog networks (PBNs): groups of affiliated blogs that link to one another.

And, believe it or not, there was (and probably still is) the practice of buying links, that is, paying for relevance. Often, these links appeared in posts that were not even related to the topic of the linked texts.

These practices were curbed by Penguin, which started to consider the authority of both the page and the domain.

Think about the following situation:

A dance teacher asks a class of beginner students a question. The answers vary, and each student has their own opinion on the subject.

However, hidden among the students is none other than Michael Jackson. Obviously, his answer carries more weight, since he is one of the greatest dancers humanity has ever had.

Keep this in mind:

A link on a relevant website means better placement. A reference on the portal administrators.com.br, for example, will pass more authority than a link on a personal blog.
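To make the idea of authority flowing through links concrete, here is a toy, PageRank-style iteration in Python. The site names and link graph are invented, and the real algorithm is far more sophisticated, but it shows why a link from a well-linked page is worth more than one from an isolated page.

```python
# Toy PageRank-style iteration over a made-up three-page link graph.
links = {
    "portal": ["blog", "store"],  # the portal links out to two pages
    "blog": ["store"],
    "store": ["portal"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal authority
damping = 0.85

for _ in range(30):  # iterate until the scores stabilize
    new_rank = {}
    for p in pages:
        # Authority received: each linking page splits its rank among its outlinks.
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {r:.3f}")
```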

Pirate

A few months later, in August 2012, the search engine launched Pirate.

The update’s name says it all: the goal was to stop ranking sites that promoted piracy, that is, pages with many reports of copyright-infringing content.

That’s right: sites offering free downloads or distribution of music, movies, series, books, and other copyright-protected products.

It is because of this update that torrent sites, like The Pirate Bay or KickassTorrents, change hosting regularly. Without that, they would be caught by Pirate and deindexed.

Hummingbird

The following year, in August 2013, Google launched Hummingbird. It represented a major turning point, and many professionals divide SEO into two eras: before and after Hummingbird.

This update had one purpose: semantic search based on user intent rather than exact keywords. In other words, searches started to consider synonyms and similar or related expressions.

Thus, even if an internet user searches with imprecise terms, they will find what they are looking for. This most directly affected longer searches with specific questions.

The update was a big problem for those who practiced keyword stuffing, since pages that did not repeat the keyword many times, but answered the user’s questions, could now rank better.

This has obviously strengthened competition for long tail keywords and, to this day, you can still find new relevant search terms. That is a boon for business, as it represents an increasingly segmented audience.

A new kind of SEO was also introduced here, one that takes conversational search into account. Basically, it means adapting to the way users actually search, that is, in a more natural, human way.

After all, hardly anyone searches for “time in the city of São Paulo”. Instead, it is much more natural to ask “what time is it?”.
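As a toy illustration of intent over exact keywords, the Python sketch below expands a query with a hand-written synonym table before matching it against a page. The table and the texts are made up; Google’s semantic understanding is obviously far deeper than a synonym lookup.

```python
# Hypothetical synonym groups; nothing here reflects Google's real data.
SYNONYMS = {
    "cheap": {"cheap", "inexpensive", "affordable"},
    "hotel": {"hotel", "inn", "lodging"},
}

def expand(query: str) -> set:
    """Expand each query word into its synonym group (or itself)."""
    terms = set()
    for word in query.lower().split():
        terms |= SYNONYMS.get(word, {word})
    return terms

def matches(query: str, page_text: str) -> int:
    """Count expanded query terms that appear on the page."""
    page_words = set(page_text.lower().split())
    return len(expand(query) & page_words)

page = "find affordable lodging near the city center"
print(matches("cheap hotel", page))  # 2: matched via synonyms, no exact keyword
```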

A great way to find out how your audience searches is to use the autocomplete suggestions and related keywords offered by the search engine itself.
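If you want to collect such suggestions programmatically, Google exposes an unofficial suggest endpoint that many SEO tools rely on. It is undocumented, so treat the URL and response format below as assumptions that may change or be rate-limited at any time.

```python
import json
import urllib.parse
import urllib.request

def google_suggestions(term: str) -> list:
    """Fetch autocomplete suggestions from Google's unofficial suggest endpoint.

    Undocumented endpoint: format and availability are not guaranteed.
    """
    url = (
        "https://suggestqueries.google.com/complete/search?client=firefox&q="
        + urllib.parse.quote(term)
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.loads(resp.read().decode("utf-8", errors="replace"))
    return data[1]  # observed shape: [query, [suggestion, ...]]

print(google_suggestions("content marketing"))
```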

Here is a very instructive post on the topic, in case you want to dig deeper.

Pigeon

In July 2014, Pigeon, the 5th Google update on this list, was launched. It is an update designed to benefit local search.

This represented a major transformation, as users can receive different results depending on the geographic location from which they search.

Think of the following search term: “cruise”.

If this search is carried out in a place where coastal tourism is strong, the user is likely to want information about a sea cruise.

Now, if the user is a supporter of Cruzeiro, the football club whose fans are mostly concentrated in Minas Gerais (a state with no coastline), he will probably be looking for information about the team.

It was a clever way to cross-reference Google Maps data with user search intentions.

It also helps companies a lot, since local SEO is now taken into account. For this reason, it is very important that businesses register with Google My Business.

Mobile Friendly Update

In April 2015, Google launched the Mobile Friendly Update, which aimed to prioritize sites that feature responsive design.

Google noted that a large part of the searches conducted were from mobile devices.

As the priority has always been the user, nothing could be fairer than penalizing pages that do not offer a mobile-friendly design.

After all, nobody wants to visit a website and be unable to read its content, right?

Desktop searches were not affected. However, entrepreneurs need to keep in mind that content must be accessible from any device.

Desktop-only plugins, images that take too long to load or that overlap the content, and other on-page features can severely hurt a site’s ranking.
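A quick first-pass check for one of these signals, the responsive viewport meta tag, can be scripted in a few lines of Python. This is only a rough heuristic; Google’s own Mobile-Friendly Test inspects much more (tap targets, font sizes, content width, and so on).

```python
import re
import urllib.request

def has_viewport_meta(url: str) -> bool:
    """Rough heuristic: does the page declare a viewport meta tag?"""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Look for <meta name="viewport" ...> anywhere in the document.
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I))

print(has_viewport_meta("https://example.com"))
```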

In this post, you can find more information about this update.

The concept of mobile first seems to be increasingly present in companies’ daily lives. After all, the number of searches performed on mobile devices already exceeds those performed on desktop, and the trend is for it to keep growing.

Rank Brain

Launched in October 2015, Rank Brain was an update that uses machine learning.

If you don’t know what that is, don’t worry.

Let’s explain in a very didactic way:

Basically, it is an algorithm that, by collecting data, is able to adapt automatically. In very simple terms, it is a robot programmed to learn: when new data arrives, it adjusts.

When we talk about Rank Brain, we are talking about meanings. That is, when a user searches, what exactly does he mean by that?

Imagine that a user searches for the term “find documents”.

He may be looking for “lost and found” documents, searching for documents from his ancestors or simply wanting to find a specific file among the many folders he has on Google Drive.

Google identifies the most clicked results and, from there, knows what the user wants when doing this search.
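The intuition can be sketched in a few lines of Python: aggregate a (hypothetical) click log and let the most clicked result stand in for the query’s likely intent. RankBrain’s internals are, of course, nothing this simple.

```python
from collections import Counter, defaultdict

# Hypothetical click log: (query, clicked result). Invented for illustration.
click_log = [
    ("find documents", "google-drive-search-tips"),
    ("find documents", "google-drive-search-tips"),
    ("find documents", "lost-and-found-id-papers"),
    ("find documents", "google-drive-search-tips"),
]

# Count clicks per result, per query.
clicks = defaultdict(Counter)
for query, result in click_log:
    clicks[query][result] += 1

# The most clicked result becomes the best guess for the query's intent.
best, count = clicks["find documents"].most_common(1)[0]
print(best, count)  # google-drive-search-tips 3
```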

Did you get it?

In short, the update’s objective is to decipher the user’s intent.

When launched, Google itself ranked it as one of the 3 most important ranking factors.

Possum

In September 2016, Google launched the Possum update. It was another update focused on location, but this time business-oriented.

With this update, the closer the user is to a business, the greater the chance that the business will appear in the search results.
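A toy way to picture this proximity signal: score each business by its rating discounted by its distance from the user. The coordinates, names, and the scoring formula below are all invented for illustration.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

user = (-19.92, -43.94)  # hypothetical user location (Belo Horizonte)
businesses = [
    ("Bakery A", -19.93, -43.95, 4.5),  # name, latitude, longitude, rating
    ("Bakery B", -19.80, -43.90, 4.8),
]

def score(biz):
    """Made-up formula: rating discounted by distance from the user."""
    name, lat, lon, rating = biz
    return rating / (1 + haversine_km(user[0], user[1], lat, lon))

# The nearby bakery outranks the better-rated but distant one.
for biz in sorted(businesses, key=score, reverse=True):
    print(biz[0], round(score(biz), 2))
```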

Fred

The Fred update took place in March 2017. Little is known about this update, but it has been confirmed by Google itself.

The company said only that the update would affect those who violate its webmaster guidelines.

However, some researchers claim that penalties were applied to sites that prioritized monetization over the delivery of quality content, such as articles aimed solely at generating ad revenue.

How to avoid being penalized by Google updates

If you’ve made it this far, you probably want a little help to avoid being penalized by Google updates, don’t you?

So keep in mind what each update rewards: quality, original content (Panda); natural, relevant links (Penguin); respect for copyright (Pirate); content written for user intent and conversational search (Hummingbird and Rank Brain); a solid local presence (Pigeon and Possum); and a responsive design (Mobile Friendly Update).

These are all the Google updates identified so far.

Knowing them puts you a step ahead of the competition. Even more so if your business adapts to each one.

Despite this, the algorithm will certainly continue to receive new updates. So it is very important to keep an eye on each one.

So, did you like this post? Then how about finding out what penalties Google applies when a website violates the webmaster guidelines? Access the text and check it out!