Google algorithms | Weboptim

Google uses algorithms to try to help users find the content they are looking for. In the past, this was done by setting meta tags, for example, which told Google what the page was about.
 
However, as Google has evolved, the focus has increasingly shifted to providing relevant content to visitors during searches. Algorithms use hundreds of factors to determine which web pages are the most relevant for a given search term.
 

Over the last few years, there have been three major changes to Google's algorithms.

1. The Panda algorithm


The first version was published in February 2011. Its aim was to ensure that high-quality websites would rank better in searches and low-quality ones would rank lower.
Initially it was called the "farmer update" because it targeted content farms (sites that mass-produce low-quality content built around popular Google keyword searches).
 
The Panda algorithm, then, was built to assess the quality and reliability of websites.
 
Experts have produced a checklist of questions to answer to determine whether a website is of good quality: googlewebmastercentral.blogspot.ca/2011/05/more-guidance-on-building-high-quality.html
 
But no one knows exactly all the factors that Google uses to determine quality. The focus is on creating content that is valuable to users. But what are the most important factors?

 

  • avoid thin content: short pages are generally not useful. Even if a website has many indexed pages, if each contains only a few lines of text, Google will rank the site as low quality.
  • no duplicate content: content copied from other sites and on-page duplicates are the two main ways the Panda algorithm can classify your website as poor quality.
  • filter out low-quality content: focus on creating unique and valuable content

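As a rough illustration of the bullet points above (not Google's actual method), a content audit might flag thin and duplicated pages like this. The 150-word threshold and the exact-hash duplicate check are assumptions chosen for the sketch; real audits use fuzzier similarity measures.

```python
import hashlib

# Hypothetical cutoff for "thin" content -- an assumption, not a Google figure.
THIN_WORD_LIMIT = 150

def audit_pages(pages):
    """Flag thin pages (very few words) and exact duplicates.

    `pages` maps URL -> page text. Returns (thin_urls, duplicate_urls).
    """
    thin, duplicates, seen = [], [], {}
    for url, text in pages.items():
        if len(text.split()) < THIN_WORD_LIMIT:
            thin.append(url)
        # Hash the normalized body so identical pages collide.
        digest = hashlib.sha256(text.strip().lower().encode()).hexdigest()
        if digest in seen:
            duplicates.append(url)  # same body as an earlier page
        else:
            seen[digest] = url
    return thin, duplicates
```

Running it over a site crawl gives two lists to work through: pages to expand or remove, and pages to consolidate or canonicalize.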
What can we do about it?

The Panda algorithm is updated by Google every month or so, and after each update it revisits all pages, so if you remove low-quality and duplicate content in the meantime, the page's rating can be restored (although this can sometimes take longer).
 

2. The Penguin algorithm


It was introduced in April 2012. Its aim is to filter out unnatural link profiles. Since the update, new but unnatural links have also been a threat to a website's ranking.
A link is a vote for the website: the more trusted the linking site, the more valuable the vote. If you have many links from poor-quality websites, it hurts your site's image in Google's eyes.
 
When checking links, the so-called anchor text (the clickable text that takes the user to the website) also matters. If you use the same link text for many links, Google concludes that your page is relevant to that text. However, this was very easy to manipulate by placing keyword-rich links, and Penguin is designed to prevent this.
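One quick way to spot the over-optimization described above is to tally the anchor texts in your backlink profile and see whether one exact-match keyword dominates. This sketch assumes a simple list of (source, anchor) pairs; it is only an illustration, not how Penguin itself works.

```python
from collections import Counter

def anchor_distribution(links):
    """Share of each anchor text in a backlink profile.

    `links` is a list of (source_url, anchor_text) pairs.
    A single exact-match keyword anchor dominating the profile
    is the kind of pattern Penguin targets.
    """
    counts = Counter(anchor.lower() for _, anchor in links)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}
```

Natural profiles are usually dominated by branded and URL anchors; if one commercial keyword accounts for a large share, that is the pattern worth cleaning up.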
 
The algorithm is also a trust factor: if you get many links from untrustworthy sites, it lowers the trustworthiness of your website and therefore its ranking.
 

What can we do about it?

Like Panda, it occasionally revisits and re-evaluates websites. If the algorithm has hurt the site's rankings badly, you should look for unnatural links and remove them, or ask Google to ignore them when evaluating the site: google.com/webmasters/tools/disavow-links-main
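The file uploaded to the disavow tool is plain text with one entry per line: lines starting with `#` are comments, a `domain:` prefix disavows a whole domain, and a bare URL disavows a single page. The domains and URLs below are placeholders, not real examples:

```
# Links from this spammy directory could not be removed despite requests
domain:spamdirectory.example

# Individual pages linking with manipulative keyword anchors
http://blog.example/paid-links-page.html
```

Disavowing should be a last resort: try to have the links removed first, and only disavow what you cannot get taken down.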
 

3. The Hummingbird algorithm

Introduced by Google in August 2013. Unlike the other updates, this one affects the operation of the entire search algorithm.
 
Its aim is to help Google better understand how users search: it tries to interpret search terms in context and provide the best possible results.
 
Its special feature is that it gives more immediate and precise answers to question-type queries. So if someone searches not for "best restaurant in Budapest" but for "where is the best restaurant in Budapest?", they will still get a relevant answer.
 

What can we do about it?

Hummingbird is completely different from Panda and Penguin. The latter two score the trustworthiness of websites (based on links or content), so by fixing the problems you can regain Google's trust. However, if a site's rankings drop because of Hummingbird, there is little chance of restoring its keyword rankings.
 

The algorithm changes are designed to encourage site owners and webmasters to create and publish the best possible content. It is important that content is created for users, as Google's goal is to help people find what they are looking for. If we can answer people's questions/searches, we are on the right track.

 
Source: moz.com
 
 


