While Google can be rather secretive about its ranking methods, they do share tidbits with us from time to time.  These little pieces of information are then dissected by the Search Engine Optimization (SEO) community and compared against other information and performance data, all in an effort to determine which metrics Google uses for ranking today and which it might use in the future.  It was recently shared that RankBrain is the third most important ranking signal.  But what is RankBrain?

Algorithmic Ranking

An algorithm is a calculation, or series of calculations, designed to produce a result.  Google uses algorithms to score various factors of a website in an effort to determine where it should rank on the Search Engine Results Page (SERP).  Before the advent of RankBrain, the process of developing, testing, and deploying these algorithms was done by hand.  Small changes took thousands of hours of development, and even more time in testing, before they were finally deployed.  This time-consuming process meant Google could only evolve at a slow pace.

Google RankBrain

RankBrain is an Artificial Intelligence (AI) system that employs Machine Learning (ML) to sort search results and understand search queries.  That's a lot of information for such a short sentence, so let's break it down a little.

We will start by briefly defining AI and Machine Learning.  AI is a system that can perform tasks a human would consider "smart".  Machine Learning is the process by which an AI is trained to make predictions or decisions by examining large amounts of data.  In the case of RankBrain, the AI continuously evolves the ranking algorithm, compressing the development, testing, and deployment cycle from months or years down to microseconds.  What's more, the process never stops, allowing for continuous improvement.

RankBrain is not new.  Back in 2015 Google revealed that a very large fraction of the millions of search queries it handles every second had been "interpreted by an AI system, nicknamed RankBrain".  What's more, Google pitted RankBrain against its engineers to see how each fared at guessing which sites would be placed above others.  The engineers achieved a success rate of 70%, while RankBrain achieved 80%.  Google admits this was a very informal test, but it does not change the fact that RankBrain is the future of the Google Search Algorithm.

What it Does

RankBrain has two main tasks:

  1. Understand the Search Query, be it words or phrases.
  2. Evaluate how users interact with the results.

Before this system, Google had a problem: 15% of search queries had never been seen before.  For a company that processes millions of queries a second, that translates into billions of words and phrases each day that stumped the existing Google Search Algorithms.  At the time, Google scanned websites for text matching the query, which is why these never-before-seen queries were so troubling: the text did not yet exist anywhere.  To solve this, Google had to move from matching words and phrases to understanding concepts and context, which is where Latent Semantic Indexing started to become important.  By building a system that learns the meaning of the query, rather than just matching its words, Google is able to return valuable results even for queries it has never seen and that don't appear anywhere on the internet.
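Google has never published how RankBrain represents meaning, but the general idea of matching on concepts instead of exact keywords can be illustrated with a toy sketch.  Everything below is hypothetical: the three made-up "dimensions" and the tiny hand-built word vectors stand in for real embeddings learned from enormous amounts of text.

```python
# Toy illustration only: RankBrain's internals are not public. This sketch
# shows the general idea of matching a query to a page by the *meaning* of
# words (vectors) rather than by exact keyword overlap.
from math import sqrt

WORD_VECTORS = {
    # [food-ness, place-ness, tech-ness] -- made-up dimensions
    "pizza":      [0.9, 0.1, 0.0],
    "restaurant": [0.7, 0.6, 0.0],
    "eatery":     [0.7, 0.6, 0.0],
    "laptop":     [0.0, 0.0, 0.9],
}

def text_vector(text):
    """Average the vectors of the known words in a piece of text."""
    vectors = [WORD_VECTORS[w] for w in text.lower().split() if w in WORD_VECTORS]
    if not vectors:
        return [0.0, 0.0, 0.0]
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def cosine(a, b):
    """Cosine similarity: 1.0 means the same direction (same concept)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

# The query uses a word ("eatery") that never appears on the page, yet the
# score is high because the *concepts* overlap.
query = text_vector("best eatery")
page = text_vector("pizza restaurant")
print(round(cosine(query, page), 2))  # ~0.96 despite zero shared words
```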

Determining Value

Google is the dominant search engine, at least for the moment.  They understand that they hold that position because they provide a service users find valuable: when a user searches, they find a result that matches their expectation.  To remain the dominant search engine they need to consistently improve the quality of their results, or risk being usurped by the competition.  Determining the value of a result is a difficult task, and to do it well Google understood that user behavior after the click is where they need to look.  This is part of the motivation behind services such as Google Analytics, Google Maps, Google Fonts, and so on.  These free services are really a way for Google to watch how a user interacts with a website after clicking a link from the search engine.  While that might sound scary, it's a practice employed by nearly every company on the web, from Facebook to AdServe.

To determine the value of a result Google looks at the following User Experience (UX) signals:

Dwell Time

Dwell Time is the amount of time a user stays on a given webpage, or on the website as a whole.  The question we need to ask is "how long do we need to keep a user to please Google?"  As with anything related to SEO, the answer keeps changing along with the algorithms involved.  As of the writing of this blog, the ideal dwell time is an average of 5:00.  That's a long time to stay on a single web page; it equates to roughly 2,000 words of content, which is a huge challenge for websites that blog frequently.  Before you step onto the metaphorical ledge, there is good news.  As the old adage goes, to outrun a bear you only need to be faster than the person next to you.  SEO is no different: to rank highly you only need to be better than your competition.  Local search, and competitors relying on poor-quality SEO firms, improve the odds for pages that don't reach 2,000 words.  Going through all the ways to work with a dwell time target of roughly 5:00 would take a whole series of blog posts, so check back here often to learn more.

Google watches how much time a user spends on a page, and on the website as a whole; as the dwell time changes, so too will the ranking of the page within the Google SERP.
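To make the metric concrete, here is a minimal sketch of how an analytics tool might estimate average dwell time from page-view timestamps.  It is not Google's method, and the session data is entirely hypothetical.

```python
# Minimal sketch (not Google's method): average dwell time from hypothetical
# page-view records of (user, page, time opened, time left).
from datetime import datetime

page_views = [
    ("u1", "/blog/rankbrain", datetime(2019, 1, 7, 9, 0, 0),  datetime(2019, 1, 7, 9, 6, 30)),
    ("u2", "/blog/rankbrain", datetime(2019, 1, 7, 9, 5, 0),  datetime(2019, 1, 7, 9, 8, 45)),
    ("u3", "/blog/rankbrain", datetime(2019, 1, 7, 9, 10, 0), datetime(2019, 1, 7, 9, 14, 15)),
]

# Dwell time per view, in seconds, then averaged across all views.
durations = [(left - opened).total_seconds() for _, _, opened, left in page_views]
average = sum(durations) / len(durations)

minutes, seconds = divmod(int(average), 60)
print(f"Average dwell time: {minutes}:{seconds:02d}")  # 4:50 for this data
```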

Bounce Rate

The Bounce Rate of a website is the percentage of users who visit a single page and then move on elsewhere.  A high bounce rate is not ideal, but it's not always bad: sometimes a user finds their answer on a single page.  Google understands this, which is why RankBrain examines multiple UX signals.  As the bounce rate changes, so too will the site's ranking within the Google SERP.
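As a quick illustration of the calculation itself, here is a minimal sketch that counts any single-page session as a bounce.  The session data is hypothetical.

```python
# Minimal sketch: bounce rate = single-page sessions / all sessions.
# Each session is the list of pages a hypothetical visitor viewed.
sessions = [
    ["/blog/rankbrain"],                           # bounce: one page only
    ["/blog/rankbrain", "/services", "/contact"],  # not a bounce
    ["/"],                                         # bounce
    ["/", "/blog/rankbrain"],                      # not a bounce
]

bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = bounces / len(sessions) * 100
print(f"Bounce rate: {bounce_rate:.0f}%")  # 50% for this data
```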

Pogo Sticking

Unlike Bounce Rate, Pogo Sticking is considered very, very bad by Google.  While similar to, and commonly confused with, Bounce Rate, Pogo Sticking is the act of leaving a website within 5 seconds to return to the SERP.  Frequent pogo sticking will dramatically drop a website's ranking.  It is commonly caused by either poor web design or bad content; thankfully, both are fixable problems.
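Google does not publish how (or whether) it measures this internally, but the idea can be sketched using the 5-second rule of thumb above.  The click log below is hypothetical.

```python
# Minimal sketch: flag SERP clicks where the user bounced back to the
# results page within the threshold. The click log is hypothetical.
POGO_THRESHOLD_SECONDS = 5

# (url clicked from the SERP, seconds until the user returned to the SERP;
#  None means the user never came back)
serp_clicks = [
    ("https://example.com/thin-page", 3),
    ("https://example.com/good-article", None),
    ("https://example.com/slow-page", 4),
    ("https://example.com/good-article", 120),
]

pogo_sticks = [
    url for url, seconds_to_return in serp_clicks
    if seconds_to_return is not None and seconds_to_return <= POGO_THRESHOLD_SECONDS
]
print(pogo_sticks)  # the pages that sent users straight back to the SERP
```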

Organic Click Through Rate (CTR)

Organic Click Through Rate (CTR) is the number of clicks a result receives compared to the number of times it appears in the SERP.  While CTR is a term commonly associated with online advertising, the word "organic" means these are unpaid clicks coming through the SERP itself.  Google wants to know how often your result is shown versus how often it is clicked.  If it's never clicked, Google will find other results that better fit the user's query, thereby dropping your site's placement within the Google Index.  Improving article titles and descriptive text can improve the CTR, and a good SEO specialist will always be testing different titles and content to find the best result for a given word or phrase.
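The calculation itself is simple: clicks divided by impressions.  The numbers in this minimal sketch are hypothetical; real figures come from a tool such as Google Search Console.

```python
# Minimal sketch: organic CTR = clicks / impressions, expressed as a percent.
impressions = 4_200   # times the result appeared in the SERP (hypothetical)
clicks = 180          # times a user actually clicked it (hypothetical)

ctr = clicks / impressions * 100
print(f"Organic CTR: {ctr:.1f}%")  # 4.3% for this data
```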

Revolutionary

SEO has changed, and for the better.  In the early days SEO was lazy, relying on keyword stuffing and purchased backlinks to improve a website's ranking.  High-performing webpages were barely legible or useful, favoring keyword stuffing over readability.  With systems like RankBrain, Google is working to reward websites that have quality content.  The use of this AI is a game changer.