Overview of AI used in Google Search

How Google uses artificial intelligence in Google Search: from RankBrain to neural matching, BERT, and MUM, here is how Google applies AI to understand language in queries and content and to rank results. The following is a summary of a longer post on the Search Engine Land blog.

RankBrain
It starts with RankBrain, Google’s first attempt at using AI in search, which dates back to 2015. Google told us RankBrain helps it understand how words are related to concepts, so it can take a broad query and better define how that query relates to real-world concepts.

When it launched in 2015, RankBrain was used in about 15% of queries; Google said that as of 2022 it is widely used across many queries, in all languages and regions. RankBrain does specifically help Google rank search results and is part of the ranking algorithm.

Year Launched: 2015
Used For Ranking: Yes
Looks at the query and content language
Works for all languages
Very commonly used for many queries

Here is an example provided by Google of how RankBrain is used: if you search for “what’s the title of the consumer at the highest level of a food chain,” Google’s systems learn from seeing those words on various pages that the concept of a food chain may have to do with animals, and not human consumers.

By understanding and matching these words to their related concepts, RankBrain helps Google understand that you’re looking for what’s commonly referred to as an “apex predator.”
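Google has not published how RankBrain works internally, but the general idea it describes, mapping words to learned concept representations and comparing them, can be sketched with toy numbers. Everything in the snippet below is invented for illustration only; a real system would learn its vectors from web-scale data.

```python
# Toy sketch of RankBrain-style word-to-concept matching.
# The vectors here are made up purely for illustration.
import numpy as np

# Hypothetical low-dimensional "concept" embeddings for query words.
word_vectors = {
    "consumer": np.array([0.9, 0.1, 0.3]),
    "highest":  np.array([0.2, 0.8, 0.1]),
    "food":     np.array([0.7, 0.3, 0.9]),
    "chain":    np.array([0.6, 0.4, 0.8]),
}

# Hypothetical embeddings for real-world concepts.
concept_vectors = {
    "apex predator":  np.array([0.7, 0.5, 0.7]),
    "retail shopper": np.array([0.9, 0.0, 0.1]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Represent the whole query as the average of its word vectors,
# then pick the closest concept.
query = np.mean(list(word_vectors.values()), axis=0)
scores = {name: cosine(query, vec) for name, vec in concept_vectors.items()}
print(max(scores, key=scores.get))  # with these toy vectors: "apex predator"
```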

Neural matching
Neural matching was the next AI system Google released for search; it launched in 2018 and was expanded to the local search results in 2019. There is also a separate article explaining the differences between RankBrain and neural matching.

Google told us neural matching helps Google understand how queries relate to pages by looking at the entire query or content on the page and understanding it within the context of that page or query. Today, neural matching is used in many, if not most, queries, for all languages, in all regions, across most verticals of search. Neural matching does specifically help Google rank search results and is part of the ranking algorithm.

Year Launched: 2018
Used For Ranking: Yes
Looks at the query and content language
Works for all languages
Very commonly used for many queries

Here is an example provided by Google of how neural matching is used: the query “insights how to manage a green.” Google said, “If a friend asked you this, you’d probably be stumped. But with neural matching, we’re able to make sense of this quizzical search. By looking at the broader representations of concepts in the query — management, leadership, personality and more — neural matching can decipher that this searcher is looking for management tips based on a popular, colour-based personality guide.”
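Google’s neural matching system is proprietary, but the underlying idea of scoring a whole query against whole pieces of content using learned representations can be sketched with the open-source sentence-transformers library as a stand-in. The model name and page snippets below are assumptions for illustration, not anything Google uses.

```python
# Loose sketch of neural matching's core idea: compare the meaning of an
# entire query with the meaning of entire page snippets.
# This open-source model is only a stand-in for Google's internal system.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "insights how to manage a green"
pages = [
    "Management tips for leading a 'green' personality type on your team",
    "How to keep your lawn green through the summer",
]

query_emb = model.encode(query, convert_to_tensor=True)
page_embs = model.encode(pages, convert_to_tensor=True)

# Higher cosine score = the page's overall meaning is closer to the query's.
print(util.cos_sim(query_emb, page_embs))
```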

BERT
BERT, Bidirectional Encoder Representations from Transformers, arrived in 2019; it is a neural network-based technique for natural language processing pre-training. Google told us BERT helps it understand how combinations of words express different meanings and intents, including looking at the sequence of words on a page, so even seemingly unimportant words in your queries are accounted for.

When BERT launched, it was used in 10% of all English queries, but it soon expanded to more languages and to almost all English queries. Today it is used in most queries and is supported in all languages. BERT does specifically help Google rank search results and is part of the ranking algorithm.

Year Launched: 2019
Used For Ranking: Yes
Looks at the query and content language
Works for all languages but Google said BERT “plays a critical role in almost every English query”
Very commonly used for many queries

Here is an example provided by Google of how BERT is used: if you search for “can you get medicine for someone pharmacy,” Google said, “BERT helps us understand that you’re trying to figure out if you can pick up medicine for someone else. Before BERT, we took that short preposition for granted, mostly surfacing results about how to fill a prescription.”
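Google’s production use of BERT cannot be inspected, but the open-source bert-base-uncased checkpoint (available through the Hugging Face transformers library) shows the bidirectional behaviour described above: the model reads the words on both sides of a position, so small words like “for” shape its predictions. The masked sentence below is a variation on Google’s example, used only for illustration.

```python
# Demonstration of BERT's bidirectional masked-word prediction using the
# open-source bert-base-uncased model; this is not Google's ranking stack.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts the masked word from the words before AND after it.
for prediction in fill("can you pick up medicine for [MASK] at the pharmacy?"):
    print(prediction["token_str"], round(prediction["score"], 3))
```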

MUM
MUM, the Multitask Unified Model, is Google’s most recent AI in search. MUM was introduced in 2021 and then expanded to more applications at the end of 2021, with a lot of promising uses planned for the future.

Google told us that MUM helps not just with understanding language but also with generating it, so it can be used to understand variations in new terms and across languages. MUM is not used for any ranking purposes in Google Search right now, but it does support all languages and regions.

Year Launched: 2021
Used For Ranking: No
Not query- or language-specific
Works for all languages but is not used for ranking purposes today
Used for a limited number of purposes

Currently, MUM is used to improve searches for COVID-19 vaccine information, and Google said it is “looking forward to offering more intuitive ways to search using a combination of both text and images in Google Lens in the coming months.”
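MUM itself is not publicly available, and the snippet below is not MUM. As a very loose illustration of scoring a combination of an image and text, the open-source CLIP model (via Hugging Face transformers) can match a photo against candidate text descriptions; the image path and candidate strings here are hypothetical.

```python
# Rough illustration of combining an image with text in one query,
# using open-source CLIP as a stand-in. The file name and candidate
# descriptions are hypothetical.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("photo_from_lens.jpg")  # hypothetical user photo
candidate_texts = [
    "a pair of hiking boots",
    "a patterned shirt",
]

inputs = processor(text=candidate_texts, images=image,
                   return_tensors="pt", padding=True)
outputs = model(**inputs)

# Probability that the image matches each candidate description.
print(outputs.logits_per_image.softmax(dim=1))
```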

As explained above, Google uses RankBrain, neural matching, and BERT in most of the queries you enter into Google Search, but Google also has core updates. The broad core updates that Google rolls out a few times per year are often noticed by site owners, publishers, and SEOs more than the releases of these larger AI-based systems.

But Google said these all can work together with core updates. Google said these three, RankBrain, neural matching, and BERT, are its larger AI systems, but it has many more AI systems within search, including some within the core updates it rolls out.
