How Google uses AI for Search

Posted by Edith MacLeod on 8 Feb, 2022
RankBrain, Neural matching, BERT and MUM. Here's what they each do and the role they play in search and rankings.


Google has published an outline of its major AI systems in the context of Search. It's a useful summary of the company's advances in AI and machine learning as applied to Google Search, showing an increasingly sophisticated understanding of the human intent behind queries.

The post gives an outline of how these AI models work and how they are currently being used or developed in relation to search and rankings.

  • RankBrain
  • Neural matching
  • BERT
  • MUM

The models work together in various combinations to deliver the most relevant results, each performing a specialized function.

So, if you’ve heard about them but aren’t quite sure what they do and how they affect the search results, read on.

RankBrain

Launched in 2015, RankBrain was Google’s first deep learning system used in Search and is still in use today. As the name suggests, it helps rank results, and Google describes it as continuing to be “one of the major AI systems powering Search today.”

RankBrain helps Google understand how words relate to real world concepts through a broader understanding of the terms in a search query. Google gives the following example:

For example, if you search for “what’s the title of the consumer at the highest level of a food chain,” our systems learn from seeing those words on various pages that the concept of a food chain may have to do with animals, and not human consumers.

By understanding and matching these words to their related concepts, RankBrain understands that you’re looking for what’s commonly referred to as an “apex predator.”
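Google hasn't published RankBrain's internals, but the general technique of relating a query's wording to real-world concepts can be sketched with off-the-shelf text embeddings. The snippet below is an illustration only: it assumes the sentence-transformers library, the all-MiniLM-L6-v2 model and a hand-picked list of candidate concepts, and is not Google's implementation.

```python
# Illustrative sketch of query-to-concept matching with text embeddings.
# NOT RankBrain; library and model choices are assumptions for the example.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "what's the title of the consumer at the highest level of a food chain"
concepts = ["apex predator", "primary producer", "herbivore", "decomposer"]

# Embed the query and the candidate concepts into the same vector space.
query_vec = model.encode(query, convert_to_tensor=True)
concept_vecs = model.encode(concepts, convert_to_tensor=True)

# Cosine similarity: the closest concept is the best conceptual match.
scores = util.cos_sim(query_vec, concept_vecs)[0]
best = int(scores.argmax())
print(concepts[best], float(scores[best]))
```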

Neural matching

Google introduced neural matching to Search in 2018. It looks at an entire query or page rather than just keywords, and helps to understand “fuzzier representations” of concepts. This understanding of the broader concepts on a page helps Google systems cast a wider net when scanning the index for content relevant to the search query.

Google describes neural matching as a “critical part” of how they retrieve relevant content from a massive and constantly changing information stream.

Here’s Google’s example of how neural matching works:  

Take the search “insights how to manage a green,” for example. If a friend asked you this, you’d probably be stumped. But with neural matching, we’re able to make sense of it.

By looking at the broader representations of concepts in the query — management, leadership, personality and more — neural matching can decipher that this searcher is looking for management tips based on a popular, color-based personality guide.
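As an illustration of what “fuzzier representations” can mean in practice, the sketch below embeds a whole query and whole page snippets into the same vector space and ranks the pages by semantic similarity rather than keyword overlap. It assumes the sentence-transformers library and made-up page snippets; it is not Google's neural matching system.

```python
# Illustrative dense-retrieval sketch: rank pages by meaning, not keywords.
# NOT Google's neural matching; library, model and snippets are assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "insights how to manage a green"
pages = [
    "Management tips for leading each personality colour on your team",
    "How to keep a putting green healthy all summer",
    "Guide to recycling and reducing household green waste",
]

query_vec = model.encode(query, convert_to_tensor=True)
page_vecs = model.encode(pages, convert_to_tensor=True)

# Rank whole pages by semantic similarity to the whole query.
scores = util.cos_sim(query_vec, page_vecs)[0]
for score, page in sorted(zip(scores.tolist(), pages), reverse=True):
    print(f"{score:.3f}  {page}")
```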

Neural matching was applied to local search in 2019.

BERT

BERT is all about meaning and context. Launched in 2019, it was described by Google as a huge step change in natural language understanding.

Rather than searching content to match individual words, BERT understands how a combination of words can express different meanings and intents. It comprehends words in a sequence and how they relate to each other to express a more complex idea.

Here’s Google’s example of how it works:

For example, if you search for “can you get medicine for someone pharmacy,” BERT understands that you’re trying to figure out if you can pick up medicine for someone else.

Before BERT, we took that short preposition for granted, mostly sharing results about how to fill a prescription. Thanks to BERT, we understand that even small words can have big meanings.
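The research version of BERT is openly available, so this context-sensitivity can be demonstrated directly. The sketch below uses the Hugging Face transformers library and the open bert-base-uncased checkpoint (an assumption about tooling; Google's production Search stack is not public) to show BERT filling in a masked word from the context on both sides of it.

```python
# Demonstration with the open BERT research checkpoint, not Google's Search model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The masked slot is the small word that carries the intent of the query.
predictions = fill_mask("can you pick up medicine [MASK] someone else at the pharmacy")

for p in predictions[:3]:
    print(f"{p['token_str']:>5}  {p['score']:.3f}")
```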

BERT is used for search and ranking, and Google says it plays a critical role in almost every English query. BERT excels at ranking and retrieval, two of the most important tasks in serving relevant results.

BERT plays a major role in Search and is part of a number of systems working together to deliver quality results.

MUM

Launched in May 2021, MUM is Google’s latest advance in Search. It is able to both understand and generate language, and currently supports 75 languages. MUM is capable of multitasking and is multimodal, meaning it can understand information across multiple modalities such as text and images.
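MUM itself is not publicly available, but “multimodal” can be illustrated with an open text-and-image model. The sketch below uses OpenAI's CLIP via the transformers library purely as an analogy (an assumption for illustration, not MUM or any Google implementation) to score how well several text descriptions match a single image.

```python
# Analogy only: CLIP scores text captions against an image. Not MUM.
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical image URL for illustration; substitute any local or remote image.
image = Image.open(requests.get("https://example.com/hiking-boots.jpg", stream=True).raw)
captions = ["hiking boots", "running shoes", "a snowy mountain trail"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# logits_per_image: similarity of the image to each caption; softmax gives probabilities.
probs = outputs.logits_per_image.softmax(dim=1)
for caption, p in zip(captions, probs[0].tolist()):
    print(f"{p:.3f}  {caption}")
```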

MUM has many promising potential applications but is still in the early stages of development. It is not currently used for ranking or improving search results in the same way as the systems described above, but it does already have a couple of specialized applications. It is being used to help improve searches for Covid vaccine information, and Google says it will be offered in the coming months as a more intuitive way to search using a combination of text and images in Google Lens.

Google describes MUM as a thousand times more powerful than BERT. See our blog post for further information about how Google says MUM will change the Search landscape.
