This month, Google announced a range of updates to search in their Search On live stream and blog. The updates covered during the event are focused on improving the AI that controls search, intending to provide the searcher with the most relevant search result. Key areas of focus included improvements to search algorithms, updates to how content will be ranked, and how video content will be displayed in search results. However, one of the biggest reveals was that Google is employing BERT in virtually every search query.
Google BERT is now used in almost 100% of English search queries. When it was first released, BERT handled only around 10% of queries; now it impacts nearly every English search.
According to Google: “Today we’re excited to share that BERT is now used in almost every query in English, helping you get higher quality results for your questions.”
So, what does this Google update mean for your digital marketing strategy? What will you have to change to optimize for BERT? And what the heck does BERT actually do?
BERT’s addition to Google’s search engine means that if you follow search optimization best practices and stick to creating authentic, naturally written content, you’re good to go. BERT now rules the search query, so in this blog we will discuss everything you need to know about it. But to understand BERT, we first need to understand natural language processing (NLP).
What is Natural Language Processing?
Image Source: Forbes
Natural language processing is a kind of artificial intelligence that helps machines understand human language. Machines speak in code with loads of zeros and ones that translate into processes or actions. NLP helps to break down human language into a form that computers can understand to process our language and take appropriate action in response.
An example of NLP in our everyday lives is when you ask Alexa to play a song or tell Siri to add an appointment to your calendar. Your device hears the words, translates them into its language (zeros and ones), derives the meaning from the phrase, and can form actionable items out of those words.
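The pipeline described above (hear the words, derive the meaning, form an action) can be sketched as a toy rule-based parser. The intent names and command patterns below are invented for illustration; real assistants like Alexa and Siri rely on far more sophisticated statistical models:

```python
import re

# Toy sketch: map an utterance to a structured intent the device can act on.
# The intents and patterns here are invented examples, not a real assistant API.
INTENT_PATTERNS = {
    "play_music": re.compile(r"play (?P<song>.+)", re.I),
    "add_event":  re.compile(r"add (?P<event>.+) to my calendar", re.I),
}

def parse_intent(utterance: str) -> dict:
    """Return the first matching intent plus its extracted slots."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.match(utterance.strip())
        if match:
            return {"intent": intent, **match.groupdict()}
    return {"intent": "unknown"}

print(parse_intent("Play Yesterday"))
print(parse_intent("Add a dentist appointment to my calendar"))
```

The hard part, of course, is everything this sketch skips: speech recognition, ambiguity, and phrasing that doesn't match a fixed pattern.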
Despite how far NLP has come, it still has its limitations. You’ve probably asked Alexa a question she didn’t understand, whether due to its length, context, or ambiguity.
The trouble with NLP is that human language is infinitely complex. Simple nuances and inflections can change the meaning of a phrase, and words with multiple meanings like “rose,” which can be a colour, a flower, or a name, can only be deciphered correctly with context clues. Computers have a hard time processing these kinds of subtleties. Yet, as NLP advances, computers should begin to develop natural language understanding, a step above NLP. This would be a significant breakthrough in artificial intelligence, allowing machines to comprehend the nuances of human language. (The latest Google AI update goes a long way toward covering this area.)
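The “rose” example can be illustrated with a minimal, Lesk-style disambiguation sketch that picks the sense whose context words overlap most with the sentence. The sense signatures below are hand-written for illustration, not drawn from any real lexicon:

```python
# Hand-written sense "signatures" for the ambiguous word "rose".
# Invented for illustration only.
SENSES = {
    "rose": {
        "flower": {"garden", "petal", "thorn", "bouquet", "bloom"},
        "colour": {"pink", "shade", "hue", "paint", "dress"},
        "name":   {"aunt", "mrs", "called", "met", "she"},
    }
}

def disambiguate(word: str, sentence: str) -> str:
    """Pick the sense whose signature overlaps most with the context words."""
    context = set(sentence.lower().split())
    scores = {
        sense: len(signature & context)
        for sense, signature in SENSES[word].items()
    }
    return max(scores, key=scores.get)

print(disambiguate("rose", "the garden rose had a sharp thorn"))
```

Real NLP systems replace these hand-written signatures with statistical representations learned from huge text corpora, but the principle is the same: context decides the sense.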
Alright, So What is Google’s BERT?
BERT, or Bidirectional Encoder Representations from Transformers, is a natural language processing model that achieved state-of-the-art results on 11 existing NLP tasks and helps Google better understand language, including search queries. Before BERT, many NLP solutions, including Google Search, processed language more like a robot, breaking down each word one by one instead of looking at the phrase as a whole.
BERT turns this one-by-one approach on its head by looking at the words that come before and after each term in a search query to understand the context better. One area where we see noticeable improvement is the use of prepositions. Common words like “to” and “from” would usually be ignored in a search query, as if they were simply an article like “the.”
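The contrast can be sketched in a few lines: a left-to-right model sees only the words before each token, while a bidirectional model like BERT sees both sides at once. This is a toy illustration of the idea, not BERT’s actual attention mechanism:

```python
# Toy illustration: what context each word "sees" under the two approaches.
def unidirectional_context(tokens: list[str]) -> dict[str, list[str]]:
    """Each token only 'sees' the words to its left."""
    return {tok: tokens[:i] for i, tok in enumerate(tokens)}

def bidirectional_context(tokens: list[str]) -> dict[str, list[str]]:
    """Each token 'sees' every other word in the query."""
    return {tok: tokens[:i] + tokens[i + 1:] for i, tok in enumerate(tokens)}

query = "travel to brazil".split()
print(unidirectional_context(query)["to"])   # ['travel']
print(bidirectional_context(query)["to"])    # ['travel', 'brazil']
```

For the preposition “to”, only the bidirectional view reveals the destination that gives the word its meaning.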
Yet, these words are essential to meaning in many different contexts, such as the example Google gives in their blog: A searcher types in “2019 brazil traveller to the USA need a visa” and gets the following results:
Image Source: Google Blog
The word “to” and its relationship to the other words in the query are of paramount importance here. In the search result before BERT, “to” is ignored, and the search engine delivers an imprecise answer. With BERT, the whole phrase is taken into account, including that little word “to,” and a much more precise result is given — one that genuinely answers the searcher’s question.
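A quick sketch shows why discarding “stop words” loses meaning: two queries with opposite intents collapse to the same tokens once “to” and “from” are thrown away. The stop-word list below is a small invented sample:

```python
# A small invented stop-word list, for illustration only.
STOP_WORDS = {"the", "a", "to", "from", "for", "of"}

def strip_stop_words(query: str) -> list[str]:
    """Drop stop words the way pre-BERT keyword processing often did."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]

# A traveller going TO the USA and one coming FROM the USA need
# different visas, but the stripped queries are identical:
q1 = "2019 brazil traveller to the usa need a visa"
q2 = "2019 brazil traveller from the usa need a visa"
print(strip_stop_words(q1) == strip_stop_words(q2))  # True
```

BERT keeps those little words in play, which is exactly why the post-BERT result for this query is the right one.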
What’s New About BERT Compared to the Previous Google Algorithm
BERT is the newest Google algorithm update in a long succession of changes that the company has made to improve its systems. In the past, some of the major updates include:
- Panda – Originally known as Farmer, the Panda update was incorporated into the existing Google Algorithm in 2011.
- Hummingbird – Released in 2013, Hummingbird allowed Google to interpret entire phrases rather than focusing on individual words.
- RankBrain – The 2015 Google algorithm update enabled the system to process search queries with multiple meanings, including colloquialisms, dialects, and neologisms.
What makes BERT so Unique?
The BERT update allows Google to understand “text cohesion” and disambiguate phrases and sentences, specifically where polysemic nuances could alter the contextual meaning of words.
Several elements make BERT unique, for search and well beyond it; BERT now serves as a research foundation for natural language processing worldwide. Several of these features are spelled out in its name: Bidirectional Encoder Representations from Transformers.
B – Bidirectional
E – Encoder
R – Representations
T – Transformers
But there are other exciting developments BERT brings to the field of natural language understanding too.
- Pre-training from unlabelled text
- Bi-directional contextual models
- The use of a transformer architecture
- Masked language modelling
- Focused attention
- Textual entailment (next sentence prediction)
- Disambiguation through context
- Open-sourced
How Does BERT Give Google a Better Understanding of Search Queries?
Google uses BERT to analyze search queries. The algorithm update improves Google’s understanding of context, especially with longer queries. As long-tail keywords and questions become more common, driven by the rise in voice search, Google must understand what users want.
Here are a couple of examples from Google to demonstrate how BERT will help Google provide more relevant results in the future:
Example 1: “Parking on a hill with no curb.”
Before BERT, Google would analyze this query with more emphasis on the word curb, and it would overlook the word no. With the Google algorithm update, the search engine now considers every word, ensuring you get the results you want.
Example 2: “do aestheticians stand a lot at work.”
Before BERT, Google took a keyword-matching approach. This query may have yielded results containing the term “stand-alone,” even though that isn’t the sense of “stand” in the original query. With BERT, Google knows the word “stand” relates to the physical demands of the job in question.
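The pitfall is easy to reproduce with a naive substring match, which happily treats “stand-alone” as a hit for “stand”. The sample documents below are invented:

```python
# Naive keyword matching: a document "matches" if every query term
# appears anywhere in its text as a substring.
def naive_keyword_match(query_terms, document):
    text = document.lower()
    return all(term in text for term in query_terms)

docs = [
    "Aestheticians often stand for long hours during treatments.",
    "This stand-alone device needs no work to set up, a lot of users say.",
]
terms = ["stand", "a lot", "work"]
hits = [d for d in docs if naive_keyword_match(terms, d)]
print(hits)  # only the irrelevant 'stand-alone' document matches
```

The document about the wrong sense of “stand” matches, while the genuinely relevant one does not; contextual models like BERT are designed to avoid exactly this failure.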
How Does BERT Affect Search Queries?
Now that BERT impacts nearly 100% of searches, it is crucial to understand this part. To get a sense of how this Google algorithm update affects user search queries, we need to break down the various elements of the algorithm.
- Pre-training from unlabelled text – BERT is the first NLP framework pre-trained on plain, unlabelled text rather than labelled corpora. This pre-training was conducted through unsupervised learning.
- Bi-directional contextual models – True contextual understanding is only possible when you can see all the words in a sentence simultaneously, giving you a sense of how every word affects the others. As a machine, Google has historically struggled with this context — until now.
- Transformer architecture – Reinforcing the point above, the transformer architecture allows Google to analyze each word in relation to the other terms in a search query. Without this, the query could have an ambiguous meaning.
- Masked language modelling – By design, BERT is trained on sentences in which some words have been randomly masked out. It then attempts to predict the “hidden” words correctly, and this skill feeds into its determination of search intent.
- Textual entailment (next sentence prediction) – One of BERT’s most exciting features is its ability to predict what you will say (or type) next.
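The masked-language-modelling setup in the list above can be sketched as follows. The real BERT masks roughly 15% of WordPiece tokens during pre-training; this toy version masks whole words and just records what the model would have to recover:

```python
import random

# Toy sketch of the masked-language-modelling training objective:
# hide some tokens and record the answers the model must predict
# from the surrounding (bidirectional) context.
def mask_tokens(tokens: list[str], mask_rate: float = 0.15, seed: int = 0):
    """Return the masked token list plus {position: original_word}."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets[i] = tok
        else:
            masked.append(tok)
    return masked, targets

sentence = "the man went to the store to buy a gallon of milk".split()
masked, targets = mask_tokens(sentence, mask_rate=0.3)
print(" ".join(masked))
print(targets)
```

Training a model to fill in these blanks forces it to use context on both sides of each gap, which is where BERT’s contextual understanding comes from.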
How Does the Google Algorithm Update BERT Affect SEO?
As BERT focuses on query evaluation at the phrase or sentence level, content creators should use a mix of short and long-tail terms in their content.
More importantly, you should create in-depth content that explores topics in greater detail, offering valuable information to your audience. Doing this provides BERT with more details to analyze so that it can determine the context of your content.
How Does Google BERT Affect Content Marketing?
Now that Google is working harder to discern user search queries’ context and intent, content marketers must ensure they deliver higher quality content relevant to their target readers.
BERT is a huge step forward in the development of search engines, so companies need to keep pace to generate a positive return on investment (ROI) from their content marketing efforts.
Content writers must write for people — not search engines. With the latest Google algorithm update, your content should be more “human,” using the terms, tone, and language that closely correlate with your audience.
As voice search and mobile use continue to grow, people demand faster, more accurate answers to their questions. Google is evolving to figure out user intent and to focus on delivering results for that rather than for the precise keywords that the user entered.
How Did BERT Impact Your Website?
Did you see your organic traffic slump when BERT launched, especially for Featured Snippets and voice search? Chances are that BERT hit you hard.
The BERT release showed that Google is serious about nurturing voice search and its underlying technology, as BERT is all about delivering better contextual search results.
This Google update is all about content focus and user intent, so some technical changes like page speed and design are unlikely to turn things around. Instead, you must look to improve your content’s quality to be more current, comprehensive, relevant, and actionable. You can use Audience Mapping to ensure your content strategy always has the target audience in mind.
However, something to point out regarding the recent update is that it doesn’t target a site’s ranking as such. SEOs cannot optimize for BERT per se.
Instead, BERT is designed to improve the relevancy of results by better understanding the content of web pages.
So, What Changes After This Update on Search?
Moving forward, BERT will impact nearly every English language search query.
This means that Google Search understands users’ searches better than ever before.
– Understanding misspelled words, thanks to the new spelling algorithm.
– Understanding the context of misspelled words.
– Helping users find the right results in under 3 milliseconds.
Image Source: Google Blog
– Indexing not only web pages but individual passages from the pages.
– This will help Google to improve 7% of search queries across all languages.
– Now, Google can rank a specific passage for a particular query (see the example below).
Image Source: Google Blog
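The passage-indexing idea can be sketched as scoring each passage of a page against the query and surfacing the best one. The overlap score below is a crude stand-in for Google’s actual (far more sophisticated) neural ranking:

```python
# Toy sketch of passage-level ranking: split a page into passages,
# score each against the query, and return the best match.
def best_passage(page_text: str, query: str) -> str:
    passages = [p.strip() for p in page_text.split("\n\n") if p.strip()]
    query_terms = set(query.lower().split())

    def score(passage: str) -> int:
        # Crude relevance signal: count of shared query words.
        return len(query_terms & set(passage.lower().split()))

    return max(passages, key=score)

page = (
    "Our store opened in 1998 and has served the city ever since.\n\n"
    "To replace a bike chain, first remove the rear wheel and thread "
    "the new chain through the derailleur."
)
print(best_passage(page, "how to replace a bike chain"))
```

The point of the feature is exactly this: a page that is mostly about something else can still win for a query when one buried passage answers it directly.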
More relevant subtopics for a specific query:
– Google can understand the relationship between specific searched queries and subtopics around them, thanks to neural nets (algorithms).
– Subtopics on the Search will help provide a diversity of content when users search for something broader.
– Google will roll out this feature by the end of 2020.
How Will These Updates Impact You?
1. Website Traffic
If we compare Google’s SERP (Search Engine Results Pages) volatility between May and October 2020, we see that Google’s core update impacted many websites and their rankings from May 5th to 7th, 2020.
However, the latest update in October 2020 shows high volatility (yellow) for almost the whole month. This means that rankings have been unstable, with active movement.
Image Source: SEMrush Sensor
How can you avoid this?
Since BERT impacts informational keywords, for example ‘how to tie a tie,’ you should focus on making your content as specific as possible to the searched topic.
Neil Patel says: ”Typically when you create content, which is the easiest way to rank for informational related keywords, SEOs tell you to create super long content. Yes, you may see that a lot of longer-form content ranks well on Google, but their algorithm doesn’t focus on word count, it focuses on quality.”
Improve your SEO strategy by keeping long-tail keywords in mind. When doing keyword research and creating content, answer the searcher’s questions better than your competitors do, and get straight to the point.
2. Limited Access to Manual Page Indexing
Since Google is making updates related to Search, it has temporarily limited publishers’ ability to manually request indexing of pages through Google Search Console. This is still an ongoing process, and there is no information on how long it will last.
Image Source: Data anomalies Search Console
What can you do?
According to Google, you have to be patient; Google will continue to find and index your content through its standard methods: first crawling the page, then indexing it, and finally ranking it in Search.
Due to this latest Google update, there is a possibility that your traffic may go down. It is also possible that the latest core update did not impact your website. However, your goal should always be to provide great content and a better user experience for your audience, so it’s worth fixing any existing errors on your website.
Neil Patel says, and we couldn’t agree more, “It isn’t about winning on Google. SEO is about providing a better experience than your competition. If that’s your core focus, in the long run, you’ll find that you’ll do better than your competition when it comes to algorithm updates.”
Analyze your pages that don’t rank anymore and improve the content or create new content that answers questions that people are searching for.
Do keyword research and focus on long-tail keywords, but please don’t overdo it. BERT helps Google better understand search queries, including those written in natural language and those with prepositions and “stop words” that add to the meaning of the question. Some people have misinterpreted this and assumed it’s now essential to optimize websites ONLY for long-tail queries. Remember, BERT was designed to understand users’ intent and connect it to the information that already exists on your website. Understand exactly what users searched for, and always try to give more precise answers than your competition.
While you can’t optimize for BERT, you can work hard to make your website as engaging as possible for your desired audience. Focus on publishing high-quality content, and write for people (not search engines). As the Beatles sang, “All you have to do is act naturally.”