BERT: Google’s New Algorithm That Promises to Revolutionize SERPs
Google is already such an integral part of people's lives that many of us chat directly with it.
Users type “how do I get to the market” or “when does Spring start”, as if they were naturally talking to a person. But it is worth remembering: Google is made of algorithms.
And it’s one of those algorithms — Google BERT — that helps the search engine understand what people are asking for and brings the answers they want.
That's right: bots are not people, but technology has advanced so much that they can understand human language, including the slang, typos, synonyms, and idioms present in our everyday speech. And we barely even notice.
This new search algorithm was created by Google to better understand users’ search intentions and contents on web pages.
But how does it work? And how does it affect your SEO strategies?
Let’s understand it all now:
- What is Google BERT?
- When was BERT released?
- What is NLP?
- Did BERT replace RankBrain?
- How does Google BERT work?
- Why is Google BERT important for the search experience?
- What are the impacts of BERT on SERPs?
- Content and SEO: how to optimize for BERT?
What is Google BERT?
Google BERT is an algorithm that increases the search engine’s understanding of human language.
This is essential in the universe of searches since people express themselves spontaneously in search terms and page contents — and Google works to make the correct match between one and the other.
BERT is the acronym for Bidirectional Encoder Representations from Transformers. Confusing? Let’s explain it better!
To understand what BERT is, we’re going to need to go through some technical terms, ok?
For starters, BERT is a neural network.
Do you know what that is?
Neural networks are computer models inspired by an animal’s central nervous system, which can learn and recognize patterns. They are part of machine learning.
In BERT's case, the neural network is capable of learning the forms of expression of human language. It is based on a Natural Language Processing (NLP) model called the Transformer, which understands the relationships between the words in a sentence rather than processing them one by one in order.
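To make that idea concrete, here is a minimal sketch, in plain Python with made-up two-dimensional "embeddings" (not real model weights), of the attention mechanism at the heart of the Transformer: each word is scored against every word in the sentence at once, and the scores are normalized into weights.

```python
import math

def softmax(scores):
    """Normalize raw scores into positive weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(query, keys):
    """Score one word's vector against every word in the sentence.

    Every word attends to all the others at once, instead of reading
    them one by one in order -- the core idea behind the Transformer.
    """
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    return softmax(scores)

# Made-up 2-dimensional "embeddings" (illustrative values, not a real model)
sentence = {
    "river": [1.0, 0.1],
    "bank":  [0.9, 0.3],
    "money": [0.1, 1.0],
}
weights = attention_weights(sentence["bank"], list(sentence.values()))
# With these toy vectors, "bank" attends most to "river" and least to "money"
print([round(w, 2) for w in weights])
```

In a real Transformer, the vectors have hundreds of dimensions and are learned during training, but the principle is the same scoring-and-weighting shown here.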
BERT is a pre-trained natural language processing model. This means the model is first trained on a large text corpus (such as Wikipedia) and can then be used as the starting point for a variety of systems.
It is possible to develop algorithms focused on analyzing questions, answers, or sentiment, for example.
All this is in the field of artificial intelligence. That is, bots do everything!
Once programmed, the algorithm continuously learns about human language by processing the massive volumes of data it receives.
But beyond artificial intelligence that looks like science fiction, the essential point is this: BERT understands the full context of a word (the terms that come before and after it, and the relationships between them), which is extremely useful for understanding the content of sites and the intentions of users searching on Google.
When was BERT released?
In November 2018, Google open-sourced BERT on GitHub.
From then on, anyone has been able to use BERT's code and pre-trained models to quickly build their own systems.
Google itself used BERT in its search system. In October 2019, Google announced its biggest update in recent times: BERT’s adoption in the search algorithm.
Google had already adopted models to understand human language, but this update was announced as one of the most significant leaps in search engine history.
Initially, BERT rolled out only in the United States, in English. By December 2019, however, the model had been expanded to over 70 languages, improving search results around the world.
What is NLP?
To explain what BERT is, we mentioned that this algorithm is a model of Natural Language Processing (NLP). Allow me to explain.
NLP is an area of artificial intelligence that converges with linguistics to study the interaction between human and computer languages. The aim is to bridge the gap between the two and make them communicate.
This type of system has existed for a long time, since Alan Turing’s work in the 1950s.
But it was in the 1980s that NLP models moved beyond hand-written rules and embraced machine learning. Since then, computers have processed ever larger volumes of data, revolutionizing the relationship between humans and machines.
We may not notice it in our daily lives, but our verbal expression is extremely complex and diverse.
There are so many languages, syntactic rules, semantic relationships, slang terms, sayings, abbreviations, and everyday mistakes that, at times, humans can barely understand each other!
It is even harder for computers: our language is unstructured from their point of view, so they need systems in order to understand it.
For this, NLP adopts a series of techniques, such as discarding what is irrelevant in the text, correcting spelling mistakes, and reducing words to their root forms (stemming) or dictionary forms (lemmatization).
From there, it is possible to structure, segment, and categorize the content to understand how the parts make sense together. The system can then formulate an answer in natural language to interact with the user.
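A toy version of such a pipeline, assuming a hand-written stopword list and a deliberately naive stemmer (real systems rely on libraries like NLTK or spaCy for these resources), might look like this:

```python
import re

# A tiny illustrative stopword list and suffix table; real NLP
# libraries ship far more complete resources than these.
STOPWORDS = {"the", "a", "an", "to", "of", "in", "is", "for", "how"}
SUFFIXES = ("ing", "ed", "es", "s")

def preprocess(text):
    """Sketch of a classic NLP pipeline: normalize, tokenize,
    drop stopwords, and crudely stem each remaining word."""
    text = text.lower()
    tokens = re.findall(r"[a-z]+", text)                # tokenize
    tokens = [t for t in tokens if t not in STOPWORDS]  # remove stopwords
    stemmed = []
    for t in tokens:                                    # naive stemming
        for suf in SUFFIXES:
            if t.endswith(suf) and len(t) > len(suf) + 2:
                t = t[: -len(suf)]
                break
        stemmed.append(t)
    return stemmed

print(preprocess("How to care for the bromeliads?"))  # ['care', 'bromeliad']
```

Only after this kind of cleanup can a system compare, categorize, and respond to what was actually said.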
This kind of system allows, for example, you to say “Alexa, tell me the recipe for a chocolate cake”, and Amazon’s virtual assistant responds with the ingredients and the method of preparation.
This solution is used today in several resources, such as chatbot interactions, automatic text translation, sentiment analysis in social media monitoring, and, of course, Google's search system.
Did BERT replace RankBrain?
Google is continuously studying ways to improve user experience and deliver top results. This does not begin or end with BERT.
In 2015, the search engine announced an update that transformed the search universe: RankBrain.
It was the first time the algorithm adopted artificial intelligence to understand content and search.
Like BERT, RankBrain also uses machine learning, but it does not do Natural Language Processing. Instead, it focuses on query analysis, grouping semantically similar words and phrases, but it cannot understand human language on its own.
So, when a new query is made on Google, RankBrain analyzes past searches and identifies which words and phrases best match that search, even if they don’t match exactly or have never been searched.
As they receive user interaction signals, the bots learn more about the relationships between words and improve ranking.
Therefore, this was Google’s first step in understanding human language. Even today, it is one of the methods used by the algorithm to understand search intentions and page contents in order to present better results to users.
So, BERT did not replace RankBrain — it just brought another method of understanding human language. Depending on the search, Google’s algorithm can use either method (or even combine the two) to deliver the best response to the user.
Keep in mind that Google’s algorithm is formed by a vast complexity of rules and operations. RankBrain and BERT play a significant role, but they are only parts of this robust search system.
How does Google BERT work?
One of the features that sets BERT apart from other language processing systems is its bidirectional character. But what does that mean?
Most other systems are unidirectional: they contextualize each word using only the terms to its left or to its right in the text.
BERT works in both directions: it analyzes the context to the left and right of the word. This brings a much deeper understanding of the relationships between terms and between sentences.
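Here is a toy illustration of why both directions matter. It uses a hypothetical table of "sense clues" instead of a trained model, but the point carries over: the decisive context word can sit on either side of the ambiguous term.

```python
# Hypothetical context clues for two senses of "bank" (illustration only).
SENSE_CLUES = {
    "finance": {"money", "deposit", "account", "loan"},
    "river":   {"river", "water", "fishing", "shore"},
}

def disambiguate(words, target):
    """Pick the most likely sense of `target` by looking at context
    words on BOTH sides of it -- the bidirectional idea behind BERT.
    A strictly left-to-right model would only see the words before it."""
    i = words.index(target)
    left, right = set(words[:i]), set(words[i + 1:])
    scores = {
        sense: len(clues & left) + len(clues & right)
        for sense, clues in SENSE_CLUES.items()
    }
    return max(scores, key=scores.get)

# The decisive clue ("deposit") comes AFTER the ambiguous word:
print(disambiguate(["i", "went", "to", "the", "bank", "to", "deposit", "money"], "bank"))  # finance
# Here the clue ("river") comes BEFORE it:
print(disambiguate(["we", "sat", "by", "the", "river", "bank"], "bank"))  # river
```

BERT does this with learned vector representations rather than word lists, but in both cases the full surrounding context, not just the preceding words, decides the meaning.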
Another distinguishing feature is that BERT can build an accurate model for a specific task with relatively little data. While other models need large task-specific datasets for training, BERT's pre-trained, bidirectional approach allows the system to be fine-tuned accurately with much less data.
So after the model is pre-trained on a text corpus (like Wikipedia), it goes through "fine-tuning".
At this point, BERT is submitted to specific tasks, with inputs and outputs according to what you want it to do. That’s when it starts to adapt to different demands, like questions and answers or sentiment analysis.
Note that BERT is an algorithm that can be used in many applications. So when we talk about Google BERT, we’re talking about its application in the search engine system.
In Google, BERT is used to understand the users’ search intentions and the contents that are indexed by the search engine.
Unlike RankBrain, it does not need to analyze past queries to understand what users mean. BERT understands words, phrases, and entire content just as we do.
But also realize that this NLP model is only one part of the algorithm. Google BERT understands what words mean and how they relate to each other.
But Google still needs all the work of the rest of the algorithm to associate the search to the index pages, choose the best results, and rank them in order of relevance to the user.
Why is Google BERT important for the search experience?
Now, we’ll leave the IT terms aside for a bit to talk about what BERT means to Google searches.
You understand that the algorithm helps Google decipher the human language, but what difference does it make to the user’s search experience?
It is important to remember that Google’s mission is to organize all the content on the web to deliver the best answers to users.
For this, the search engine needs to understand what people are looking for and what web pages are talking about. Thus, it can make the correct match between keywords and web content.
For example, when you search for "food bank", the search engine understands that the "bank" in your query refers to neither a financial institution nor the bank of a river.
If you searched for “food bak” (with misspelling) or “bank food” (in reverse order), it would also understand what you meant.
With BERT, it understands the meaning of that word in your search terms and in the indexed pages’ contents.
When indexing a page with the word "bank", the algorithm sorts pages about food banks, riverbanks, and financial institutions into different buckets.
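One way to picture those buckets is a simple inverted index. The page names and term lists below are hypothetical, purely for illustration:

```python
# Toy pages, each about a different sense of the word "bank"
# (hypothetical slugs and terms, for illustration only).
pages = {
    "food-bank-near-me":   {"food", "bank", "donation", "charity"},
    "open-a-bank-account": {"bank", "account", "savings", "interest"},
    "riverbank-erosion":   {"river", "bank", "erosion", "water"},
}

def build_index(pages):
    """Build an inverted index: each term maps to the pages containing it."""
    index = {}
    for page, terms in pages.items():
        for term in terms:
            index.setdefault(term, set()).add(page)
    return index

def search(index, query_terms):
    """Return pages matching ALL query terms -- disambiguating 'bank'
    by the other words in the query, not by 'bank' alone."""
    results = [index.get(t, set()) for t in query_terms]
    return set.intersection(*results) if results else set()

index = build_index(pages)
print(search(index, {"food", "bank"}))   # only the food-bank page matches
```

"Bank" alone matches all three pages; it is the companion word "food" that narrows the result to the right bucket, which is exactly the kind of context Google now reads.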
But the search engine goes further: it also understands the intention behind the search.
By doing this search, Google understands that you are searching for food banks near where you are. So the results page will probably show the institutions that provide this kind of service in your region, especially if they have a good local SEO strategy.
This way, Google becomes more intelligent to deliver results that really provide what users want to find. This is the search experience that Google wants to offer.
However, in Google's early days, not every search delivered what the user was looking for: the search engine was limited to exact keyword matches.
That is, when a person typed "bromeliad care", for example, it could only return pages that used precisely that term.
Since RankBrain came out, Google has already started to understand that “care” is very close to “how to care”. So, the search engine would also show pages with the terms “how to take care of bromeliads”.
BERT makes Google understand that the person wants to know how to take care of bromeliads without sticking to the exact keywords.
The problem is that Google's original exact-match model encouraged bad habits across the web. To appear in the search engine, many sites started inserting keywords into their text exactly as users would type them, which makes for a very poor reading experience.
Think about it: would you prefer to read content that speaks naturally about taking care of bromeliads, or a text that repeats "bromeliad care" several times where it makes no sense?
So, Google’s shift to understanding search intentions also improves the user’s reading experience.
Sites are oriented to produce content with a natural language, using terms that make sense to the reader.
With this, Google also combats keyword stuffing, a black hat practice that violates search engine policies. Therefore, the user only benefits!
What are the impacts of BERT on SERPs?
When Google launched BERT, it said that the update would affect about 10% of searches in the United States.
Like every algorithm update, the announcement generated a movement in the SEO market, as many sites feared losing positions.
However, unlike updates that aim to counter bad practices, BERT did not penalize any sites. What it does is improve the alignment between user searches and page content.
Therefore, if a page lost positions for a particular keyword, it is because the page was not a good answer to that query.
On the other hand, a page Google considers relevant was probably realigned to queries it answers better, improving the quality of its traffic: visitors became more likely to find what they wanted in the content.
Google shared an example to illustrate the changes BERT brings to SERPs, comparing how a search looked before and after the update.
The keyword is “2019 brazil traveler to USA need a visa”. BERT understands the user’s intention to know if Brazil’s travelers need a visa to enter the United States.
Before the update, however, Google understood that the search was for information on U.S. tourist visas to Brazil.
The big difference is in one detail: the word “to”, which indicates the direction of the trip (from Brazil to the USA).
Before BERT, bots ignored this word, and the search engine returned incorrect results. Now, every word is analyzed in its context, and in this case, the preposition changes the meaning of the whole phrase.
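A quick sketch shows why discarding a stopword like "to" is so damaging. `keyword_terms` is a made-up helper that mimics old bag-of-keywords matching:

```python
def keyword_terms(query):
    """Strip stopwords and sort, mimicking bag-of-keywords matching
    that ignores word order and function words."""
    stopwords = {"to", "a", "the"}
    return sorted(w for w in query.lower().split() if w not in stopwords)

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"   # the opposite direction!

# Keyword matching cannot tell the two questions apart:
print(keyword_terms(q1) == keyword_terms(q2))   # True
# Yet word order and the preposition "to" carry the actual meaning:
print(q1.split() == q2.split())                 # False
```

Two queries with opposite meanings collapse into the same bag of keywords, which is precisely the failure mode BERT's contextual reading avoids.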
In BERT's announcement, Google also said that the update would affect featured snippets, the highlighted boxes that appear in the SERP's "position zero".
Google started to select the most relevant snippets for searches. Once again, those who lost featured snippets were not penalized; they simply did not deliver the best quick answer to what the user searched for. Google shared another example of this.
In the search "parking on a hill without curb", the search engine used to put much more emphasis on the words "parking", "hill", and "curb", ignoring the word "without".
As a result, it returned pages explaining how to park on a curb. BERT understands that the user wants to know how to park on a slope that has no curb.
Content and SEO: how to optimize for BERT?
So, in the face of the update announced by Google and the changes in the SERPs, what can you do to improve your SEO results?
Well, the truth is that there’s not much to optimize for BERT.
If you came to this article looking for optimization tricks, that may be disappointing. But understand that Google made this update precisely to prevent sites from optimizing pages and content for bots.
The search engine wants to offer content of value to users and wants to count on your site for that.
So do not optimize your site for BERT — optimize for users. That’s why we didn’t bring optimization tips, but we want to reinforce some good content production practices to offer the best experience to your visitor.
Forget the exact matching of keywords
Both RankBrain and BERT send the same message: content should be made for people, not bots! So, forget exact keyword matching.
In order to match users' searches exactly, many people still strip out function words (called stopwords, such as "to", "a", "from", and "one"), trying to get closer to the terms users type.
This produces texts over-optimized for queries like "bike how to choose", which makes for a strange reading experience, to say the least.
Another distortion is optimizing texts for the spelling mistakes users make. So, instead of writing "lawyer", as would be correct, the text uses "lawer", because many people might type it that way.
Besides not helping SEO at all, this makes the site lose credibility!
So write naturally and in good English about how to choose a bike and how to hire a lawyer. Do not worry about stopwords or spelling mistakes.
Remember that Google understands natural language, so you don’t have to (and shouldn’t!) push it to exactly match the users’ search terms.
Optimize for search intentions
Ok, so exact-match keywords are no longer the focus of SEO. How, then, should content be optimized to appear in users' searches?
Instead of focusing on keywords, shift the focus to search intentions.
If you used to focus on optimizing what the user searches for, you should now optimize what the user wants to find. Do you see the difference?
The secret is to understand your buyer persona's intentions: the questions they want answered that your site can address.
You can uncover these by conducting keyword research and benchmarking, identifying search trends in your field, and spotting ranking opportunities. From this understanding of what the public wants, it is up to the production team to create high-quality content that responds to it.
Explore the semantic relationships between words
Perhaps another question has come up: if exact matching no longer drives SEO, does keyword research still make sense?
Of course it does! Keyword research remains a powerful planning tool.
With it, you can understand which searches lead to your site, which terms users are using, and which subjects are on the rise in your field. Thus, it is possible to plan the guidelines to meet these searches.
The difference is that you will no longer over-optimize blog articles with these exact terms. What you can do now is identify the main search terms and look for words that establish semantic relationships with them.
Synonyms, antonyms, slang, and co-occurrences are part of the semantic field of a word. So instead of repeating a keyword several times, you can explore these variations in your text, along with the main terms.
This practice enriches the reading experience and helps Google understand the meaning of your materials.
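As a rough sketch, you could even measure how a draft balances a head term against its semantic field. The term list below is hypothetical; in practice it would come from keyword research tools, not a hand-written dictionary:

```python
# Hypothetical semantic field for one head term (illustration only).
SEMANTIC_FIELD = {
    "bike": ["bicycle", "cycling", "road bike", "mountain bike"],
}

def coverage(text, head_term, field):
    """Count how many related terms a draft uses alongside the head term,
    instead of repeating the head term itself over and over."""
    text = text.lower()
    related = [t for t in field.get(head_term, []) if t in text]
    repetitions = text.count(head_term)
    return {"head_term_uses": repetitions, "related_terms": related}

draft = ("Choosing a bike starts with how you ride: a road bike suits "
         "pavement, while cycling on trails calls for a mountain bike.")
print(coverage(draft, "bike", SEMANTIC_FIELD))
```

A draft that scores well on related terms without piling up repetitions of the head term is usually closer to the natural writing this section recommends.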
Produce quality content
This orientation seems obvious, but it is always good to reinforce. Basically, Google wants you to produce quality content for people. Google BERT is one of the main updates in this sense.
So, don’t waste any more time thinking about optimizing for one term or another.
In addition to meeting the search intentions, dedicate yourself to creating original, updated, reliable, and useful content for users. Build content that is worth reading and sharing.
Google advises that high-quality content should demonstrate a high level of E-A-T: expertise, authoritativeness, and trustworthiness.
So it is these words that should guide your Content Marketing strategy. Google will know how to recognize your work.
Offer the best reading experience
Finally, always think about the reading experience. You know that book that you just can’t put down? Or that article that enriches you with so much good information?
Get inspired by them!
Understand how these contents are built, how they tell stories, and involve the reader. Of course, you’ll have to adapt the format and language for the internet, with scannability features and the use of links and images, for example.
This is what you must do in your texts to engage the audience and make the readers return. In SEO, this engagement sends positive signals to Google, saying that you offer a good experience and deserve to earn ranking points.
Now you know the details of Google BERT and the impact this update has had on the world of SEO.
You can see that Google is not kidding, right?
The most advanced technologies in artificial intelligence are being employed to improve the search engine’s experience, both on the side of the website and the user. And, of course, the investments won’t stop at BERT. We will be here to follow this evolution with you.
Do you want to improve your digital strategy and bring more visitors to your channels? Then, check out our complete SEO guide and reach top Google results!