BERT Has Finally Arrived – Here’s What You Need to Know

Google open-sourced its Bidirectional Encoder Representations from Transformers (BERT) neural network in November 2018. That was the first many of us heard of BERT. Now it's official: BERT was launched for English-language searches at the end of October 2019. It is only a matter of time before it rolls out universally.

Some are saying that BERT is the most significant upgrade to Google search algorithms since the introduction of RankBrain. Others put it in the same league as Panda. Such differing opinions do not matter all that much at the end of the day. What’s most important is understanding what BERT does and how it affects your SEO efforts.

BERT in Simple English

We could get into all the technical aspects of BERT and confuse the heck out of you. We don’t want to do that. So instead, we will keep it as simple as possible and touch only on the SEO aspects. The first thing to understand is exactly what BERT does. Remember one fundamental principle: BERT is essentially a neural network.

A neural network is a system of algorithms capable of scanning data, recognizing patterns, and then acting according to those patterns. This is exactly what BERT does. By the way, RankBrain does the same thing. So what’s different about BERT? It combines neural network principles with natural language processing (NLP).

NLP is the branch of artificial intelligence that lets machines interpret human language; it is the same technology that makes speech recognition possible. It can figure out the meaning of words based on their context. This is what sets BERT apart from its predecessor. The best way to understand this is to think of prepositions.

Words like ‘to’ and ‘for’ can drastically change the meaning of a sentence. Prior to BERT, Google searches paid very little attention to prepositions. As such, search results did not always line up with the intent of the user. BERT uses the principles of natural language processing to account for prepositions and other words that Google would have previously ignored.

It is Bidirectional

BERT's bidirectional nature is perhaps the most fascinating aspect of this new technology. Being bidirectional means that BERT's algorithms can analyze tricky words in both directions. Rather than relying only on the word that precedes the problem word, BERT can look at multiple words on either side. It can then compare all those words against predefined data sets in order to determine intent.

Here is a basic example:

A pre-BERT search for the phrase ‘instructional Spanish DVDs for adults’ would likely return results featuring products intended for all ages. A BERT search recognizes the word ‘for’ to return results only for adults.
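As a toy sketch (ours, not Google's actual code, and nowhere near a real language model), the difference can be shown as a simple context-window exercise: a unidirectional approach only sees the words before a target word, while a bidirectional one sees words on both sides.

```python
def unidirectional_context(words, index, window=2):
    """Return only the words *before* the target word (pre-BERT style)."""
    return words[max(0, index - window):index]

def bidirectional_context(words, index, window=2):
    """Return words on *both* sides of the target word (BERT style)."""
    before = words[max(0, index - window):index]
    after = words[index + 1:index + 1 + window]
    return before + after

query = "instructional Spanish DVDs for adults".split()
target = query.index("for")  # the 'tricky' word

print(unidirectional_context(query, target))  # ['Spanish', 'DVDs']
print(bidirectional_context(query, target))   # ['Spanish', 'DVDs', 'adults']
```

Only the bidirectional version ever sees 'adults', which is exactly the word that reveals the searcher's intent in this example.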

The same ability to account for previously ignored words in search phrases also applies to rich snippets. BERT will now analyze rich snippets for the same sort of contextual meaning. This should make snippets more effective for returning targeted results.

Its Impact on Searches

For the time being, BERT is expected to impact one in every 10 English-language searches. That may not seem like a big deal, but it is. Also bear in mind that neural networks and natural language processing both improve over time. They are like a good wine. The larger the data sets they have to work with, the better they do at interpreting user intent.

This suggests that even if your website is not impacted right now, it will be at some point. BERT will eventually be able to determine user intent almost as accurately as speech recognition software. At that point, the content on your site will matter a whole lot more.

How to Respond to BERT

Now the big question: how should you respond to BERT? The answer depends on your current content creation and SEO strategies. From our perspective, it is all about how much effort you are currently putting into your website.

Assuming you already follow the most up-to-date white hat practices, there is not much for you to do. As long as you already publish high-quality, SEO-optimized content, there is little extra you can do to optimize your site for BERT. Basically, keep doing what you're doing.

If you are not following the latest white hat practices, here's what you can do to make the most of BERT:

  • Start producing high-quality content that speaks directly to the searches your target audience is likely to run.
  • Focus on making both existing and new content longer. BERT can do more with 1,000 words than with 500.
  • Make a point of developing long tail keywords and phrases that account for how people actually speak.
  • Develop keywords and phrases more attuned to voice searching.

One final point that we cannot emphasize enough: because BERT uses natural language processing, it is imperative that your content be grammatically correct. BERT accounts for every word written, so improper grammar is bound to negatively impact your search results.

To that end, it might be a good idea to turn content creation over to trained professionals. Making the most of BERT is now about high-quality content with flawless grammar, excellent syntax, and a logical flow of thought from start to finish.
