BERT explained for marketers. Should Google's latest update worry you?

Written by Osian Barnes | 1 November 2019

Widely touted as the biggest change to the Google algorithm in years, the BERT update has been the subject of much media attention. But what is it, and should content creators be worried about it?

What is the BERT update?

BERT stands for Bidirectional Encoder Representations from Transformers, of course. It’s a technique for NLP (Natural Language Processing) pre-training, developed by engineers at Google, that’s now being harnessed to improve the quality and precision of search results.

Sadly, despite all the pictures of Bert and Ernie adorning the top of SEO articles, it has nothing to do with Sesame Street.
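For the technically curious, the research behind BERT is open source, so you can poke at the same family of models yourself. Below is a minimal sketch, assuming the Hugging Face transformers library (plus PyTorch) and the public bert-base-uncased checkpoint, which is emphatically not Google’s production search system. It illustrates the core pre-training trick: the model reads a whole sentence at once and predicts a hidden word from the context on both sides of the gap, which is the ‘bidirectional’ part of the name.

```python
# A minimal sketch, assuming the open-source Hugging Face "transformers"
# library and the public bert-base-uncased checkpoint. This illustrates
# the pre-training idea only; it is not Google's search system.
from transformers import pipeline

# BERT is pre-trained to fill in a masked word using the words on
# BOTH sides of the gap, rather than reading left to right only.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("If it rains tomorrow, remember to bring an [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```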

Why should we worry about BERT?

As Google themselves have stated that 1 in 10 English-language searches in the US will be affected by the release of BERT ‘into the wild’, it’s unsurprising that brands, marketers and content creators are all concerned about its impact on rankings, traffic and featured snippets.

In fact, on its release at the end of October, several big media hitters (including the NYT) reported dips in their traffic, which were initially ascribed to BERT. These scare stories were later retracted, however, and the losses attributed to other elements of the search giant’s mysterious algorithmic workings.

So, what does BERT really do?

Cracking the problem of intentionality in search has been one of Google’s major preoccupations in recent years, intensified and assisted by the rise of voice search and the data gleaned from it. And this work has come to fruition with the advent of BERT.

According to Pandu Nayak, Google’s Vice President of Search, instead of simply looking at longer search queries on a word-by-word basis (presumably prioritising certain results based on their placement in a sentence and other variables), BERT now allows Google to parse search phrases in a much more human way. Notably, this includes taking into account linguistic modifiers to drill down on intention.

“At Google’s core, Search is understanding the language we use. And by applying BERT models to both rankings and featured snippets in search, we’re able to understand and do a much better job helping everyone find useful information in their search results.”

Pandu Nayak, Google’s Vice President of Search

Now, it might say quite a lot about the secretive, smoke-and-mirrors world of Google’s search function that I was under the impression they already did this kind of parsing. But apparently not.

For the first time, it turns out, prepositions and other modifiers will be factored into search analysis. And Google has supplied a few (already famous) examples of this in action.

BERT in action

Take the long search term:

"2019 brazil traveler to USA need a visa"

Previously, they tell us, Google would have interpreted this as a request for information about US travel to Brazil. Now, however, taking into account the distinct use of the preposition in the search, Google correctly infers that the searcher wants to know whether a Brazilian traveling to the US will need a visa.

Another example is the way Google will now parse:

"Can you get medicine for someone pharmacy"

The old algorithm would have just picked out “medicine” and “pharmacy” from the query and returned local results for nearby pharmacies. With BERT, the algorithm will now zero in on the phrase “for someone” and work out that you are looking for information about picking up a prescription for someone else.
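We can’t query Google’s ranking system directly, of course, but the same open-source checkpoint hints at why those little words matter. Here is a rough, hypothetical sketch (again assuming Hugging Face transformers and PyTorch, and using the textbook ‘bank’ example rather than anything Google has published): the vector BERT produces for a word depends on every other word around it, which is exactly the context a bag-of-keywords approach throws away.

```python
# Hypothetical illustration only: Google's ranking system is not a public API.
# This uses the open-source bert-base-uncased checkpoint to show that BERT
# gives the SAME word a DIFFERENT vector depending on the words around it.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` as used in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state[0]
    position = inputs["input_ids"][0].tolist().index(
        tokenizer.convert_tokens_to_ids(word)
    )
    return hidden_states[position]

money_bank = vector_for("she paid the cheque into the bank", "bank")
river_bank = vector_for("they sat down on the river bank", "bank")

# The two vectors differ because the surrounding words differ;
# a plain keyword match would treat the two "bank"s as identical.
print(torch.cosine_similarity(money_bank, river_bank, dim=0).item())
```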

What does BERT mean for your traffic?

Maybe not a huge amount. Apart from the fact that it is, right now, only rolled out for English-language searches conducted in the US, it will mainly affect the accuracy and relevance of long-tail search results.

But it does show, in clear and stark relief, the direction of travel for Google search. It demonstrates they are expecting more and more searches to be conducted in an increasingly conversational style (as voice becomes more central to our digital experience).

What does BERT tell us?

Above all, BERT reminds us of the scale of the company’s ambition: it intends to make interacting with its search engine as much like talking to a human being as possible, delivering results that pick up on subtle linguistic nuance and intentionality, whilst drawing inference from context in myriad ways.

So, the message is this: we can’t really optimise for BERT, nor should we want to. Instead, we should be reassured that, in future, Google will be able to see great content for what it is, and that there will be no need to artificially hack or ‘optimise’ content for rankings and snippets.

Which is great news for content writers. The more precisely we understand our ideal audience and their concerns, and the higher the quality of the content we create for them, the more likely it is that Google will direct them to us.

Absolutely nothing to do with Sesame Street, then?

OK, you pushed me. How about this?

Think about Bert and Ernie.

Ernie is the Everyman figure: always curious, always looking for information, continually searching (however aimlessly and obtusely) for the gratification of knowledge. Bert, on the other hand, is the pedant: always wanting to clarify, to refine the argument, to reach the right conclusion in the quickest possible way.

This latest Google update should satisfy him. 

So, all that’s left is for us to worry about Ernie: producing the content that can capture his wayward imagination and answering his questions in the most arresting and compelling way possible.

Perhaps Google is dealing with BERT, meaning we can concentrate more on Ernie.