At the end of October 2019, Google announced that it was applying BERT (Bidirectional Encoder Representations from Transformers) to ranking and featured snippets within Search.

BERT is an open-source, neural-network-based technique for Natural Language Processing (NLP). In simple terms, it is a very powerful model for understanding the context of a user’s search query, with the aim of improving the relevancy of the results served. Google has stated that the change will affect 1 in 10 queries, most notably longer, more natural or conversational queries. Consider the word “date”. It has at least four distinct meanings (a calendar date, a romantic date, the fruit, the verb to date), and without the context that surrounds it, we cannot know which one is intended.
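For the more technically minded, the sketch below shows this context-dependence in action. It is a minimal illustration using the open-source Hugging Face transformers library and the publicly released bert-base-uncased checkpoint, not Google’s production Search system (whose details are not public): the same word “date” receives a different vector depending on the sentence around it, which is what lets the model tell the senses apart.

```python
# Minimal sketch: BERT assigns the same word different vectors in
# different contexts. Assumes the `transformers` and `torch` packages
# are installed; the queries below are illustrative examples.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

calendar = embedding_for("what is the date of the next world cup", "date")
fruit = embedding_for("how many calories are in a date", "date")
romance = embedding_for("ideas for a first date", "date")

# Embeddings of the calendar sense should sit further from the fruit
# sense than the fruit and romance senses sit from each other.
cos = torch.nn.functional.cosine_similarity
print(cos(calendar, fruit, dim=0))
print(cos(calendar, romance, dim=0))
```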

Since the announcement, the search industry has kept a close eye on organic rankings, looking for signs that Google would penalise sites. It is now evident that the impact on rankings has generally been minimal across the board. Google have also since confirmed that any ranking drops are very unlikely to be related to BERT. Google’s John Mueller stated: “It’s not that we would say that suddenly your page is less relevant. But rather with BERT we would try to understand does this question that someone is asking us, does it match this website best. And usually that’s for more complicated questions.”

So, what can we do to capitalise on BERT as marketers? As is often the case with the Google algorithm, there is no clear answer. The reality is that BERT wasn’t built to penalise sites; it was built to improve the user experience. We should take this opportunity to shift our perspective on content creation for Search. As consumers grow more and more familiar with an ‘assistant’ as a medium for answering an ever-wider spectrum of questions (such as “date for next world cup” or Google’s example of “2019 brazil traveler to usa need a visa”), our content should take a more human-centric approach. We should focus on content written with a natural language flow, and perhaps rely less on the keyword-led techniques we typically associate with better rankings. Where relevant to a brand, we should pay particular attention to providing useful, relevant and fresh information that genuinely benefits the consumer.

[Image: BERT in Search, visa example]

Google’s BERT integration and rollout is still in its infancy. As is the case with all language modelling, the larger the corpus of natural language a model is trained on, the better its understanding becomes. It is likely we will see an increasing emphasis on content that caters naturally to consumers across a continually growing breadth of queries.

I love talking about natural language processing, so much so that I wrote my university dissertation on it. You can read it here. Talk more with me about it at ned@aip.media.