Google’s applying BERT models to search
Bidirectional Encoder Representations from Transformers – BERT for short – sounds complicated: a mouthful of artificial intelligence and machine learning jargon that most marketers assume they don’t need to understand. For the most part, they’re right.
BERT models are all about allowing machines to better understand natural human language, particularly conversational, question-and-answer queries. For example, if you ask Google whether you can pick up someone else’s medicine at a pharmacy, you wouldn’t usually expect to see ads and content about filling prescriptions for yourself. BERT allows Google to grasp the intent behind such a query and return results that actually answer it, rather than simply matching the closest keywords.
As with any major change to Google’s algorithm, this new announcement has webmasters and marketers wondering how their organic search traffic will be affected.
It is important to understand that the changes currently apply to about 1 in 10 search queries, and only in the USA. Since BERT models apply mainly to conversational queries – questions and answers – the effects will mostly be seen on long-tail queries and on voice search. Your organic rankings for head terms of two or three words are unlikely to be affected in the near term.
The future implications of applying BERT models to search are exciting. For starters, this represents a much better user experience – one where your searches get the right sort of answer without you needing to second-guess how the algorithm might interpret your query!
[Free Download] Online Multichannel Marketing Guide for Sports
Our multichannel sports marketing guide outlines the 4 most popular acquisition activities in the sports industry, linking them to the marketing channels most likely to help your brand get that conversion. If you haven’t already done so, you can download it by clicking the button below.