BERT is a Major Google Update

According to Google, this update will affect complicated search queries that depend on context.
This is what Google said:
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results. Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
What is the BERT Algorithm?
Search algorithm patent expert Bill Slawski (@bill_slawski of @GoFishDigital) described BERT like this: “Bert is a natural language processing pre-training approach that can be used on a large body of text. It handles tasks such as entity recognition, part of speech tagging, and question-answering among other natural language processes. Bert helps Google understand natural language text from the Web. Google has open sourced this technology, and others have created variations of BERT.”

The BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep-learning algorithm related to natural language processing. It helps a machine understand what the words in a sentence mean, with all the nuances of context.
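To make the “bidirectional” idea concrete, here is a deliberately tiny sketch of the masked-word task BERT is pre-trained on. This is an assumption-laden simplification, not the real model: the corpus, the `predict_masked` helper, and the simple neighbour-matching rule are all invented for illustration. The point it shows is that the hidden word is guessed from the words on both sides of it, not just the words to its left.

```python
# Toy sketch of the masked-word idea behind BERT (a simplification,
# NOT the real model): guess a hidden word from the words on BOTH
# sides of it, instead of only the words that come before it.

from collections import Counter

# Hypothetical mini-corpus standing in for the large body of text
# a real model is pre-trained on.
CORPUS = [
    "the angler caught a striped bass",
    "the angler caught a large bass",
    "the angler hooked a striped bass",
]

def predict_masked(sentence):
    """Fill the [MASK] token using its left AND right neighbours."""
    tokens = sentence.split()
    i = tokens.index("[MASK]")
    left, right = tokens[i - 1], tokens[i + 1]
    votes = Counter()
    for line in CORPUS:
        words = line.split()
        for j in range(1, len(words) - 1):
            # Count every corpus word seen between this exact pair
            # of neighbours, on either side of it.
            if words[j - 1] == left and words[j + 1] == right:
                votes[words[j]] += 1
    return votes.most_common(1)[0][0] if votes else None

print(predict_masked("the angler caught a [MASK] bass"))
# -> striped
```

A real model replaces the neighbour-counting with a deep neural network, but the training signal is the same: predict the masked word from its full surrounding context.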
BERT And On Page SEO
I asked search algorithm expert Dawn Anderson what that meant for SEOs, and she responded that it won’t help poorly written websites.
According to Dawn:
“BERT and family improve the state of the art on 11 natural language processing tasks. Even beating human understanding since linguists will argue for hours over the part of speech a single word is. But what if the focus of a page is very weak? Even humans sometimes will be like “what’s your point?” when we hear something. And pronouns have been very problematic historically but BERT helps with this quite a bit. Context is improved because of the bi-directional nature of BERT. There will still be lots of work for us to do since we need to emphasise importance, utilise clear structures, help to turn unstructured data into semi structured data, utilise cues on content light pages (e.g. image heavy but not text heavy eCommerce pages) using such things as internal linking.”
BERT Improves Search Query Understanding
Google’s BERT update improves how Google understands search queries. BERT analyzes search queries, not web pages. However, as Dawn said, on-page SEO becomes more important in terms of using words precisely. Sloppy content may not benefit from the Google BERT update.
Dawn Anderson observed:
“It’s knocking human understanding out of the water in loads of natural language understanding tasks. BERT is like a WordPress plugin which is a starting point and then they customise it and improve it. The word “rose” means several things but it’s exactly the same word. The context must accompany the word otherwise the word means nothing.”
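Dawn’s “rose” example can be sketched in code. The snippet below is a toy word-sense disambiguator, not BERT: the two senses and their cue words are hypothetical, chosen only to show that the same word resolves to different meanings depending on the words around it.

```python
# Toy word-sense disambiguation (NOT real BERT): the same word,
# "rose", means different things depending on its surrounding context.

SENSE_CUES = {  # hypothetical cue words for two senses of "rose"
    "flower": {"garden", "red", "thorn", "bloom", "picked"},
    "past tense of rise": {"sun", "early", "prices", "slowly"},
}

def disambiguate(tokens, target):
    """Pick the sense whose cue words best overlap the full context."""
    context = {t.lower() for t in tokens if t.lower() != target}
    scores = {sense: len(cues & context) for sense, cues in SENSE_CUES.items()}
    return max(scores, key=scores.get)

print(disambiguate("she picked a red rose in the garden".split(), "rose"))
# -> flower
print(disambiguate("the sun rose early that morning".split(), "rose"))
# -> past tense of rise
```

As Dawn says, without the accompanying context the word alone carries no usable meaning, which is exactly what the overlap scores capture here.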
An Example of Context and BERT
Consider the phrase “how to catch a cow fishing?” In New England, the word “cow” in the context of fishing means a large striped bass, a popular saltwater game fish that millions of anglers pursue on the Atlantic coast.