"When people like you or I come to Search, we aren’t always quite sure about the best way to formulate a query. We might not know the right words to use, or how to spell something, because often times, we come to Search looking to learn--we don’t necessarily have the knowledge to begin with." ~ Pandu Nayak, Google Fellow and Vice President
Recently, Google announced that it will be applying BERT models to Search. The breakthrough follows Google’s research on transformers: novel neural network architectures for language understanding that now lead on tasks such as language modelling, machine translation and question answering.
BERT (Bidirectional Encoder Representations from Transformers) is a deep learning model designed to better understand natural language.
Google’s BERT makes use of a transformer: an attention mechanism (encoder) that learns contextual relations between words, or sub-words, in a text. It is considered bidirectional because the transformer encoder reads an entire sequence of words at once, as opposed to traditional directional models that read left-to-right or right-to-left. This makes it a fast and practical approach to capturing word context. Read Rani Horev’s BERT Explained: State of the art language model for NLP for a more detailed explanation.
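To make the "reads an entire sequence at once" idea concrete, here is a minimal sketch of the scaled dot-product self-attention that sits at the heart of a transformer encoder. The shapes and random values are toy examples for illustration, not real BERT weights, and real models also apply learned query/key/value projections that are omitted here:

```python
import numpy as np

def self_attention(x):
    """x: (seq_len, d_model) token embeddings -> contextual embeddings."""
    d = x.shape[-1]
    # Real transformers use learned projections; here q, k, v are the input itself.
    q, k, v = x, x, x
    scores = q @ k.T / np.sqrt(d)                   # (seq_len, seq_len) token-to-token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the *whole* sequence
    return weights @ v                              # each output vector mixes all tokens

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))   # 5 toy tokens, 8-dimensional embeddings
out = self_attention(tokens)
print(out.shape)                   # (5, 8): one context-aware vector per token
```

Because every token attends to every other token in a single step, context flows from both the left and the right, which is exactly what "bidirectional" means here.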
In a nutshell, Google’s new BERT update will affect roughly 10% of all searches – one of the most significant Google updates of the last five years.
Applying BERT models to both ranking and featured snippets in Search is Google’s most advanced approach yet to understanding natural language queries and search intent. In other words, Google can do a better job of surfacing useful information and giving better answers on its SERPs (Search Engine Results Pages) through this deeper understanding and interpretation.
What this means for SEO: sites that optimise for relevant keywords, with content that solves users’ problems or answers conversational questions, will see a positive SEO impact. Equally, sites that are not optimising to solve the problems or answer the questions of their target audience will see a negative one.
1. Find long-tail keyword opportunities and optimise for long-tail keyword targets.
2. Focus on conversational content (i.e. who, what, why, when, how?)
3. Focus on entities (Person, Company, Date, etc.) that help solve problems and answer the questions of your target audience.
4. Improve internal linking & update SERP snippets
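Tips 1 and 2 can be sketched in a few lines. This hypothetical example filters a search-query export (for instance from Google Search Console) down to long-tail, conversational questions worth targeting; the query list, word threshold and question-word set are made-up assumptions for illustration:

```python
import re

# Conversational openers from tip 2 (who, what, why, when, how).
QUESTION_WORDS = ("who", "what", "why", "when", "where", "how")

def long_tail_questions(queries, min_words=4):
    """Keep queries that are both long-tail (min_words+) and conversational."""
    picks = []
    for q in queries:
        words = re.findall(r"[a-z']+", q.lower())
        if len(words) >= min_words and words and words[0] in QUESTION_WORDS:
            picks.append(q)
    return picks

queries = [
    "shoes",
    "buy running shoes",
    "how do I choose running shoes for flat feet",
    "what is the best time to water tomato plants",
]
print(long_tail_questions(queries))
# ['how do I choose running shoes for flat feet',
#  'what is the best time to water tomato plants']
```

Short head terms are dropped, while the question-style long-tail queries – the kind of natural-language searches BERT is built to interpret – are kept as optimisation targets.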
Overall, Google’s BERT updates benefit all types of web users, since they are designed to improve search quality.