Google’s new BERT AI algorithm (SEO) update:
BERT:
BERT, which stands for Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing pre-training.
An NLP API is an interface to an existing natural language processing model.
BERT is a method of pre-training language representations, meaning that we train a general-purpose “language understanding” model on a large text corpus.
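To make "pre-trained language understanding model" concrete, here is a minimal sketch of loading such a model and pulling contextual representations for a sentence. It assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint, neither of which is named by Google's announcement; it is an illustration, not part of the update itself.

```python
# Minimal sketch: load a pre-trained BERT and get contextual token embeddings.
# Assumes the Hugging Face "transformers" library and "bert-base-uncased".
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT is a pre-trained language model.", return_tensors="pt")
outputs = model(**inputs)

# One vector per token, shaped (batch_size, num_tokens, hidden_size).
# Each vector already encodes the surrounding context, which is what
# "general-purpose language understanding" refers to here.
print(outputs.last_hidden_state.shape)
```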
Here’s a strange sort of kicker that might fly in the face of your instincts: you don’t actually have to do anything to “respond to” or “prepare for” BERT.
How does BERT work?
The breakthrough of BERT is its ability to train language models on the entire set of words in a sentence or query (bidirectional training), rather than the traditional way of training on an ordered sequence of words (left-to-right, or combined left-to-right and right-to-left).
BERT allows the language model to learn word context based on all of the surrounding words, rather than just the word that immediately precedes or follows it.
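You can see this bidirectional behavior with a masked-word prediction demo. In the hedged sketch below (again assuming Hugging Face transformers and bert-base-uncased, which the article does not mention), the masked word sits in the same position in both sentences; only a word that comes *after* the mask changes, yet it steers the prediction, which a purely left-to-right model could not use.

```python
# Hedged illustration: the same masked slot is filled differently depending
# on words to the RIGHT of it, showing that BERT uses context on both sides.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Only the verb after the mask differs between the two sentences.
for text in [
    "The [MASK] barked at the mailman.",
    "The [MASK] meowed at the mailman.",
]:
    top = fill_mask(text)[0]  # highest-scoring completion
    print(f"{text} -> {top['token_str']} ({top['score']:.2f})")
```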
What other Google products might BERT affect?
Google’s announcement about BERT pertains to Search only; however, there will be some impact on the Assistant as well. When queries conducted on Google Assistant trigger featured snippets or web results from Search, those results may be influenced by BERT.
Google has told Search Engine Land that BERT isn’t currently being used for ads, but if it is integrated in the future, it may help alleviate some of the poor close-variant matching that plagues advertisers.