Should we be afraid of Google BERT?


Post by nusaibatara »

⏲ Reading time: 7 minutes
Google's latest update has SEO experts confused. Google BERT is an improved version of the Google RankBrain algorithm. Google calls this update "one of the biggest leaps forward in search engine history." But in my opinion, the algorithm's impact is actually minimal.

So what to think?

What is BERT?
The BERT algorithm is an open-source neural network that enables natural language processing. The acronym stands for "Bidirectional Encoder Representations from Transformers" and was introduced by Google last year.

BERT is a technology that produces contextual word embeddings, which makes it possible to disambiguate words. This is its main strength, but unfortunately also its biggest weakness: the technique is computationally intensive and therefore quite expensive to use at scale.

With word2vec and similar models, you can pre-calculate the vector for each word, save it in a database, and retrieve it later. This is not the case with BERT, which must recalculate the vectors for every sentence it processes.

What is the difference between word embedding (word2vec) and BERT?
Here is an example:

Sentence 1: The lawyer ("avocat") meets his client several times before the trial.
Sentence 2: To choose a good avocado ("avocat"): pay attention to the color and appearance.
If I want the algorithm to understand the meaning of the French word "avocat," which refers to both a profession and a fruit, with word embedding I simply extract THE pre-calculated vector from the database. With this approach, there is only one vector for the same word.

Conversely, since BERT is a technology that creates "contextualized" vectors, I have to run both sentences through the BERT network. BERT will generate two very different vectors for the word "avocat," which appears in two very distinct contexts. This is called lexical disambiguation.
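The contrast can be sketched with a toy example. The vectors and the "contextual encoder" below are made up for illustration (a real BERT model computes vectors by running the full sentence through the network); only the pattern matters: a static embedding returns one stored vector per word, while a contextual encoder returns different vectors for the same word in different sentences.

```python
import numpy as np

# Toy sketch (made-up vectors, not a real model) for the French word
# "avocat" (lawyer / avocado).
STATIC = {"avocat": np.array([0.5, 0.5])}  # one precomputed vector per word

def static_lookup(word, sentence):
    # word2vec-style: context is ignored, the stored vector is returned as-is.
    return STATIC[word]

def contextual_encode(word, sentence):
    # Stand-in for BERT: the output depends on the surrounding words.
    # A real model re-runs the whole sentence through the network each time.
    if "trial" in sentence or "client" in sentence:
        return np.array([0.9, 0.1])  # "lawyer" sense
    return np.array([0.1, 0.9])      # "avocado" (fruit) sense

s1 = "The avocat meets his client several times before the trial"
s2 = "To choose an avocat well, pay attention to the color"

# Static embedding: both occurrences collapse to the same vector.
assert np.array_equal(static_lookup("avocat", s1), static_lookup("avocat", s2))

# Contextual encoding: the two occurrences get clearly different vectors.
v1 = contextual_encode("avocat", s1)
v2 = contextual_encode("avocat", s2)
cos = float(v1 @ v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(f"cosine similarity across contexts: {cos:.2f}")  # prints 0.22
```

This is also why BERT vectors cannot simply be precomputed and cached per word, as word2vec vectors can: the output depends on the entire sentence, so it must be recomputed every time.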

Thanks to its disambiguation capabilities, BERT is at the forefront of many natural language understanding tasks. However, this ability also makes it computationally intensive and therefore difficult to exploit. This is why this type of algorithm is powerful in analyzing questions or rewriting content. However, it does have limitations.

For now, its computational cost limits how widely it can be applied to search queries. There is no doubt that improvements will be made in the near future and that processing requirements will eventually be reduced.

How does BERT help Internet users?
The Google BERT update allows users to get better results from long, conversational queries (Google is still focused on optimizing for voice search). Today, you no longer need to type an unstructured string of keywords to be understood by Google.

While the Google BERT update directly affects web users, it's more complicated for content specialists. Google has improved its understanding of context, but that doesn't mean you should start writing thousands of pages targeting the long tail.

This update instead encourages you to continue writing well-organized, rich and comprehensive content.

What are the effects of BERT on SEO?
Opinions on the impact of this update are not unanimous. Google claims it is the "biggest leap forward in five years," with 10% of search results affected by the change.

SEO consultants and experts, however, believe the impact will be minimal. They don't predict significant changes because they don't monitor long-tail conversational queries. They mostly monitor shorter phrases and head terms that aren't impacted by BERT's natural language processing capabilities.

BERT is considered a major update, so it's no surprise that its arrival comes with a host of misinformation. Unfortunately, I've already read statements from SEO experts claiming to specialize in BERT optimization. Following the advice of these misguided experts means wasting time and money on an ill-advised strategy built on poor tactics.

Is it possible to optimize for BERT?
Let's be clear, BERT is not a ranking factor. And it's not possible to optimize a page for a factor that doesn't exist. BERT helps in understanding long-tail search queries and, therefore, improves the relevance of results.

BERT models are applied to both organic search results and featured snippets. While it's possible to optimize content for these types of queries, you can't "optimize for BERT."

At SEOQUANTUM, we take BERT into account, but we do not optimize for BERT.

We recommend focusing more on long-tail phrases and the user's search intent.

BERT isn't the only reason to prioritize long-tail keywords. A complete, rich page will always appear in at least a few hundred searches. Take a look at this example:

[Screenshot: keyword ranking report for an article on natural language processing]
This article on natural language processing ranks for over 100 keywords. Increasing the use of long-tail search terms, such as “natural language processing with machine learning,” “NPL stands for natural language processing,” and many others, would help improve its ranking.

Now, let's say you decided to create new content, instead of optimizing the current page.

What long tail phrase would you target?

There are many expressions that could be added to form sentences that will “please” BERT:

The best
What is a good
How to find
For Dummies
For beginners
My prediction about Google BERT
Google BERT will affect 20% of searches within a year, double its current percentage.

People often type a string of keywords because they think Google understands this type of query better, even though that's not how they would naturally ask a question. It's not natural.

As more and more people discover Google's ability to understand complex queries, the way they use the search engine will change. They'll move from using a string of keywords to using longer phrases. It's simply human nature.
