Google is injecting new technology, BERT, into its search engine to better interpret the billions of web queries it handles every day.
The change, announced on Friday, October 25, 2019, will move the world's largest search engine from spitting out keyword results toward something "closer to language," says Ben Gomes, the search chief for Alphabet Inc.'s Google. "We are far from fully solving this problem, but it is a big step forward," he said at a press conference.
What is the impact on SEO and website keyword rankings?
Google's use of the BERT model to understand queries will affect keyword search rankings and featured snippets. However, BERT will not be applied to 100% of searches for now.
Initially, BERT will be used on about one in ten English-language searches. Google says BERT is so computationally complex that it pushes the limits of Google's hardware, which is likely why it is being applied to only a limited share of searches.
Google has no peers in web search. But improving core search technology is critical to maintaining its advantage in adjacent areas, notably voice computing, where Google competes with Amazon.com Inc.
The new system is based on Google AI tools that analyze long, complex phrases as language rather than as mere strings of keywords. In tests, Google executives said, it produced more accurate results.
Over the years, Google has moved away from its classic page of 10 blue links. For some queries, such as flight information, it now shows more results from its own services, or extracts a block of text from a website into what it calls "featured snippets." Competitors have complained to regulators that this behaviour is anti-competitive.
Google executives say the new system has generated more featured snippets in search results outside the United States. They insist Google is not diverting traffic from anyone else, arguing that better search results drive more searches and more web traffic overall.
"If people get more of their questions answered, they will ask more questions," Gomes said, adding that this results in more traffic for the whole search ecosystem. He did not share data on these trends.
The new system will first be applied to English-language search results and then expanded to other languages. Google says it does not immediately affect search ads.
What is BERT?
BERT, or Bidirectional Encoder Representations from Transformers, is the neural-network technique behind this update. Google first described BERT last year and open-sourced its implementation code and pre-trained models. Transformer-based models are especially well suited to data where the order of elements matters, which makes them a useful tool for handling natural language and search queries.
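The "bidirectional" part of the name refers to self-attention: every word in a query draws context from the words both before and after it, rather than reading left to right only. Below is a minimal illustrative sketch of that idea in plain NumPy. It uses toy, untrained numbers; a real BERT uses learned query/key/value projection matrices, multiple attention heads, and many stacked layers.

```python
# Toy sketch of the bidirectional self-attention at the core of BERT.
# Illustrative only: a real model learns its weight matrices in pre-training.
import numpy as np

def self_attention(x):
    """Single-head self-attention over token embeddings x (n_tokens, dim).

    Every token attends to ALL tokens, to its left and to its right --
    the "bidirectional" context that BERT is named for.
    """
    d = x.shape[-1]
    # A real Transformer projects x into separate Q, K, V matrices;
    # here we use the raw embeddings for simplicity.
    scores = x @ x.T / np.sqrt(d)                    # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over all tokens
    return weights @ x                               # context-aware embeddings

# Four "tokens" with 8-dimensional toy embeddings.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
out = self_attention(tokens)
print(out.shape)  # (4, 8): each token now mixes in context from both sides
```

Because the attention weights span the whole sequence, a word like "to" in "travel from Brazil to the USA" can be weighted by the words on either side of it, which is the kind of query-understanding improvement Google describes.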
The BERT update also marks the first time Google has used its latest Tensor Processing Unit (TPU) chips to serve search results.
Looking to the future
Ideally, this means that Google Search can now better understand what you are looking for and return more relevant search results and featured snippets. The update began rolling out this week, so you may have already noticed some of its effects on search results.