
Google Search Getting More ‘Conversational’ with BERT

By Greg Sterling, Uberall’s VP of Market Insights

Google is making search more conversational. The company announced on Friday that it’s using a new technique to understand and rank roughly 10% of search queries. It’s called BERT, short for “Bidirectional Encoder Representations from Transformers.”

BERT is “a neural network-based technique for natural language processing (NLP)” that will help Google better understand the relationships between words for a more accurate sense of context and meaning. Google open-sourced BERT in 2018; it helps machines understand language more the way humans do.
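
For the technically curious: because BERT is open source, anyone can experiment with it. Here’s a minimal Python sketch, assuming the Hugging Face “transformers” library and the public “bert-base-uncased” checkpoint (illustrative only, not Google’s production search stack), showing how a pretrained BERT turns every word of a query into a vector shaped by its full surrounding context.

```python
# Minimal sketch: contextual word vectors from the open-sourced BERT.
# Assumes the Hugging Face "transformers" library and the public
# "bert-base-uncased" checkpoint; illustrative only, not Google's
# production search stack.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

query = "2019 brazil traveler to usa need a visa"
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One vector per token; each vector reflects the words on *both* sides
# of that token, which is what the "bidirectional" in BERT means.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, vector in zip(tokens, outputs.last_hidden_state[0]):
    print(token, tuple(vector.shape))
```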

For example, BERT can understand prepositions and how they change the meaning of sentences. In its blog post, the company uses the example of the query: “2019 brazil traveler to USA need a visa.”


[Image: Google search results for the query, before BERT (left) and with BERT (right)]

In the past, Google might not have understood the real intent of this query and would have shown results about travel from the U.S. to Brazil (above left). The preposition “to” is critical to the intent of this query. BERT now gets it right (above right) and understands that this is someone seeking information on travel from Brazil to the U.S.
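
To make that concrete in code: the sketch below (again assuming the public “bert-base-uncased” checkpoint via Hugging Face “transformers,” not Google’s ranking model) shows that BERT assigns the word “to” a different vector depending on the words around it, which is precisely what lets it tell Brazil-to-U.S. apart from U.S.-to-Brazil.

```python
# Sketch: the same preposition "to" gets a different contextual vector
# in different queries. Uses the public "bert-base-uncased" checkpoint;
# illustrative only, not Google's ranking model.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    # Contextual embedding of the first occurrence of `word`.
    inputs = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    position = tokens.index(word)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden[position]

v_brazil_to_usa = vector_for("brazil traveler to usa need a visa", "to")
v_usa_to_brazil = vector_for("usa traveler to brazil need a visa", "to")

# The vectors differ because BERT reads both directions of context;
# identical vectors would mean the model couldn't tell the queries apart.
similarity = torch.cosine_similarity(v_brazil_to_usa, v_usa_to_brazil, dim=0)
print(f"cosine similarity of the two 'to' vectors: {similarity.item():.3f}")
```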

Another example Google provides is a query about “math practice books for adults.” With BERT, Google can better understand the word “adult” in the context of this request and deliver a more accurate result.

[Image: search results for “math practice books for adults,” before and with BERT]

Google says BERT will be used for “longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning.” Indeed, more queries are becoming “conversational” as voice input on smartphones and virtual assistants gains adoption. In 2016 Google said, “in the Google app, 20% of searches are now by voice.” Since then, usage of voice and virtual assistants has grown significantly, although Google hasn’t officially updated that number.

Google calls BERT “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.” As noted above, BERT will apply to roughly one in 10 search queries and will affect rankings and featured snippets alike.

There may be no way to optimize for BERT directly, although some of the content techniques local marketers have been using to optimize for voice search (FAQs and pages that answer common questions) may apply here as well. It’s also possible that BERT could have a disproportionate impact on local results, given that voice queries are predominantly mobile and thus often carry a local/near-me bias.

BERT is initially rolling out for U.S. English but will be extended to other languages and regions over time.

About Greg Sterling

Greg Sterling is VP of Market Insights for Uberall. Previously, he was VP of Strategy for LSA; he is also a contributing editor at Search Engine Land. For the past 20 years he has conducted research and tracked the impact of digital media on offline consumer behavior.