Google Search may understand your semi-conversational, awkwardly phrased searches better now.
According to a blog post from the search giant, Google has implemented a neural network-based language processing technique to help Search better understand how the order and context of words shape a query's meaning.
Bidirectional Encoder Representations from Transformers (BERT) enables Search to understand the context of a word by looking at the words that come before and after it in a query. In other words, BERT helps Search recognize when prepositions like ‘for’ and ‘to’ are important to the meaning of a query.
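Google hasn't published the code behind this Search change, but pretrained BERT models are publicly available. The minimal sketch below, which assumes the Hugging Face `transformers` library and the open `bert-base-uncased` checkpoint rather than anything Google uses in production, illustrates the basic idea: the model reads a whole query at once, so a small word like ‘to’ changes the representation it produces.

```python
# Illustrative only: this is not Google's Search pipeline. It loads a publicly
# available pretrained BERT model and shows that reversing the direction of
# "to" in a query changes the embedding BERT produces for it.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(query: str) -> torch.Tensor:
    """Return a single vector representing the whole query."""
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool the per-token embeddings into one query vector.
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

a = embed("brazil traveller to usa need a visa")
b = embed("usa traveller to brazil need a visa")

# The two queries share almost all their words, but the similarity is not 1.0:
# BERT's context-sensitive encoding captures who is travelling where.
similarity = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(f"Cosine similarity: {similarity.item():.3f}")
```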
One example Google shared was the query: “Brazil travellers to USA need a visa.” Before BERT, searchers would get results for Americans looking to travel to Brazil. With BERT, however, Search returns results with information about how to get a U.S. visa as a Brazilian.
Another example would be the search: “Can you get medicine for someone pharmacy.” Pre-BERT, Google Search would show results about getting a prescription filled. Now, Search will put results about filling a prescription for someone else at the top.
Search still isn’t perfect, however, and Google admits that BERT has its flaws. For example, asking "What state is south of Nebraska" will surface results for a community called South Nebraska instead of for Kansas. Google says gaps in understanding like these motivate the Search team to keep improving.
Another limitation of BERT will be its availability. To start, only English-language searches made in the U.S. will go through BERT. Google plans to expand it to more regions and languages over time.
You can learn more about BERT here.