Google Claims AI Will Make Web Search 10% Better

Google has updated its search algorithms to tap into an AI language model that is better at understanding netizens' queries than previous systems.

Pandu Nayak, a Google fellow and vice president of search, announced that Google has rolled out BERT, short for Bidirectional Encoder Representations from Transformers, for its most fundamental product: Google Search.

BERT is Google's neural network-based technique for natural language processing (NLP) pre-training. Google said BERT helps Search better understand the nuance and context of words in queries and match them with more relevant results.

To do this, researchers at Google AI built a neural network known as a Transformer. The architecture is well suited to processing sequences of data, making it a natural fit for language. To understand a sentence, a model must consider all the words and the order in which they appear. Previous models read words in one direction, left to right, whereas BERT is bidirectional: it uses the words both before and after a given word to work out what it means.
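To make the bidirectional idea concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased checkpoint. This illustrates the technique itself, not the system Google Search actually runs, and the example sentence is invented for illustration.

```python
# A minimal sketch of BERT's bidirectional masked-language modelling,
# using the open-source Hugging Face `transformers` library.
# Illustrative only: this is NOT the pipeline Google Search runs.
from transformers import pipeline

# The fill-mask pipeline asks BERT to predict the hidden word. Unlike
# a left-to-right model, BERT uses context from BOTH sides of [MASK].
fill = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill("The traveler needs a [MASK] to enter the country."):
    print(f"{prediction['token_str']:>10}  score={prediction['score']:.3f}")
```

Because the model sees "traveler" on the left and "enter the country" on the right, words like "visa" or "passport" score highly; a model reading only left to right would have far less to go on at that point in the sentence.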

The example below shows what the previous Google Search and the new BERT-powered search look like for the query: “2019 brazil traveler to usa need a visa.”

Note: This example is just for illustrative purposes and may not work in the live search results. Left: the current search result for the query still interprets the sentence as a US traveler going to Brazil. Right: search results using BERT correctly identify that the search is about a Brazilian traveler going to the US. Image credit: Google

In this example, Google said, with a search for “2019 brazil traveler to usa need a visa,” the word “to” and its relationship to the other words in the query are crucial to the meaning. BERT now grasps the significance of the word “to” in this search. Previously, Google missed the importance of this connection and returned results about US citizens traveling to Brazil.
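As a rough illustration of what that means inside the model, the sketch below uses the public bert-base-uncased checkpoint to show that BERT assigns the same word, here “to”, different vectors depending on its neighbours; a static word embedding could not make that distinction. The sentences and the helper function are made up for this example and are not from Google.

```python
# Hypothetical sketch: the same word gets a different vector depending
# on its context. Uses the open-source `transformers` library, not
# Google's production search stack.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def vector_for(word: str, sentence: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` inside `sentence`."""
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    tokens = tok.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

directional = vector_for("to", "2019 brazil traveler to usa need a visa")
infinitive = vector_for("to", "i need to renew my visa this year")

# A static embedding would give "to" one fixed vector; BERT's vectors
# differ because the surrounding words differ.
print(torch.cosine_similarity(directional, infinitive, dim=0).item())
```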

Another example is Google showing a more relevant featured snippet for the query “Parking on a hill with no curb”. In the past, a query like this would confuse Google’s systems.

Note: This example is just for illustrative purposes and may not work in the live search results. Image credit: Google

Nayak claimed BERT would improve 10 per cent of all Google searches. The most substantial changes will be to longer, more conversational queries, particularly sentences in which prepositions such as “for” or “to” matter to the meaning.

For now, BERT works best on queries made in English. Google said the technique is also used in two dozen countries for other languages, such as Hindi, Korean and Portuguese, to generate “featured snippets” of text.