Google offers a new example of how its A.I. research is improving search results


Google senior fellow Jeff Dean speaks at a 2017 event in China.

Source: Chris Wong | Google

Google says it’s reached a new milestone in understanding search queries.

The company says its search engine can now consider the full context of a word in a query by assessing the words that come before and after it. The update is the result of a new artificial intelligence training system it calls BERT, which stands for Bidirectional Encoder Representations from Transformers.
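The "bidirectional" part of the name refers to how BERT is trained: words in a sentence are masked out, and the model learns to predict them from the context on both sides. Here is a minimal sketch of that idea using the open-source Hugging Face transformers library and a publicly released BERT model, not Google's production search system:

```python
# Sketch of BERT's masked-word training objective, using the open-source
# Hugging Face "transformers" library (not Google's internal stack).
from transformers import pipeline

# The "fill-mask" pipeline loads a pretrained BERT and predicts the
# hidden token from the words on BOTH sides of [MASK].
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for guess in fill_mask("a traveler needs a [MASK] to enter the country."):
    print(guess["token_str"], round(guess["score"], 3))
```

Because the model sees the whole sentence at once, a word like "visa" can rank highly here only by combining clues from both directions.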

The announcement comes as the company pushes toward more predictive search results and tries to make its products more conversational and useful, changes that would keep people engaged longer.

Although search improvements may seem dull compared with more experimental AI projects like self-driving cars, search is still Google’s core business. The company holds more than 90% share in many markets, and search is a critical driver of the advertising that makes up more than 80% of Google parent company Alphabet’s total revenue. So even a seemingly incremental improvement in search can lead to immediate, material increases in user engagement and revenue.

BERT models can consider the full context of a word by looking at the words that come before and after it, which can help the company understand the “intent” of a search, Google Senior Fellow and SVP Jeff Dean said at a press event on Thursday.

Dean gave an example of a search for “2019 brazil traveler to usa need a visa.” In that case, the word “to” and its relationship to the surrounding words are key to understanding the full meaning: the query is about a Brazilian traveling to the U.S., not the other way around. People naturally understand that context, but it has taken a long time for software to develop the same kind of understanding.
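To make that concrete, here is a minimal sketch, again using the open-source Hugging Face library rather than Google's production system, of how the same word “to” gets a different vector depending on its neighbors. The two query strings are Dean's example and its reversal:

```python
# Compare BERT's contextual vectors for the word "to" in two queries
# that differ only in travel direction. Uses the open-source Hugging
# Face "transformers" library, not Google's production search system.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return outputs.last_hidden_state[0, tokens.index(word)]

a = vector_for("2019 brazil traveler to usa need a visa", "to")
b = vector_for("2019 usa traveler to brazil need a visa", "to")

# The similarity is below 1.0 because the vectors encode the words on
# both sides of "to", so swapping the direction changes the embedding.
print(torch.cosine_similarity(a, b, dim=0).item())
```

In an older, non-contextual word-embedding model, “to” would get the same vector in both queries; the difference here is what lets a BERT-style model tell the two directions of travel apart.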

The company says BERT will affect one in 10 English-language searches in the U.S. for now, and will eventually extend to more languages and locations.

“We’re far from perfecting language, but this is a significant step,” said Dean. “We’re generally looking at a lot of places we could apply it,” he added, referring to BERT. “Understanding language is core to a lot of Google products such as Gmail.”

BERT also marks the first time Google is driving search results with its own Tensor Processing Units (TPUs), specialized chips created for AI applications, the company said. TPUs typically power AI workloads such as recognizing words in audio recordings, spotting objects in photos and videos, and detecting underlying emotions in written text. Google also offers access to the chips to third-party developers as a cloud service.
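As an illustration of that cloud service, here is a minimal sketch, assuming TensorFlow 2.x, of how a third-party developer might point a model at a Cloud TPU; the TPU worker address below is a hypothetical placeholder, and this says nothing about how Google serves search internally:

```python
# Hypothetical sketch of targeting a Cloud TPU from TensorFlow 2.x.
# The TPU address below is a placeholder, not a real endpoint.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu="grpc://10.240.1.2:8470")  # hypothetical TPU worker address
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Any model built inside this scope has its variables and math placed
# on the TPU cores rather than the local CPU or GPU.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(2),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```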

Alphabet reports Q3 2019 earnings on Monday. 
