Google describes the future of its search engine

In the engine room that powers its dominant search service, Google recently activated a powerful new tool.

According to the search giant, the new technology – a large-scale AI model known as MUM – could one day transform internet search into a much more sophisticated service, acting as a virtual search assistant as it sifts the web to find solutions to complex questions.

But the company’s critics warn it carries a clear risk: it will accelerate a change that has already seen Google provide more direct answers to user queries, putting itself ahead of other websites to “internalize” search traffic and keep users locked inside a Google universe.

MUM – short for Multitask Unified Model – is the latest in a series of behind-the-scenes Google search engine upgrades that the company says have brought significant changes to the quality of its results.

These include the introduction a decade ago of a “knowledge graph” that defined the relationships between different concepts, bringing a degree of semantic understanding to search. More recently, Google has sought to apply the latest deep learning technology to improve search relevance with a tool called RankBrain.

“We think we’re on the next big step,” said Pandu Nayak, the Google researcher in charge of MUM.

Google gave a first glimpse of the new technology at its annual developer conference in May, although it said little about how the system might be used. In a recent interview, Nayak said that MUM may one day handle many of the “fuzzy information needs” that people have in their daily lives but have not yet formulated into specific questions they can search for.

Examples he gives are when parents wonder how to find a school that’s right for their child, or when people first feel the need to start a new fitness regimen. “They’re trying to figure out, what’s a good fitness routine – one that’s at my level?” he said.

Using search engines today, “you actually have to translate that into a series of questions that you ask Google to get the information you want,” Nayak said. In the future, he suggests, this cognitive load will be carried by the machine, which will take on what he calls “much more complex and perhaps more realistic user needs.”

He added that the applications of MUM will likely eventually extend far beyond search. “We see it as a kind of platform,” he said.

An illustration of how MUM can handle more fuzzy queries © Google

MUM is the latest example of an idea that has swept the field of natural language AI. It uses a technique called the transformer, which allows a machine to consider words in context rather than as isolated objects to be matched through massive statistical analysis – a breakthrough that has produced a leap forward in machine “understanding”.

The technique was first developed at Google in 2017, but its most dramatic demonstration came with last year’s GPT-3, a system developed by OpenAI that shocked many in the AI world with its ability to generate large blocks of coherent text.

Jordi Ribas, engineering and product manager for Microsoft’s Bing search engine, said it sparked a “race among all tech companies to come up with bigger models that better represent the language”.

When Microsoft unveiled its Turing language generation model early last year, it claimed it was the largest such system ever built. But GPT-3, unveiled months later, was ten times larger. Google did not release technical details for MUM, but said it is “1,000 times more powerful” than BERT, its first experimental model using transformers.

Even with this huge leap forward, Google faces a daunting challenge. Search companies have dreamed of answering complex questions for the past 15 years, but have found the problem much harder than expected, said Sridhar Ramaswamy, former head of Google’s advertising business and now chief executive of the search start-up Neeva.

“There’s so much variation in everything complicated that we do,” Ramaswamy said. “Trying to get the software to understand these variations and guide us has proven incredibly difficult in practice.”

The first uses of MUM involve behind-the-scenes search tasks such as ranking results, classifying information and extracting answers from text.

The difficulty of objectively measuring the quality of search results makes it hard to judge the impact of efforts like this, and many experts question whether earlier new search technologies have lived up to the hype. Greg Sterling, a veteran search analyst, said many search users will not have noticed much improvement, and that product searches in particular remain very frustrating.

The search companies, for their part, say that internal tests show users prefer results produced by the most advanced technologies. The ability to extract answers from text has already enabled Bing to offer direct answers to 20% of the queries it receives, according to Ribas.

For most people, the impact of Transformers will likely only be felt if the technology drives more visible change. For example, Google says MUM’s ability to understand both text and images – with video and audio to be added later – could lead to new ways to search across different types of media.


Handling the more “fuzzy” queries that Nayak has in mind would result in Google gleaning information from a number of different places on the web to present a much more precise answer to each particular query.

“This consolidates all activity on Google properties,” said Sara Watson, senior analyst at market research group Insider Intelligence. “Everything that appears on this first page [of search results] can be anything you want.” Such a system could provoke a backlash from web publishers, Watson added.

Google, already under scrutiny from regulators around the world, denies that it intends to use MUM to keep more web traffic for itself. “It won’t become a question-and-answer [system],” Nayak insisted. “The content available on the web is rich enough that giving short answers makes no sense.”

He also denied that distilling multiple search results into a single result reduces the amount of traffic Google sends to other websites.

“The better you get at understanding user intent and presenting users with the information they actually want, the more people come back to search,” he said. The effect will be to “make the cake bigger” for everyone.

Search advertising, the engine of Google’s business, could face similar issues. Reducing the number of searches needed to answer a user’s question could reduce the ad inventory that Google can sell. But, Watson said, “if the query can be more complex and targeted, so can the ad. That makes [ads] much higher value and potentially changes the pricing model.”

Google’s Top Search Advancements Over the Years


Universal Search – 2007

Google goes beyond showing ‘ten blue links’ to returning images and other results

Short text results start appearing in a box at the top of the results page, angering some publishers

Voice search – 2011

Users can talk to Google for the first time

Knowledge Graph – 2012

Google builds a network of connections between different ideas, producing direct factual answers to queries

RankBrain – 2015

Applies the latest advances in neural network AI to make search results more relevant

MUM – 2021

Brings a deeper level of understanding to many search tasks, promising useful answers to complex queries