In the engine room that powers its dominant search service, Google recently activated a powerful new tool.
According to the search giant, the new technology – a large-scale AI model known as MUM – could one day transform internet search into a much more sophisticated service, acting as a virtual assistant that scours the web for solutions to complex questions.
But the company’s critics warn that this carries a clear risk: it will accelerate a change that has already seen Google provide more direct responses to user queries, putting its own results ahead of other websites to “internalise” search traffic and keep internet users locked in a Google universe.
MUM – short for Multitask Unified Model – is the latest in a series of behind-the-scenes upgrades to the Google search engine that the company says have brought about significant changes in the quality of its results.
These include the introduction ten years ago of a “knowledge graph”, which defined the relationships between different concepts, bringing a degree of semantic understanding to search. More recently, Google has sought to apply the latest deep learning technology to improve search relevance with a tool called RankBrain.
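The core idea behind a knowledge graph can be sketched as a store of facts expressed as (subject, relation, object) triples, queried by pattern matching. The entities and relations below are invented for illustration and are not Google’s actual Knowledge Graph schema:

```python
# A minimal, illustrative knowledge-graph sketch: facts stored as
# (subject, relation, object) triples, queried by pattern matching.
# The entities and relations are hypothetical examples.

TRIPLES = [
    ("Mount Fuji", "located_in", "Japan"),
    ("Mount Fuji", "instance_of", "volcano"),
    ("Japan", "capital", "Tokyo"),
]

def query(subject=None, relation=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        (s, r, o)
        for (s, r, o) in TRIPLES
        if (subject is None or s == subject)
        and (relation is None or r == relation)
        and (obj is None or o == obj)
    ]

# A direct factual answer: where is Mount Fuji?
print(query(subject="Mount Fuji", relation="located_in"))
# [('Mount Fuji', 'located_in', 'Japan')]
```

Linking concepts this way is what lets a search engine answer a factual query directly, rather than merely matching keywords in documents.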
“We believe we are at the next big step,” said Pandu Nayak, the Google researcher in charge of MUM.
Google gave the first glimpse of the new technology at its annual developer conference in May, although it said little about how the system could be used. In an interview, Nayak said that MUM may one day handle many of the “fuzzy information needs” that people have in their day-to-day lives but have not yet framed into specific questions they can search for.
Examples he gives are when parents are wondering how to find a school that is right for their child, or when people first feel the need to start a new fitness program. “They’re trying to figure out, what’s a good fitness routine – one that’s at my level?” he said.
Using today’s search engines, “you really have to convert that into a series of questions that you ask Google to get the information you want,” Nayak said. In the future, he suggests, this cognitive load will be borne by the machine, which will take on what he calls “much more complex and perhaps more realistic user needs.”
He added that eventually, MUM’s applications should extend far beyond search. “We see it as a kind of platform,” he said.
MUM is the latest example of an idea that has swept across the field of natural language AI. It uses a technique called the transformer, which allows a machine to look at words in context rather than as isolated objects to be matched through massive statistical analysis – a breakthrough that has produced a leap in machine “understanding”.
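The contextual mechanism at the heart of the transformer is attention: each word’s representation is rebuilt as a weighted mix of every word in the sentence, so meaning depends on surroundings. A toy sketch of scaled dot-product attention (the two-dimensional vectors are made up for illustration; real models use learned, high-dimensional embeddings):

```python
# Toy scaled dot-product attention: each output vector is a weighted
# average of all input vectors, weighted by dot-product similarity.
# Vectors here are tiny made-up examples, not real word embeddings.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(vectors):
    """Return context-aware vectors: each output mixes all inputs."""
    d = len(vectors[0])
    out = []
    for q in vectors:
        # Similarity of this word to every word, scaled by sqrt(dim).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        weights = softmax(scores)  # weights sum to 1
        # Blend all input vectors according to the weights.
        mixed = [sum(w * v[i] for w, v in zip(weights, vectors))
                 for i in range(d)]
        out.append(mixed)
    return out

# Two-"word" sentence: each output now blends in the other word.
words = [[1.0, 0.0], [0.0, 1.0]]
print(attention(words))
```

Because the attention weights are computed from the words themselves, the same word can end up with a different representation in different sentences, which is what distinguishes this approach from matching isolated terms.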
The technique was first developed at Google in 2017, but its most spectacular demonstration came with last year’s GPT-3, a system developed by OpenAI that shocked many in the AI world with its ability to generate large blocks of coherent text.
Jordi Ribas, head of engineering and products for Microsoft’s Bing search engine, said it had sparked a “race among all the big tech companies to come out with bigger models that better represent language”.
When Microsoft unveiled its Turing language generation model early last year, it claimed it was the largest such system ever built. But GPT-3, unveiled months later, was ten times the size. Google hasn’t released technical details for MUM, but said it was “1,000 times more powerful” than BERT, its first experimental model using transformers.
However, even with this huge leap forward, Google faces a daunting challenge. Search companies have dreamed of answering complex questions for the past 15 years, but have found the problem much harder than expected, said Sridhar Ramaswamy, former head of Google’s advertising business and now chief executive of search start-up Neeva.
“There is so much variation in anything complicated that we do,” said Ramaswamy. “Trying to get the software to understand these variations and guide us has proven incredibly difficult in practice.”
The first uses of MUM involve behind-the-scenes search tasks such as ranking results, classifying information or extracting answers from text.
The difficulty of objectively measuring the quality of search results makes it hard to judge the impact of efforts like this, and many experts wonder whether other new search technologies have lived up to the hype. Greg Sterling, a seasoned search analyst, said many search users won’t notice much improvement, and that searching for products in particular remains very frustrating.
Search companies, for their part, say internal testing shows that users prefer the results produced by the most advanced technology. The ability to extract answers from text has already enabled Bing to offer direct responses to 20% of the queries it receives, according to Ribas.
For most people, the impact of transformers is only likely to be felt if the technology drives more visible changes. For example, Google says MUM’s ability to understand both text and images – with video and audio to be added later – could lead to new ways of searching across different types of media.
Handling the more “fuzzy” queries that Nayak has in mind would see Google glean information from a number of different places on the web to present a much more precise answer to each very particular query.
“This consolidates all activity on Google properties,” said Sara Watson, senior analyst at market research group Insider Intelligence. “Everything that appears on that first page [of search results] can be whatever it wants.” Such a system could cause a backlash from web publishers, Watson added.
Google, already under scrutiny from regulators around the world, denies that it intends to use MUM to keep more web traffic for itself. “This is not going to become a question answering [system],” Nayak insisted. “The content available on the web is rich enough that giving short answers doesn’t make sense.”
He also denied that distilling the results of multiple searches into one result would reduce the amount of traffic Google sends to other websites.
“The better you understand user intent and present users with the information they really want, the more people come back to search,” he said. The effect, he argued, will be to make the pie bigger for everyone.
Search advertising, the lifeblood of Google’s business, could face similar questions. Reducing the number of searches required to answer a user’s question could reduce the ad inventory that Google can sell. But, said Watson, “if the query can be more complex and focused, so can the ad. That makes [ads] much higher value and potentially changes the pricing model.”
Google’s main search advances over the years
Universal search – 2007
Google goes beyond displaying “ten blue links” to return images and other results
Featured snippets – 2009
Short text results start to appear in a box at the top of the results page, angering some publishers
Voice search – 2011
Users can talk to Google for the first time
Knowledge Graph – 2012
Google creates a network of connections between different ideas, producing direct factual answers to queries
RankBrain – 2015
Applies the latest advances in neural network AI to make search results more relevant
MUM – 2021
Brings a deeper level of understanding to many search tasks, promising useful answers to complex queries