Google recently hosted its Search On live event, where it announced news about the company’s main product – namely, the search engine.
It turns out that Google has big plans for the further development of its reliable search tool, and the next evolutionary step is to make visual search more advanced using artificial intelligence.
Ask questions about your photos
In the coming months, the company will let users search by asking questions about what’s in their images.
For example, if you take a picture of a dress in a style you like, you can search for other types of clothing in the same style. This way, you don’t have to describe what you see in words, which Google points out can be quite demanding.
As you may know, visual searches are already possible with the Google Lens tool, which lets you search Google using images taken with your phone’s camera. What’s new, however, is the ability to ask questions about what’s shown in the photo.
The new function is integrated with Lens and is activated by tapping the Lens icon while the image is displayed on the screen.
Another example of the feature in use is repairs. Instead of searching for the name of the broken part, you can simply take a photo of it and ask Google how to fix it, which takes you directly to relevant DIY videos and the like.
“The Multitask Unified Model”
The new technology is based on what Google calls the Multitask Unified Model (MUM), which the company presented at its I/O conference earlier this year. MUM is touted as an important step in using artificial intelligence to understand information far more deeply than before and to perform more complex tasks with Google’s search technology.
MUM technology will also be integrated directly into the search engine through a new feature Google calls “Things to Know” – a kind of menu that gives users different types of additional information about what they are searching for, based on how people explore topics. The intention is to surface the most relevant and useful information first.
This feature will also arrive in the coming months, though no specific date has been given.
Finally, MUM will also have an impact on videos. Thanks to this technology, Google will be able to identify topics related to what is shown in a video and provide the user with relevant links. According to Google, MUM will even be able to surface related topics that are not explicitly mentioned in the video, based on a deeper understanding of its content.
More information on Google’s news can be found on the company’s official blog.