Google launched two new search features on Wednesday that leverage online images or photos you take in a store, part of the company's effort to extend search far beyond the text you type into a search field.
One feature announced at the Google I/O conference, Scene Explorer, lets you scan your phone's camera over a product shelf in a supermarket or pharmacy to recognize the products in view. Google then overlays product information and ratings onscreen so you can find nut-free snacks or fragrance-free lotion, said search chief Prabhakar Raghavan. It is an extension of the Google Lens application.
“Scene Explorer is a powerful breakthrough in our devices’ ability to understand the world the way we do, to see relevant information overlaid in the context of the world all around us,” Raghavan said. “This is like having a supercharged Control-F [find shortcut] for the world around you.”
Google said in a blog post that it plans to add Scene Explorer to its search tools, but it didn’t say when that will happen.
Another feature extends Google’s multimodal search, which combines text and images into a single search query. Now, by adding the words “near me,” you can tailor a search to nearby results. For example, you can combine a photo of an unfamiliar dish with “near me” to find a nearby restaurant that serves it, Raghavan said. The feature will arrive later this year for English speakers.
Search, the first service Google offered and the one that propelled it into today’s multi-product juggernaut, remains a core part of the company’s mission to make information useful to people around the world, and search ads are still Google’s main source of revenue.
At Google I/O, Google also said it had adopted a new 10-tone Monk Skin Tone scale to improve the diversity of its AI training data, search results and other operations.