
Medical Market Report


Google introduces a new way to search that combines images and text into one query

September 29, 2021 by David Barret Leave a Comment

Earlier this year, at Google’s I/O annual developer conference, the company introduced a new A.I. milestone called Multitask Unified Model, or MUM. This technology can simultaneously understand information across a wide range of formats, including text, images, and videos, and draw insights and connections between topics, concepts, and ideas. Today, Google announced one of the ways it’s planning to put MUM to work in its own products with an update to its Google Lens visual search.

Google Lens is the company’s image recognition technology, which lets you use your phone’s camera to perform a variety of tasks, like real-time translation, identifying plants and animals, copying and pasting text from photos, finding items similar to what’s in the camera’s viewfinder, getting help with math problems, and much more.

Soon, Google says it will leverage MUM’s capabilities to upgrade Google Lens with the ability to add text to visual searches in order to allow users to ask questions about what they see.

In practice, here’s how such a feature could work: you pull up a photo of a shirt you like in Google Search, tap the Lens icon, and ask Google to find the same pattern on a pair of socks. By typing something like “socks with this pattern,” you direct Google toward relevant results in a way that would be difficult with text input alone.

Image Credits: Google

This could be particularly useful for the type of queries that Google today struggles with — where there’s a visual component to what you’re looking for that is either hard to describe using words alone or that could be described in different ways. By combining the image and the words into one query, Google may have a better shot at delivering relevant search results.
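Google has not published MUM’s internals, but the general idea of folding an image and a caption into a single query can be sketched with embeddings: represent both the photo and the typed text as vectors, blend them into one query vector, and rank candidate results by similarity. The function names, the blending weight, and the toy 3-dimensional vectors below are illustrative assumptions, not Google’s actual pipeline.

```python
import numpy as np

def combine_query(image_vec: np.ndarray, text_vec: np.ndarray,
                  text_weight: float = 0.5) -> np.ndarray:
    """Blend an image embedding and a text embedding into one query vector."""
    q = (1 - text_weight) * image_vec + text_weight * text_vec
    return q / np.linalg.norm(q)  # normalize so dot product = cosine similarity

def rank(query: np.ndarray, candidates: dict[str, np.ndarray]) -> list[str]:
    """Order candidate items by cosine similarity to the combined query."""
    scored = {name: float(query @ (vec / np.linalg.norm(vec)))
              for name, vec in candidates.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Toy example: axis 0 = "floral pattern", axis 1 = "is a sock".
query = combine_query(
    image_vec=np.array([1.0, 0.0, 0.0]),   # photo of a floral-patterned shirt
    text_vec=np.array([0.0, 1.0, 0.0]),    # typed text: "socks"
)
candidates = {
    "floral socks": np.array([0.7, 0.7, 0.0]),
    "floral shirt": np.array([1.0, 0.0, 0.0]),
    "plain socks":  np.array([0.0, 1.0, 0.0]),
}
results = rank(query, candidates)  # "floral socks" ranks first
```

Neither the image alone (which would surface the shirt) nor the text alone (which would surface any socks) retrieves the floral socks, but the blended query does, which is the intuition behind the feature described above.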

In another example, suppose a part of your bike has broken and you need to search Google for repair tips, but you don’t know what the part is called. Instead of digging through repair manuals, you could point Google Lens at the broken part and type “how to fix.” This could connect you directly with the exact moment in a video that shows the repair.

Image Credits: Google

The company sees these A.I.-driven initiatives as ways to make its products “more helpful” to end-users by enabling new ways to search. By making use of the phone’s camera as part of Search, Google is aiming to stay relevant in a market where many of its core use cases are starting to shift to other properties. For instance, many shopping searches today now start directly on Amazon. And when iPhone users need to do something specific on their phone, they often just turn to Siri, Spotlight, the App Store, or a native app to get help. And Apple is developing its own alternative to Google Search as well. You could see the beginnings of this work in the iOS 15 update to Spotlight search, which now directly connects users to the information they need without the need for a Google query.

Google is also putting MUM to work in other ways across Google Search and video search, the company announced at its Search On live event today.

Google says the Lens update will roll out in the months ahead, noting that it still needs to go through “rigorous testing and evaluation,” part of the process for every new A.I. model it deploys.




