Last month, Google announced big changes coming to Google Images on the company’s 20th anniversary, and it is now rolling out a key update.
Cathy Edwards, Director of Engineering for Google Images, announced that Google is moving to provide more immersive visual content, accelerated by the launch of the AMP project (and AMP Stories) and visual previews on featured videos in Search, alongside the integration of AI-powered Lens into Google Images.
The integration of Lens in Search will only be available on mobile and is part of a broader set of updates to Google Images. Over the last year, Google has shaken up the Images algorithm to “rank results that have both great images and great content” and has made the authority of the host web page an even more important signal in image ranking.
Google Lens is already a feature in Google Assistant that lets you explore and learn about the world: point your Android phone at almost anything, such as a building, an animal, or a plant, and instantly receive more information through the feature’s object-recognition capabilities.
Google Lens in Mobile Image Search: How it Works
In a Google blog post, Assaf Broitman, the Product Manager for Google Images, said “We launched Lens to help you do more with what you see. People already love using it in their camera and on their photos–to find items in an outfit they like, learn more about landmarks, and identify that cute dog in the park. Lens is a natural fit for Google Images.”
Essentially, Google Lens in mobile image search allows users to instantly see more information, such as product details, price, and where to buy an item in the image. For example, if you search for “modern living room furniture” on Google Images, the top results for your query will display dots on the image. You can also draw a circle around part of an image to get more information, with results pulled from Google Search.
The dots on a selected image in the search results will be clickable and will most likely give preference to products listed in Google Shopping. Google says Lens in Google Images will recognize the same objects it already does in Google Assistant; the one significant change is that you can now buy by clicking through on the image.
Google Lens in image search is currently available only in the US, for mobile image searches in English. The feature will soon roll out to other countries and languages.
Why Google Lens is Integrating into Image Search
Image search first became a necessary next step for Google on February 24th, 2000. It was the day after the Grammy Awards, and users were frantically searching for another look at the dazzling green dress Jennifer Lopez had worn. The majority of those users were looking for images of the dress, not an article they could read about it.
Mobile has since become integral to Google’s approach to search, prioritizing convenient, appealing, and informational content that can be easily accessed on mobile devices. That’s why Google is integrating Lens into image search: to deliver a seamless buying experience, taking users from discovery to checkout in seconds rather than minutes.
Want to boost your company’s online visibility and generate more qualified leads? Then call TechWyse Internet Marketing today at 866.288.60 or contact us here to learn more.
The post Why Google is Integrating Lens with Mobile Image Search appeared first on The TechWyse ‘Rise to the Top’ Internet Marketing Blog.