Google held its second annual Search On event on Wednesday, announcing a host of new features and enhancements for Google Lens and Search. With these improvements, Google is giving Search and Lens even more context to help users find what they’re looking for and help them stay informed about their results.
Some of these features are available today, while many will roll out in the coming weeks or months.
Search’s bigger, bolder redesign
The Google Search page is being redesigned to make it easier to find exactly what you’re looking for. The redesign gives users a more visual experience, with images and videos shown straight from the results page. Searches can also be refined or broadened to help users narrow in on what they want.
In the coming months, Google will make it easy to jump into a topic with a new “Things to know” section powered by its Multitask Unified Model (MUM).
Google is also tackling misinformation with improvements to “About this result,” which will provide additional information about sources, including what others have said about them.
Additionally, the “About the topic” section will surface broader coverage from other sources, giving users more context around the subject of a result.
People don’t just come to Google looking for quick facts. They often really want to explore the information that’s out there, and learn about where it’s coming from.
These new features should roll out in the U.S. in the coming weeks and will soon expand to other countries.
Also, in the coming weeks, Google is bringing additional insights to videos in search results, making it even easier to fall down the rabbit hole. When diving into a YouTube video from Search, users will notice a section underneath with related topics based on the video being watched.
Google Lens in more places, with more context
Google Lens is also getting a few upgrades. Soon, users will be able to use Lens straight within the Google app to identify the individual products in an image and search and shop for each one, such as a lamp or a shirt. This experience will initially roll out to iOS users, but Google says Lens will also come to the Chrome browser on desktop, allowing users to view results within the same tab.
Google is also applying its MUM in Lens to give it a more contextual understanding. For example, if you like a pattern on a shirt, you can ask Lens to find socks with the same pattern. Or you could even ask Lens how to fix a particular part of a bike by snapping a picture of it, even if you don’t know what that part is called.
This new Lens capability should also arrive in the coming months as Google tests and evaluates its usefulness and impact.