Google is making it even easier to search the world around you. The tech giant has rolled out new updates for Google Lens, expanding its capabilities across devices and apps. iPhone users are getting a more seamless visual search experience, and AI Overviews are now making Lens smarter than ever.
Google Lens Arrives in More Places on iPhone
For iPhone users, visual searches just got a major upgrade. Google has added new Lens features to both Chrome and the Google app, making it possible to search for objects, text, or images on your screen without extra steps.
With the latest update, you can highlight, draw around, or tap on anything you see on your screen and let Google Lens do the work. Whether you’re shopping, reading an article, or watching a video, the feature eliminates the need to take a screenshot or open another tab to find more information.
Here’s how you can access it:
- In Chrome: Tap the three-dot menu and select “Search Screen with Google Lens.” In the coming months, Google will add a dedicated Lens icon in the address bar, similar to how it works on desktop.
- In the Google app: Open the three-dot menu, choose “Search this Screen,” and then pick what you want to search.
These updates bring Lens closer to being a built-in, one-step visual search tool rather than just a separate app feature.
AI Overviews Bring More Context to Lens Results
Google isn’t just making Lens more accessible, it’s also making it smarter. The company is upgrading Lens with advanced AI models that can analyze images falling outside its existing database of billions of recognizable objects.
Previously, Lens relied on recognizing known items such as plants, landmarks, and products; now it can also make sense of unique or unfamiliar images.
For example, if you snap a photo of an odd texture on a car hood, Lens will not only identify it but also generate an AI Overview—a summary with useful information and relevant web links. This means users no longer have to type out a specific question to get meaningful results.
- AI Overviews will help Lens provide detailed explanations instead of just linking to existing webpages.
- The feature is designed to bridge knowledge gaps by analyzing visual inputs that don’t have straightforward keyword matches.
- It enhances the search experience by reducing the need for manual input, making results faster and more intuitive.
This upgrade signals Google’s broader push toward making AI-powered search more context-aware and user-friendly.
When and Where Are These Features Rolling Out?
Google is launching these updates globally, but availability varies by feature:
| Feature | Availability |
|---|---|
| Lens in Chrome & Google app on iPhone | Rolling out worldwide this week |
| AI Overviews in Lens | English-language users in supported countries, first in the Google app for Android & iOS; Chrome support on desktop and mobile coming soon |
As Google continues refining its AI-powered search tools, these updates represent another step toward a more seamless and intuitive search experience. Lens is no longer just a tool to identify common objects—it’s evolving into a full-fledged AI assistant capable of understanding and explaining complex visuals.