Google Launches New AI-Powered Feature for Lens App

Google has launched a new AI-powered feature for its Lens app that lets users identify objects and text in the real world. The feature, called Live View, uses augmented reality to overlay information about what the camera sees directly on the user’s screen.

To use Live View, users open the Lens app and point their camera at an object or piece of text. The app identifies it and overlays relevant information on screen. For example, pointing the camera at a flower brings up details such as its name, type, and common uses.

Live View is currently available in English and is compatible with select Android devices. Google plans to make Live View available in more languages and on more devices in the future.

The launch of Live View is a significant development for Google Lens. The app has been available for several years, but its capabilities have been limited. With Live View, Google Lens becomes a more versatile tool for tasks such as shopping, learning about unfamiliar objects, and finding directions.

Google is not the only company developing augmented reality technology; Apple, Microsoft, and other tech giants are also working on AR projects. However, Google is one of the leaders in the field, and its continued investment in Lens signals the company’s commitment to AR.

It remains to be seen how widely adopted Live View will be. However, the feature has the potential to make Google Lens a more popular app and to help drive the adoption of AR technology.
