Engadget

How to use Visual Intelligence, Apple's take on Google Lens

The recent iOS 18.2 update brings Apple Intelligence features, including Visual Intelligence, a tool that uses the camera system and AI to analyze images in real time and surface useful information. Visual Intelligence is currently available only on the iPhone 16 Pro and Pro Max, though Apple may eventually bring it to older models. To access it, users must join a waitlist by opening Settings, tapping "Apple Intelligence & Siri" and then "Join Waitlist."

Once approved, users can launch Visual Intelligence by long-pressing the Camera Control button. The tool can analyze text and then translate it, read it aloud or summarize it; it can also recognize contact information and suggest relevant actions to take.

Pointed at a storefront, Visual Intelligence can pull up details about a business, including its hours of operation, menu and reviews, though this feature is currently available only to US customers.

The tool also integrates with ChatGPT and Google Image Search, letting users ask questions about objects or find similar images online. To use ChatGPT, point the camera at an object, activate Visual Intelligence, tap the ChatGPT icon and ask a question, with follow-ups as needed. Choosing the Google Image Search option instead shows similar photos pulled from the web, which can be handy for finding deals.

Overall, Visual Intelligence is a powerful tool that can help users learn more about the world around them and make their lives easier.