2 min read | Saved February 14, 2026
Do you care about this?
Tim Cook recently praised visual intelligence, an AI feature on the iPhone 16 that enhances camera functionality. This feature allows users to translate text, save events, and access information from screenshots. Rumors suggest it will expand to new devices like AirPods Pro 3 and Apple Glasses later this year.
If you do, here's more
Tim Cook highlighted visual intelligence as one of Apple’s standout AI features during a recent earnings call. The capability, first introduced with the iPhone 16, lets users interact with the camera in new ways: translating text, adding calendar events from flyers, and pulling up reviews via the Camera Control button. With the release of iOS 26, visual intelligence has expanded beyond the camera, letting users apply the same AI features to screenshots.
Rumors suggest that Apple aims to integrate visual intelligence into new products like the upcoming AirPods Pro 3 and Apple Glasses. Both devices are expected to include built-in cameras that will leverage this AI feature, turning everyday interactions into more informative experiences. Mark Gurman noted that Apple plans to embed visual intelligence at the core of these devices, enhancing how users engage with their environment and access information.
Cook's emphasis on visual intelligence likely reflects Apple’s strategy to make this feature a key selling point as it expands to additional platforms. The potential applications for visual intelligence in wearables could significantly enhance user experience, making it a focal point for Apple’s future innovations.