Apple is developing Apple Watch models with built-in cameras that will utilize artificial intelligence to provide contextual information about users' surroundings, according to Bloomberg's Mark Gurman.
In his latest Power On newsletter, Gurman revealed that Apple plans to release camera-equipped Apple Watches by 2027, positioning the tech giant to compete in the growing AI wearables market currently dominated by companies like Meta with its smart glasses.
Standard Apple Watch and Apple Watch Ultra models to use cameras differently
The cameras will be implemented differently across Apple's watch lineup. Standard Series models will feature cameras embedded within the display, similar to the front-facing camera on iPhones, while Apple Watch Ultra models will have cameras positioned on the side near the Digital Crown and button, Gurman reported.
These new watches will expand Apple's Visual Intelligence feature beyond smartphones. Visual Intelligence, which debuted with the iPhone 16, allows users to analyze objects and text using AI tools from ChatGPT and Google Search. The feature will reach iPhone 15 Pro models next month with iOS 18.4.
Apple Watch cameras may still not support FaceTime
Gurman's report suggests the cameras are not intended for FaceTime calls but rather for practical AI features. For example, users walking past a restaurant could point their watch at it to instantly receive information such as business hours or ratings.
Mike Rockwell, who previously led development of Apple's Vision Pro headset, will play a key role in bringing AI features to Apple's wearable devices while continuing to work on visionOS.
According to the Bloomberg report, Apple's strategy includes transitioning Visual Intelligence from relying on third-party AI models to using its own in-house technology by the time these new wearables launch.
Apple is also developing camera-equipped AirPods with similar AI capabilities, further expanding its AI wearables ecosystem, Gurman noted in his newsletter.