Apple bets big on visual AI: Cameras coming to AirPods and beyond

Despite a rocky start with Apple Intelligence, Apple is reportedly shifting its artificial intelligence focus toward "Visual Intelligence," planning to integrate cameras into its wearables, starting with AirPods and the Apple Watch by 2027, according to Bloomberg's Mark Gurman.
This strategy aims to allow users to interact with Siri by describing what they see, building on the iPhone 16's new Camera Control button that offers AI-powered visual analysis. Apple envisions this hands-free, screen-less interaction as a core feature of future devices.
However, this ambitious plan faces challenges. The expensive and bulky Vision Pro, while technologically advanced, hasn't seen widespread adoption. Efforts to revamp Siri into a more capable assistant have also been repeatedly delayed, potentially until 2027. Moreover, Apple Intelligence's initial rollout has faced criticism, including a false advertising lawsuit, leading to a reshuffling of the AI team's leadership.
Innovative as it may be, the concept of camera-equipped wearables isn't entirely new. Meta's Ray-Ban glasses have already found some success, and Google's Lens and Project Astra are further along in visual AI development. Amazon also recently previewed its visually aware Alexa+, though it too has faced delays.
Apple's bet on Visual Intelligence represents a significant gamble as it attempts to create seamless, hands-free AI experiences. Whether this vision will overcome the current hurdles and prove worth the wait remains to be seen.