Apple is accelerating development of three AI-powered wearables, with camera-equipped AirPods expected to launch by late 2026. According to Bloomberg’s Mark Gurman, the company is pushing forward with smart glasses, an AI pendant, and upgraded AirPods—all built around Visual Intelligence and the Siri digital assistant.
Camera AirPods: The First AI Wearable
The camera-equipped AirPods represent Apple’s first major step into AI wearables. Unlike traditional earbuds, these will feature low-resolution infrared cameras designed to provide AI with visual context about your surroundings.
According to MacRumors, the cameras won’t capture photos or videos for users—they exist solely to feed information to Apple Intelligence. This allows Siri to understand what you’re looking at and respond to questions about your environment without pulling out your iPhone.
9to5Mac argues AirPods make perfect sense as Apple’s first AI wearable. Unlike smart glasses or AI pins that face adoption barriers, AirPods are already ubiquitous. Adding camera functionality to an existing popular product category reduces risk compared to launching entirely new form factors.
What Visual Intelligence Enables
Visual Intelligence, currently available on iPhone 15 Pro and newer, lets users point their camera at objects to get instant information. The technology can identify landmarks, translate text, search for products, and answer questions about what the camera sees.
With camera AirPods, this capability becomes hands-free. According to AppleInsider, users could ask about a building they’re walking past, get directions while navigating on foot, or have text read aloud—all without touching their phone.
CEO Tim Cook has been dropping hints about Visual Intelligence’s importance, similar to how he foreshadowed health sensors before Apple Watch and augmented reality before Vision Pro. The pattern suggests Visual Intelligence will be central to Apple’s next product category.
Apple Smart Glasses Coming 2027
Apple’s smart glasses will compete directly with Meta’s Ray-Ban partnership. According to Bloomberg, Apple recently provided hardware engineering prototypes to internal teams and is targeting a 2027 launch, with production potentially beginning this December.
The glasses will feature a high-resolution camera capable of capturing photos and videos, plus a second camera providing environmental context to Siri. Expected capabilities include making calls, listening to music, taking photos, getting navigation directions, and using Visual Intelligence to read event details and add them to calendars.
MacRumors reports Apple is focusing on build quality as a key differentiator from Meta’s Ray-Bans. Rather than partnering with existing eyewear brands, Apple will design proprietary frames using high-end materials including acrylic elements for a premium feel. Multiple sizes and colors will be available.
The AI Pendant: “Eyes and Ears” for iPhone
The most unusual device is an AI pendant that can be clipped to clothing or worn as a necklace. Some Apple employees reportedly call it the “eyes and ears” of the iPhone, according to Texarkana Gazette.
Like the camera AirPods, the pendant features a low-resolution always-on camera plus a microphone for Siri input. It’s designed as an iPhone accessory rather than a standalone product—avoiding the pitfalls of failed devices like the Humane AI Pin.
The pendant emerged from Apple’s industrial design team while it was working on the glasses, before the team settled on a final glasses design. If the product continues development, it could launch as early as 2027, though it may still be canceled.
Why Apple Is Betting on Visual Intelligence
Visual Intelligence represents Apple’s answer to how AI should interact with the physical world. Rather than requiring users to describe everything verbally, cameras give AI direct visual context—making interactions faster and more natural.
The strategy also differentiates Apple from competitors focusing purely on text-based chatbots. OpenAI and Meta are pursuing AI wearables too, making this an emerging battleground in AI hardware.
According to SoundGuys, a premium AirPods Pro 3 model with infrared cameras could launch this year at approximately $299—$50 more than the current $249 model. Standard AirPods Pro 3 without cameras would maintain the lower price.
The Siri Challenge
Apple’s AI wearables strategy depends heavily on a dramatically improved Siri. According to Bloomberg, upgrades to Siri have faced development snags and delays. A chatbot-like interface is planned for iOS 27 later this year, but the timeline keeps slipping.
Apple has partnered with Google to use Gemini AI models for certain Siri capabilities while developing its own visual models. This hybrid approach acknowledges Apple can’t build everything in-house immediately.
What This Means for Consumers
For Apple fans, the AI wearables strategy signals the company’s next major product direction. Camera AirPods launching this year provide an accessible entry point—users can try Visual Intelligence features without committing to new form factors like smart glasses.
The staggered rollout—AirPods in 2026, glasses in 2027, pendant potentially later—lets Apple test features, gather data, and refine the experience before committing to full-scale production across all three categories.
Competition is heating up rapidly. Meta’s Ray-Ban smart glasses are gaining traction. Startups are launching AI pins and wearables. Apple’s entry brings massive scale and ecosystem integration, but success depends on delivering genuinely useful features rather than tech demos.
For now, keep an eye on fall 2026 for a potential AirPods Pro 3 announcement. If the rumors prove accurate, these devices will mark the beginning of Apple’s AI wearables era, transforming how we interact with artificial intelligence in daily life.