News Update

Apple Targets 2026 Launch for Smart Glasses in Bid to Catch Up in AI Hardware Race

Apple is preparing to enter the AI-powered wearables arena with the release of its first smart glasses by the end of 2026, positioning itself to compete directly with Meta and OpenAI in a rapidly evolving tech landscape.

According to sources familiar with the company’s plans, Apple engineers are accelerating development of the glasses to meet the ambitious launch timeline. Prototype mass production is expected to begin later this year through overseas suppliers. The wearable device is anticipated to combine several AI capabilities, including voice commands via Siri, spatial audio, and camera-based environmental sensing.

The move comes amid intensifying competition in the AI hardware space. Just this week, OpenAI announced a hardware partnership with former Apple design chief Jony Ive, aiming to launch a range of AI-first consumer devices. The collaboration follows OpenAI’s acquisition of Ive’s secretive hardware startup, io.

Apple’s planned smart glasses—internally renamed from project “N50” to “N401”—are said to include microphones, speakers, and cameras that will enable features such as live translation, phone call handling, navigation, and media playback. Functionally, they will resemble Meta’s current Ray-Ban smart glasses, though Apple is expected to differentiate itself through superior hardware design and build quality.

While Apple has long-term ambitions to deliver full augmented reality (AR) eyewear, those products remain years away due to technical and usability challenges. For now, the 2026 model will offer a lightweight, non-AR experience designed to introduce users to AI-integrated wearables in a familiar form factor.

A dedicated chip for the glasses is already in development, with Apple aiming to begin chip manufacturing in 2026 as part of its broader hardware ecosystem strategy.

Internally, some teams have voiced concern that Apple’s current lag in AI innovation may impact the smart glasses’ success. Meta’s Ray-Bans are powered by the company’s Llama AI models, and new Android-based wearables from Google will leverage its Gemini AI. Apple, by contrast, still leans on Google Lens and OpenAI integrations for real-world visual analysis on the iPhone, though it is reportedly working to bring more AI in-house.

The same team behind the Vision Pro headset—the Vision Products Group—is overseeing the glasses project. They are also developing new iterations of the mixed-reality headset, including a lighter and more affordable model, and another designed to work in tandem with Mac computers for low-latency applications.

Apple has also explored adding cameras to the Apple Watch and AirPods. A camera-equipped Apple Watch Ultra, which had been in the works with a potential 2027 launch date, was shelved this week. However, efforts to develop camera-enabled AirPods continue.

Despite challenges, Apple remains committed to expanding its presence in the AI device segment. After falling behind rivals in implementing advanced AI features across iPhones, iPads, and Macs, the company has ramped up efforts to improve its proprietary large language models and open them to third-party developers. This could pave the way for a new generation of AI-enabled apps on the App Store.

Additionally, Apple is aiming to release its first foldable iPhone in late 2026, following a form factor already embraced by other leading smartphone manufacturers. Fresh product designs are also being lined up for 2027.

Apple has previously abandoned several wearable-related initiatives, including a plan to launch early AR glasses tethered to a Mac. However, the company’s ongoing push into AI-enhanced hardware suggests that it still sees the wearables sector as key to its future growth.