Meta has launched a new generation of smart glasses and wearable AI tools, including the first mainstream Ray-Ban smart glasses with an in-lens display and a Neural Band that lets users control digital content with tiny hand gestures.

Zuckerberg Reveals the New Line-Up at Meta Connect

The announcement was made by Meta CEO Mark Zuckerberg during the company’s recent annual Meta Connect conference at its Menlo Park headquarters in California. In front of a live audience, he introduced three new smart glasses models alongside the debut of the Meta Neural Band, a wrist-worn controller designed to detect electrical signals from the forearm and translate them into digital inputs.

According to Zuckerberg, the technology represents a “huge scientific breakthrough” and forms a key part of Meta’s strategy to embed AI into wearable devices. The new glasses are powered by Meta AI, the company’s voice-activated assistant, and are designed to bring augmented reality (AR) features to everyday eyewear.

Three New Models With Different Uses in Mind

The headline product is the Meta Ray-Ban Display, priced at $799 (£585), which features a colour display embedded into the right lens. This allows users to see WhatsApp messages, view live video calls, and access real-time information such as captions, translations, or walking directions directly in their line of sight. A 12-megapixel front-facing camera enables photos and video recording, and a microphone and speaker system support voice calls and Meta AI commands.

Also announced were the Oakley Meta Vanguard glasses, retailing at $499 (£390), aimed at sports and outdoor users. These include an ultrawide camera, a rugged waterproof design (IP67-rated), and integration with fitness tracking services like Strava and Garmin. Finally, Meta also launched the Ray-Ban Meta (Gen 2) glasses for $379 (£295), which have a more classic design while adding better cameras, extended battery life, and upgraded video features such as slow-motion and hyperlapse recording.

All three models are positioned as steps towards a more immersive, hands-free computing experience, removing the need for users to constantly check phones or carry separate devices.

The Neural Band: Replacing the Keyboard With Your Hand

What makes this release particularly notable is the integration of the Meta Neural Band, a wearable bracelet that detects subtle hand gestures using electromyography (EMG). EMG reads the small electrical impulses generated by muscle movement. In Meta’s case, this translates to pinches, taps, swipes, and even drawing letters on the user’s leg or desk to send text messages, with no screen or keyboard needed.

The Neural Band allows users to control the glasses without even touching them, thanks to AI models trained to recognise specific gestures and context. For example, swiping a thumb across the index finger can scroll menus, while tapping fingers together can wake or sleep the display. There is also a double-thumb tap gesture to activate Meta AI without saying its wake word.
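Conceptually, the final stage of such a system is straightforward: once the AI model has classified an EMG signal as a named gesture, it is dispatched to a device action. The sketch below illustrates that idea in Python. The gesture names and actions are taken from the article’s description; the dispatch function and action names are hypothetical placeholders, not Meta’s actual SDK.

```python
# Illustrative sketch of a gesture-to-action dispatch table.
# Gesture names follow Meta's description; the API is hypothetical.

GESTURE_ACTIONS = {
    "thumb_swipe_index": "scroll_menu",       # swipe thumb across index finger
    "finger_tap": "toggle_display",           # wake or sleep the display
    "double_thumb_tap": "activate_meta_ai",   # invoke Meta AI without a wake word
}

def dispatch(gesture: str) -> str:
    """Map a classified EMG gesture to a device action, ignoring unknowns."""
    return GESTURE_ACTIONS.get(gesture, "no_op")

print(dispatch("double_thumb_tap"))  # activate_meta_ai
print(dispatch("unrecognised"))      # no_op
```

In a real system the hard part is upstream of this table: training the classifier to recognise gestures reliably across different wrists and contexts, which is why Meta initially requires in-store fitting.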

Meta says the Neural Band will initially only be sold in the US due to the need for in-store wrist fitting. It will roll out to other markets, including the UK, in early 2026.

Meta’s Long-Term AI Hardware Ambitions

This latest release highlights Meta’s growing focus on AI hardware, with Zuckerberg stating earlier this year that the company intends to spend “hundreds of billions” on AI infrastructure and data centres in pursuit of what he calls “personal superintelligence”.

The new glasses are part of a broader strategy to create everyday devices that blend AI with human senses. In a July earnings call, Zuckerberg said he believes smart glasses will become so important that “people who don’t wear them will be at a significant cognitive disadvantage.”

Meta has sold around two million smart glasses since its partnership with Ray-Ban began in 2023, though it does not disclose exact figures. With the addition of display features, Meta is hoping to create a more compelling reason for wider adoption.

What Can They Do?

In terms of functionality, the Ray-Ban Display allows users to view content such as messages, calls, maps, or translations overlaid on the real world. For example, during a walk, the glasses can provide turn-by-turn directions without needing to check a phone. Similarly, when in conversation with someone speaking another language, the glasses can show translated captions live on the lens.

Voice remains a key interface, but Meta now believes combining visual and gesture controls will significantly enhance the user experience. The glasses are powered by Meta’s own large language models, and the company claims performance is improving rapidly with each update.

The Oakley Meta Vanguard model is clearly targeted at fitness and sports users. As such, it can automatically capture moments during activities like cycling or skiing, using sensor data to determine milestones such as speed or altitude reached. Also, after an activity, users can overlay stats from Garmin or Strava onto videos or photos.

Awkward Launch

Despite the ambition, the launch has not been without glitches. For example, during the live demo, Zuckerberg struggled to place a WhatsApp call using the glasses, telling the audience: “I don’t know what to tell you guys. I keep on messing this up.”

There also appear to be some limitations in functionality. For example, at launch, Spotify integration will only support playback controls and track display, while Instagram use is limited to Reels and direct messages. Meta says more features will roll out in software updates.

Comfort and accessibility may also present issues. There have been reports that the display works well when viewed through one eye, but that reading it with both eyes can feel disorientating. Meta says the experience takes some getting used to.

Privacy, Safety and Scrutiny

Not surprisingly, there have been some questions raised about safety, privacy, and the impact on younger users. The glasses include a small LED to alert others when the camera is recording, but critics say more robust protections may be needed.

Also, on the same day as the launch, protests took place outside Meta’s New York headquarters. Campaigners, including parents of children who died by suicide, demanded greater protections for minors across Meta’s platforms, including Facebook, Instagram, and its VR products. Meta denies accusations of negligence, calling them “nonsense.”

Earlier testimony from two former safety researchers accused the company of suppressing internal studies on potential harm to children. While unrelated to the glasses directly, this scrutiny continues to shadow Meta’s broader product ecosystem.

Competition

Meta’s bet on smart glasses puts it in direct competition with other tech giants exploring wearable AI, including Apple and Google. For example, Google previously attempted a heads-up display with Google Glass, which failed to gain traction. It seems Meta is now trying to succeed where others fell short by integrating AI and voice in a more consumer-friendly format.

According to Forrester analyst Mike Proulx, “Unlike VR headsets, glasses are an everyday, non-cumbersome form factor,” but he added that Meta must still “convince the vast majority of people who don’t own AI glasses that the benefits outweigh the cost.”

Meta has already invested around $3.5 billion in eyewear brand EssilorLuxottica, which owns Ray-Ban and Oakley. This suggests a long-term commitment to making smart glasses a central platform for AI integration.

Business adoption remains an open question. The hands-free and real-time capabilities of the glasses could appeal to sectors such as logistics, field service, or retail, where instant access to information can improve productivity. However, questions around price, practicality, and security may limit short-term uptake.

What Does This Mean For Your Business?

Practical use cases will likely determine how quickly these devices gain ground, particularly in business settings. In sectors where on-the-go access to visual data and communication tools is critical, such as warehousing, technical services, healthcare, or frontline retail, Meta’s smart glasses could offer a viable alternative to phones or tablets. Being able to receive instructions, translate conversations, or log information using only hand gestures or voice commands could reduce friction, speed up workflows, and create safer, more efficient environments. UK businesses in particular may find opportunities here, especially where hands-free communication or multilingual interaction is valuable.

At the same time, concerns around user privacy, data collection, and digital wellbeing are not going away. The Neural Band introduces a level of biometric input that, while technically impressive, may prompt further debate around consent, surveillance, and data ethics. These are especially sensitive issues for organisations operating in regulated environments, or those managing public-facing staff.

Meta’s heavy investment in AI hardware signals a longer-term ambition to dominate wearable computing, but it also raises the stakes. If the technology fails to deliver clear value or gain mainstream traction, the company could face pressure over its direction and spending. Likewise, businesses considering adoption will need to assess not just functionality, but also durability, support, integration with existing systems, and long-term viability.

The glasses may well become more than a consumer gadget. If Meta can refine the experience, prove the use cases, and address lingering trust issues, the products unveiled this month could mark an early step towards a wider transformation in how people interact with digital tools, and how AI becomes embedded in daily professional life.