The Meta Ray-Ban Display Glasses are the first wearable I’ve come across that genuinely hints at a future beyond gimmickry. They’re not fully fledged augmented reality, but they represent something more subtle: a pair of stylish glasses with a hidden glanceable display, a neural wristband for control, and a layer of software that brings the whole thing to life.
The glasses themselves feel like an evolution of the Ray-Ban Meta line. On the outside, they’re still recognisably Ray-Bans, but under the hood the hardware has levelled up. The camera now records at a much higher resolution, five microphones are spread across the frame, and the transition lenses adjust automatically as you move between indoors and outdoors. The biggest leap, though, is the new display — a projected window that appears just off to the side of your right eye. It’s not mapped to your environment, so you can’t pin virtual objects to the world around you, but it’s incredibly bright, private to the wearer, and surprisingly useful. All of this is powered by a custom battery packed into the arms of the glasses, yet the whole device still weighs only 69 grams. After a while, you genuinely forget you’re wearing them.
The neural band is where the control magic happens. Worn on your wrist like a watch, it reads the electrical signals from your muscles so you can interact with the glasses using subtle hand gestures. Within an hour the system felt natural — pinches, double taps, and rotations quickly become second nature. What struck me was not just the accuracy but the lack of false triggers. It reminded me of the intuitive control you get with Apple’s Vision Pro, but without the reliance on cameras and without the need for dramatic gestures in public.
The software layer is the weakest part of the package, though still impressive in moments. The UI is functional rather than slick, and the app store is tiny, mostly limited to Meta’s own versions of core tools. Yet the features that are there genuinely work. WhatsApp, for example, is transformed: you can read full messages and respond with dictation that’s startlingly accurate, even when whispering. Captions and real-time translation feel close to magic, with the glasses isolating a speaker’s voice from noisy environments and presenting text almost instantly. Navigation makes obvious sense, with the arrow on the screen rotating as your head turns, while media like Spotify plays back with better-than-expected audio. Even small flourishes, like being able to scribble words with a fingertip on your leg when you can’t speak aloud, hint at new behaviours we’ve never had to design for before.
Why this matters for marketers is simple: these glasses are a glimpse of what’s coming. Notifications no longer buzz in your pocket but appear quietly in your vision, which means the demand for brevity and relevance will skyrocket. Voice-to-action combined with on-screen prompts could reshape the way consumers discover and buy, turning hands-free commerce into a default. Short-form video and messaging may evolve into micro-glance formats built for this kind of screen. Accessibility is another big factor; built-in captions and translation create seamless cross-language communication, which could transform how brands think about inclusivity in their campaigns. And then there’s the cultural element: because these glasses look like ordinary Ray-Bans, adoption won’t require people to walk around with headsets strapped to their faces. If Meta succeeds in normalising smart glasses as fashion, marketers will need to anticipate a mainstream audience who casually wears devices capable of capturing, displaying, and translating the world around them.
The verdict is that at $799, this isn’t yet a mass-market product. The app ecosystem is limited and the experience is still tethered to your phone. But it’s a strong signal of intent. Meta wants to plant its flag in the wearables space before Apple inevitably enters it, and this early step is closer to convincing than anything we’ve seen before. For marketers, it’s not about rushing to advertise on the platform — it’s about preparing now. Experiment with content designed for moments, not feeds. Think about how your brand translates when the interface is hands-free and glanceable. And be ready for the day when “screen time” doesn’t necessarily involve a phone in hand.