Meta Ray-Ban Display Hands-On: A Flawless Wristband For Flawed Glasses

Meta Ray-Ban Display is undeniably impressive technology, but it has notable flaws, while the Meta Neural Band works so well it feels like magic.

If you somehow missed it: Meta Ray-Ban Display is the company's first smart glasses product with a display of any kind. Unveiled at Connect 2025 this week, it will hit physical US retailers on September 30, priced at $800 with Meta's long-in-development sEMG wristband, Meta Neural Band, in the box.

You can read about the specifications, features, and availability of Meta Ray-Ban Display here. But what is it actually like to use? At Meta Connect 2025, I found out.

A Note On My Demo Experience

Meta invited UploadVR to Connect 2025 and provided accommodation for two nights. But it did not provide us with a Meta Ray-Ban Display demo ahead of the keynote. Outlets with much smaller reach than the leading XR news site received private behind-closed-doors demos before the product was unveiled, but UploadVR did not. Last year, we didn't get a demo of the Orion AR glasses prototype at all, though developer Alex Coulombe shared his impressions on our site.

My time with the glasses and wristband took place much later in the day, alongside other attendees at Meta's communal hardware demo area, where I had less time to explore what the devices are capable of and was shepherded along a route as part of a group.

The communal demo area building. Just below this shot was a line of dozens of attendees, but I chose not to capture images of them, as gaining their consent would have been impractical.

Whatever Meta's reason for not including UploadVR in the pre-keynote private demos, that's partially why this article is coming to you later than others. I flew across the Atlantic to attend Connect. By the time of my informal demo, I had spent all day writing up my Horizon Hyperscape impressions and the keynote announcements, and it was 4am in my home time zone. Afterwards I slept, and upon waking it was time to cover the developer keynote announcements. After that I tried Meta Ray-Ban Display a second time, then headed to the airport for the 10-hour flight home.

While some of the other hands-on impressions of Meta Ray-Ban Display I've read and watched seem to follow a similar format, with repeated phrases that suggest distributed talking points, the following are my unfiltered opinions. I wasn't given a private walkthrough, so I'm not burdened with any priming for my thoughts.

Style, Form Factor & Light Leak

Other than the cost, the primary tradeoff of adding a display to smart glasses is that it also adds weight and bulk, especially to maintain acceptable battery life. Meta Ray-Ban Display weighs 69 grams, compared to 52 grams for the regular Ray-Ban Meta glasses and 45 grams for the non-smart Ray-Ban equivalent. It's also noticeably bulkier: the rims are thicker, and the temples even more so.

Ray-Ban Meta (left) vs Meta Ray-Ban Display (right)

Ray-Ban Meta (left) vs Meta Ray-Ban Display (middle) vs Xreal One Pro (right)

With the regular Ray-Ban Meta glasses, people unfamiliar with them almost never clock that you're wearing smart glasses. The temples are slightly thicker than usual, but the frames are essentially the same. It's only the camera that gives them away. With Meta Ray-Ban Display, it's clear that you're not wearing regular glasses. And I don't mean due to the display.

The clickbait YouTube thumbnails you may have seen are fake: none of the nearby Connect attendees I asked could tell whether I even had the display on or not (Meta says the display has just 2% light leakage). But the noticeably increased thickness of the entire frame suggests that something's not normal. These glasses are unavoidably chunky.

How much that matters will vary greatly from person to person. For some it won't matter at all; these days, thick-framed glasses can even be a fashion choice. For others it will be a total dealbreaker. I'll be extremely curious to see how much the bulk affects sales and retention, as it will only get worse with eventual true AR glasses.

A selfie I took while wearing the 'Sand' color.

When it comes to comfort, few people not employed by Meta have had enough time with these glasses to say whether they're comfortable enough for all-day wear. Across my two 15-20 minute sessions they felt perfectly fine, with no discomfort at all. But that isn't enough time to make a true assessment. UploadVR plans to purchase a unit for review next month (we haven't heard anything about getting a unit from Meta), and we'll wear the HUD glasses until the battery runs out to answer the comfort question more definitively.

Meta Neural Band & The HUD User Experience

Like the regular Ray-Ban Meta glasses, you can control Meta Ray-Ban Display with Meta AI using your voice, or use the button and touchpad on the side for basic controls like capturing images or videos and playing or pausing music. But unlike any other smart glasses to date, it also comes with an sEMG wristband in the box: Meta Neural Band.

How Does Meta Neural Band Work?

Meta Neural Band works by sensing the activation of the muscles in your wrist which drive your finger movements, a technique called surface electromyography (sEMG). sEMG enables precise finger tracking with very little power draw, and without your hand needing to be in view of a camera. Meta claims around 18 hours of battery life, and the band carries an IPX7 water resistance rating.
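To make that concrete, here's a minimal sketch of the kind of pipeline an sEMG wearable could run: sample the wrist electrodes, compute a per-channel feature such as short-window RMS energy (a standard feature in the sEMG literature), and feed the result to a gesture classifier. Everything here, from the channel count to the stand-in classifier, is a hypothetical illustration, not Meta's actual design.

```python
import numpy as np

# All values are hypothetical illustrations, not Meta's actual design.
NUM_CHANNELS = 8     # electrode pairs around the wrist
SAMPLE_RATE = 2000   # Hz; sEMG is typically sampled in the kHz range
WINDOW_MS = 50       # short windows keep gesture-to-action latency low

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel, a standard sEMG feature."""
    return np.sqrt(np.mean(np.square(window), axis=1))

def classify(features: np.ndarray) -> str:
    """Stand-in for a trained model mapping muscle activation to a gesture."""
    if features.max() < 0.05:  # arbitrary "hand at rest" threshold
        return "rest"
    gestures = ["index_pinch", "middle_pinch", "pinch_twist", "thumb_swipe"]
    return gestures[int(features.argmax()) % len(gestures)]

# Classify one simulated 50 ms window of 8-channel sEMG.
samples = SAMPLE_RATE * WINDOW_MS // 1000
window = np.random.randn(NUM_CHANNELS, samples) * 0.1
print(classify(rms_features(window)))
```

The appeal over camera-based hand tracking is visible even in this sketch: the input is a tiny stream of numbers per window rather than video frames, which is why the power draw can be so low.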
In its current form, Meta Neural Band is set up to detect four finger gestures:

- Thumb to middle finger pinch: double tap to toggle the display on or off, or single tap to go back to the system menu.
- Thumb to index finger pinch: how you "click".
- Thumb to index finger pinch & twist: adjusts volume or camera zoom, as you would a physical volume knob.
- Thumb swiping against the side of your index finger, like a virtual d-pad: how you scroll.

Meta also plans to release a firmware update in December that will let you finger-trace letters on a physical surface, such as your leg, to enter text. It sounds straight out of science fiction, and The Verge's Victoria Song says it works "shockingly well". But I wasn't able to try it.
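A vocabulary this small means the whole interface is driven by a handful of input events. As a rough sketch of what the glasses-side software might reduce to, here's a simple dispatch table; the event names and actions are hypothetical, purely to show how few primitives are in play.

```python
from typing import Callable

# Hypothetical event names and actions; illustrative, not Meta's API.
handlers: dict[str, Callable] = {
    "middle_pinch_double": lambda: print("toggle display on/off"),
    "middle_pinch_single": lambda: print("back to system menu"),
    "index_pinch":         lambda: print("click the focused element"),
    "pinch_twist":         lambda deg: print(f"turn virtual knob {deg} degrees"),
    "thumb_swipe":         lambda dx, dy: print(f"scroll by ({dx}, {dy})"),
}

def on_gesture(name: str, *args) -> None:
    """Route a recognized gesture to its UI action, ignoring unknown events."""
    handler = handlers.get(name)
    if handler:
        handler(*args)

on_gesture("index_pinch")       # a "click"
on_gesture("pinch_twist", 15)   # e.g. nudge the volume up
```

Note that there is no pointer in this model: with only pinches, twists, and swipes, every action routes through whatever element currently has focus, which is exactly the limitation the interface critique below keeps running into.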
The Meta Neural Band gestures control a floating fixed heads-up display (HUD) visible only to your right eye, positioned slightly below and to the right of center.

This fixed HUD covers around 20 degrees of your vision. To understand how wide that is, extend your right arm fully straight and then turn just your hand 90 degrees inward, keeping the rest of your arm straight. To understand how tall, do the same but turn your hand upwards. I apologize for the physical discomfort you just experienced, but you've now learned how little of your right eye's vision Meta's HUD occupies.
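If you'd rather skip the arm stretching, the same estimate falls out of basic trigonometry: an object of size s at distance d subtends a visual angle of 2·atan(s / (2d)). The arm and hand measurements below are rough assumptions for an average adult, not figures from Meta.

```python
import math

arm_length_m = 0.60   # assumed eye-to-hand distance, arm outstretched
hand_length_m = 0.19  # assumed wrist-to-fingertip length

# Visual angle subtended by a hand turned sideways at arm's length.
theta = 2 * math.atan(hand_length_m / (2 * arm_length_m))
print(f"{math.degrees(theta):.0f} degrees")  # ~18, close to the HUD's ~20
```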
This clip from Meta shows the 3 tabs of the system interface, and how you swipe between them.

Meta's system interface for the HUD looks much like that of a smartwatch, which makes sense given the 600×600 resolution. It has 3 tabs, which you horizontally scroll between:

- The center home tab (the default) shows the date and time, your notifications, any playing music, and a Meta AI button.
- The right tab is the two-column app library: WhatsApp, Instagram, Messenger, Messages, Calls, Camera, Music, Photos, Captions, Maps, Tutorials, and Hypertrail (a game). Four rows are shown at a time, and you scroll vertically to see the rest.
- The left tab features quick controls and settings like volume, brightness, and do not disturb, as well as shortcuts to Captions, Camera, and Music.

The problem is that, given the form factor and input system, this is all just too much. It's too finicky, and it takes too many gestures to do what you want. The interface almost certainly could, and very much should, be significantly streamlined.

The current home tab feels like a waste of space when you don't have any notifications, showing just the time and a Meta AI button. You should invoke Meta AI through a custom finger gesture, not a UI button, and the home tab should instead have shortcuts to all the other key functionality of the glasses.

Interestingly, all the way back at Connect 2022, Meta showed a prototype demo clip of exactly this. The prototype HUD had a single tab, with the date, time, weather, and latest notification in the center, and shortcuts to the camera, messages, music, and more apps above, to the right, below, and to the left respectively. I'd much prefer to have this in smart glasses than what Meta has today. What happened to it?

Meta Connect 2022 demo of a prototype of the sEMG wristband and HUD. Why doesn't the shipping product's interface look like this?

A lot of the friction here will eventually be solved by the integration of eye tracking. Instead of needing to swipe around menus, you'll be able to just look at what you want and pinch, akin to the advantage of a touchscreen over arrow keys on a phone. But for now, it feels like using MP3 players before the iPod, or smartphones before the iPhone. sEMG is obviously going to be a huge part of the future of computing. But I strongly suspect it will only be one half of the interaction answer, with eye tracking making the whole.

The other problem with navigating and interacting on Meta Ray-Ban Display is that it's sluggish, with frequent lag across both of my sessions, which involved two separate units. The interface looked sluggish at times during Mark Zuckerberg and Andrew Bosworth's keynote demo too.

To be clear, the problem here is the glasses, not the wristband. I know that because the wristband provided immediate haptic feedback whenever a gesture was recognized, and whenever lag happened, the frame rate of animations slowed down too.

Meta confirmed to UploadVR that Meta Ray-Ban Display is powered by Qualcomm's original Snapdragon AR1 Gen 1 chipset, the exact same chip used in 2023's Ray-Ban Meta glasses, their new refresh, and both Oakley Meta smart glasses. I specifically asked whether this was a misprint, given that Qualcomm announced the new higher-end AR1+ chip back in June, but Meta confirmed it was not a typo. The glasses really are using a two-year-old chip, and it shows. (If you're in the VR space: does this remind you of anything?)

The Meta Neural Band itself picked up every gesture, every time, with a 100% success rate in my time with it. It works so well that it's hard to believe it's real. The volume adjustment gesture, for example, where you pinch and twist an imaginary knob, feels like magic. But the glasses Meta Neural Band is paired with today let it down.

Meta Neural Band in Black (left) and Sand (right).

While I've heard some people complain that the Meta Neural Band felt too tight, I suspect their Meta handlers were prioritizing gesture recognition performance over comfort. In my second demo I adjusted mine to the same tightness I would my Fitbit, and found the gesture recognition remained flawless. It just works, and its woven mesh material felt very comfortable.

Speaking of Fitbit, there's clearly enormous potential for Meta to evolve the wristband to do more than just sEMG. During the Connect keynote, the company announced Garmin Watch integration for all its smart glasses. But for Meta Ray-Ban Display, why can't the Meta Neural Band itself collect fitness metrics? I imagine this will be a big focus of successive generations.

The Monocular Problem

The image delivered to your right eye by Meta Ray-Ban Display is sharp and clear, albeit translucent, with higher angular resolution than Apple Vision Pro. But the fact that your left eye sees nothing is a major flaw.

If you're a seasoned VR user, you've likely run into a frustrating bug, or played a mod, where a user interface element, shader, or effect renders in only one eye. In smart glasses, it feels just as bad.

A lot of people have asked me whether you could use Meta Ray-Ban Display to watch a video while on the go. My answer is that while the glasses already let you watch videos sent to you on Instagram and WhatsApp, I wouldn't want to. It's not something I want to do with any monocular display.

Throughout both my Meta Ray-Ban Display sessions, I experienced eyestrain whenever the display was on. The binocular mismatch induced by a monocular display is just downright uncomfortable. And to be clear, I've experienced this before with other monocular glasses; it isn't unique to Meta's technology.

I tried and failed a dozen times to capture a through-the-lens shot that truly represents what you see, so here's a Meta marketing shot instead.

I'd want to use Meta Ray-Ban Display to briefly check notifications without needing to look down, and to occasionally glance at a navigation route. But I definitely wouldn't want to keep the display up for longer than that, which rules out use cases like video calling.

Reading and watching impressions of Meta Ray-Ban Display from other outlets and influencers, I've been surprised at how many don't explore the monocular issue at all, beyond briefly mentioning it as a minor tradeoff. Could I just be more sensitive to binocular rivalry than most people? Maybe. But it was a sentiment I heard within earshot from other attendees demoing the glasses too.

"I've actually heard that all day," an event staffer guiding the demo groups remarked, regarding the visually uncomfortable feeling of the display only showing to one eye.
So why does the display only show to one eye? Meta's CTO Andrew Bosworth is absolutely correct to point out that the components for a binocular display system would cost more than twice as much as a monocular one, since it also requires implementing disparity correction. It would also drive up the bulk and weight even further.

Based on my on-background conversations with people in the LCOS and waveguide supply chains, had Meta gone with a binocular design, including the extra battery and disparity correction required, I'd estimate the product would have ended up at around $1200 at least, with a weight of around 85 grams.

And yet, all that said, I'd still argue it would have been worth it. Meta Ray-Ban Display is by its nature an early adopter product, and the form factor already limits its appeal to those who don't mind the bulk. I may be wrong, but I suspect the early adopters of HUD glasses will gravitate to binocular options once they're available. Which of the major players will be the first to offer one?

Conclusions & Musings

In many ways, the Meta Ray-Ban Display glasses remind me of the Meta Quest Pro headset. It too was held back by an outdated chip and by having only one of a key component (in its case a color camera, in Meta Ray-Ban Display's case a display).

Don't get me wrong: I could imagine Meta Ray-Ban Display being immensely useful for checking notifications and directions on the go. But it's tantalizingly close to being so much more. If it were binocular, had eye tracking, and were driven by a slightly more powerful chip, I think I'd want to use it all day long.

As you might expect, Meta did not let me bring the glasses home. But it did give me this cool little pin.

But we're not there yet, and the product I tried was finicky and monocular. In the present, as a pair, the glasses and wristband deliver equal parts frustration and delight. But they also fill me with excitement for the future. The Meta Neural Band works so well it feels like magic, and once it's paired with better glasses, I strongly suspect Meta could have its iPod moment.

Its iPhone moment, on the other hand, will have to wait for true AR.