Put the glasses on and something shifts – less in the tech, more in the room.
People notice. Not always in an obvious way, but enough. A second glance. A pause before speaking. Someone checking your face a fraction longer than usual.
According to a report by The Guardian’s Elle Hunt, strangers sometimes ask if they’re being recorded. It’s an awkward question because it’s not unreasonable.
The device can film from eye level, and the recording light is easy to miss unless you’re looking for it. Most people aren’t.
That uncertainty lingers. It doesn’t stop interactions, but it nudges them off balance. Conversations carry on—just slightly altered, like something small has gone out of alignment.
A device still finding its role
Meta said it sold more than 7 million pairs of smart glasses in 2025, pitching them as a step toward hands-free computing. No phone in your hand. Less screen time. That’s the idea.
In reality, it’s hit and miss.
The AI assistant can describe what you’re looking at, read text, respond to voice commands. But as Hunt documents, it mishears things, drops out mid-task, or gives half an answer. You end up reaching for your phone anyway. Again.
Some features sound more impressive than they feel. Translation slows conversations down instead of smoothing them out. Visual recognition tends to state the obvious – accurate, technically, but not especially helpful.
There are parts that work. The open-ear audio is genuinely good. And groups like Be My Eyes, along with visually impaired users, have pointed to the potential for real-time assistance. That use case feels solid.
Beyond that, it’s less clear. Ben Wood of CCS Insight has said in industry commentary that early wearables often arrive before their purpose is fully nailed down – “searching for a problem to solve.”
Which is roughly where this device sits, Hunt writes.
The shift is behavioural, not technical
The more interesting change is quieter – and harder to shake.
Wearing the glasses introduces a constant possibility: You could record this. Not that you will. Just that you can. That thought flickers in and out.
Briefly. Then again.
Hunt notes how that alone starts to shape perception. Everyday moments – completely ordinary ones – begin to feel like potential footage. Not everything, but enough to notice.
You see it online already. First-person clips, eye-level, slightly disorienting. People in them often have no idea they’re being filmed.
Iain Rice, professor of industrial AI at Birmingham City University, told The Guardian: “If you see a person wearing them and don’t want to be recorded, unfortunately, the only way to make sure is by moving out of the way.”
It’s a blunt way of putting it. But it lands.
Recording used to require a decision – take out your phone, point it, press something. Now it can sit in the background. Always available. Easy to ignore until it isn’t.
Legally, not much has changed. In the UK, filming in public is broadly allowed. Socially, though, things feel less settled. The rules are fuzzier than the law suggests.
Not just a product, but a direction
Meta says users should act “in a safe, respectful manner.” Still, that leaves a lot unresolved.
Because this isn’t just about whether the glasses work. It’s about what they quietly introduce.
There are upsides. Accessibility tools could be genuinely important. Hands-free interaction has obvious appeal. And the technology will improve – it always does.
But right now, the trade-off shows up in small ways. A look that lasts a beat too long. A conversation that feels slightly off. A shared space that isn’t quite as neutral as it used to be.
Nothing dramatic. Just… different.
The hardware will catch up. That part is predictable.
Whether people get used to being casually, constantly recordable – that’s less certain.
Sources: The Guardian, CCS Insight