I wore Oakley's Meta glasses and it brought one problem into focus

This article contains affiliate links; we will receive a commission on any sales we generate from it.

A year ago, I reviewed the first generation Ray-Ban Meta smart glasses and called them my favourite gadget. I had been wearing the sunglasses on and off for twelve months, and found them fun and functional, pairing a classic look with the ability to take photos and videos, listen to audio, and ask an AI assistant questions. A year later, I have been testing two further pairs of Meta-infused glasses.

One is the £399 Oakley Meta HSTN, which have improved battery life and are intended for general wear; the other is the £499 Oakley Meta Vanguard, wraparound frames designed for exercise and outdoor sports. Both products provide a slick experience that is better than the first generation of similar glasses. While I can’t say either is subtle enough to masquerade as regular eyewear, I have enjoyed the same features offered by the Ray-Ban Meta: excellent video quality, improving AI smarts and the open-ear audio that makes listening to music and podcasts easy.

What I’ve become more uncomfortable with is the whole concept of the glasses and the motives of the firm powering their technology. I’m not sure I can pull off either pair of Oakley Meta frames given their bold designs. The HSTN (pronounced “how-stuhn”) builds on a popular Oakley design, adding some straight lines to round frames with an angled nose bridge.

The frames are necessarily thicker than normal glasses to house the battery, speakers and camera lens, which makes them more conspicuous. Perhaps it is my large head, but I find the glasses uncomfortable after an hour or so of wear. Unlike the Ray-Bans, they come in only one size, with the plastic arms pushing painfully into the sides of my head.

I fared better with the Vanguard despite their loud design. Worn on several road and trail runs, they fit like a glove and don’t budge, offering a great combination of sunglasses, open-ear headphones and, unlike the other designs, a centralised camera placed in the nose bridge between the lenses.

This means you get the best angle for what you’re looking at: head on. The angle is wider than your phone’s camera, more like an ultra-wide lens, but shots are crisp, and I took far more frames of sunsets and countryside on my outdoor runs than I ever did with the HSTN. The Vanguard also connects to some Garmin watch models to overlay run stats on photos you took during your run for social media (not my bag) and to read your stats aloud. Unfortunately, despite the connection, it can’t give you stats updates during a run, only afterwards.

I really enjoy the Bluetooth speakers in both arms of both glasses. The little grilles direct open audio towards your ears, with a hidden touch panel on the right arm reacting to taps and swipes to play, pause and change volume.

They are basically open-ear headphones. A physical button on the top edge above this panel is a shutter button: push once to take a photo, hold and release to start a video recording. The video quality is more impressive than the photos, with video possible at up to 3K resolution at 30 frames per second. Photos from the HSTN look wonky, largely down to the camera being positioned above the top-left lens.

The circle on the right is an LED light meant to alert people when you’re filming or photographing, by staying lit or flashing once respectively. I’ve come to feel this is a futile gesture from Meta. If the company was that concerned about covert filming, it wouldn’t have poured millions of dollars into making a camera you can wear on your face. I could easily put a black sticker over the light and film away even more unnoticed.

The camera can also be used to ask Meta AI things.

“Hey Meta, what am I looking at?” prompts the glasses to snap a photo, a shutter sound piped through the speakers. The AI voice of your choosing (mine, by default, a slow-speaking, slightly smug Englishman) then attempts to tell me what I’m looking at. It’s designed for identifying places and objects. Holding out the discarded peel of the previous hour’s mandarin, my Meta man enthusiastically declares: “You are looking at an orange.” Not quite, mate.

Pretty good, though. Asking what the weather was like where I am, the voice correctly identified my town, but went on to say it could not get that information. It is at its best when describing exactly what you’re looking at, like a chair or a messy desk, identifying the items. It’s a good tech demo with little practical utility. This AI clearly needs to get better, and I feared my private photos and videos might be being used by Meta to train its AI models.

So, where do these photos and videos go when you’ve taken them? I asked Meta to explain. “Photos and videos captured either manually or by the built-in Meta AI voice commands are saved to your phone’s camera roll and are not used by Meta for training,” a Meta spokesperson told me via email. Good start.

They said the glasses do collect “additional data” which is used “to develop and improve new Meta Products”.

This includes “how often and how you use your devices” and “analytics regarding device performance”. So, your usage data is studied, but allegedly not your private photos. I also wanted to know if there are any situations where media from your glasses is used to train Meta’s AI products in any way.

“The photos and videos that you capture on your Ray-Ban Meta are on your phone’s camera roll and not used by Meta for training, including photos or videos captured by using the ‘Hey Meta, take a photo/video’ voice command,” the spokesperson said. “However, if you use Meta AI experiences like live AI or other multimodal features (e.g. ‘Hey Meta, tell me more about this flower’), then the images or videos you take could be used for training purposes.”

They said ‘Ray-Ban Meta’ in this response, but this applies to all Meta AI glasses.

The rules are different for voice queries and voice recordings. “Voice recordings and voice queries made to Meta AI are used to improve and personalise your user experience and to develop and improve Meta Products, which includes training AI models,” the spokesperson said. So, when you use Meta AI, your voice recordings and images or videos “could” be used to train Meta’s AI.

“By analysing voice recordings and queries, Meta’s AI models can learn and adapt, leading to better accuracy, more natural conversations, and overall enhanced performance of Meta AI. This information is collected, used, and retained in accordance with Meta AI’s Terms of Service and the Supplemental Meta Platforms Technologies Privacy Policy.”

The spokesperson also said that should a Meta glasses user take the time to object to Meta’s use of their data to train AI, which anyone can do, it would not stop the glasses from working, but “could impact the relevance of Meta AI’s responses and the overall quality of Meta”.

If all of this sounds like too much fuss, you may well be right. This is not the only tech product to use my personal data.

I understand that Gmail scans my emails. But paying for a camera strapped to your face and then having to allow Meta to use your data to improve the AI products that make it money is a little too stark for me to truly be OK with. This is quite different to how I felt about these gadgets a year ago, now that I have thought more about the privacy implications. After some time with the Oakley Meta glasses, I’ve come to a few conclusions.

The first is that I have a huge head (about 61cm), and these designs are simply too tight for me, which spoils the experience. The second is that the Vanguard makes the most sense: sunglasses for running and other sports, with a camera for social media, headphones for music, and integration with Garmin watches. It’s a shame they cost the most at £499, but they are the most coherent exercise-infused social media gadget. My final conclusion is that Meta’s AI glasses are, for me, no longer worth the data and privacy trade-off, despite their undeniably fun premise.
