This Man Spied on NYC for 2 Years Using Meta’s Smart Glasses. Here’s What He Learned
Artificial intelligence will be what fuels the new smart glasses wars
When I first delved into the world of virtual reality (VR) in 2016, most technology reporters shunned the innovation. That relative vacuum of coverage afforded me a great deal of space within which to experiment and blaze new investigative trails until VR went mainstream with the Meta Quest 2 in 2020.
That blank canvas has also allowed me to develop a nuanced understanding of immersive computing and a feel for where the technology might go next. Similarly, I’ve been testing smart glasses since 2018, when I first got my hands on the second generation of Snap’s Spectacles (as well as North’s Focals).
In fact, I documented much of my pandemic experience in New York City through the embedded cameras on Snap’s Spectacles 2, which allowed for hands-free documentation of the world around me. From exhaustively capturing the independent street art movement that took over the luxury brand-filled streets of 2020’s post-riot Soho, to recording the I Am Legend-style emptiness of Times Square, the smart glasses allowed me to bike throughout the city using my eyes to frame the population’s struggle with Covid.
As a rule, the only time I recorded someone without alerting them was when I was in public, attempting to document history or an unfolding incident. In social settings and business meetings, I always disclosed the fact that I was wearing smart glasses equipped with cameras, and sometimes even took them off to make the other person feel more comfortable. The lessons of Google Glass and its creepy “glasshole” reputation have stuck with me, so I’m sensitive to anyone unacquainted with smart glasses.
Tech companies have been conducting research and development on our faces for several years to get here
However, beyond the broad use cases around wearable face cameras, I’ve always been far more intrigued by the possibilities inherent in virtual assistants embedded in smart glasses. My first taste of such a dynamic came in 2018, when North’s Focals (North was later acquired by Google) allowed users to interact with Amazon’s Alexa virtual assistant via voice.
Focals didn’t include an outward-facing camera; instead, the device displayed information to the wearer as text and icons, a kind of rudimentary, heads-up-display version of augmented reality (AR). The unique display strained my eyes, and I’ve never been a big fan of Alexa as a convenient information tool, so I didn’t stick with the device long.
In 2019, Amazon released its Echo Frames smart glasses, which did not include an AR information display and focused instead on delivering audio (phone calls, music, podcasts) and Alexa integration. The experience was better than my Focals testing, but the first version of Amazon’s wearable made the user look like an uptight librarian instead of a cutting-edge techie carrying next-gen hardware. The functionality wasn’t worth walking around looking like a nerd from a cheesy ’80s comedy, so I was out.
But then, in September of 2021, the company still known at the time as Facebook unveiled a collaboration with EssilorLuxottica, the France-based parent company of Ray-Ban. The two companies introduced Ray-Ban Stories, a product that took the classic Wayfarer frame design made iconic by the likes of James Dean and Tom Cruise and pushed it into the future. The new version of Wayfarers came embedded with speakers, microphones, and outward-facing voice-activated cameras. Since then, these smart glasses have become my favorite mobile technology device, second only to my Apple iPhone.
Innovation is sexy, good branding is sexier
Like many of us still shaking off the temporal confusion of the multi-year pandemic lockdown, the two years I’ve worn Ray-Ban’s smart glasses have flown by. Now when I look at my smartphone’s camera roll, I realize that most of the photos and videos I’ve taken have been through my Ray-Ban smart glasses. For that reason, it’s been a bit confusing to watch most of the public almost completely ignore the most stylish wearable tech product to come along since the Apple Watch.
Part of the problem was the branding. Facebook didn’t announce its switch to the name Meta until after the Ray-Ban announcement. So the tarnished reputation of Facebook, and how it handles information, was immediately attached to a pair of camera glasses that most people seemed eager to avoid just based on the name association alone. The idea of putting a pair of “Facebook cameras” on your face just didn’t sound sexy to most, even when packaged in the most beloved pair of shades in Hollywood history.
At the time, I considered the Facebook branding an obvious unforced error. It would have made a lot more sense to pair the company’s popular, photo- and video-focused brand Instagram with Ray-Ban. There are no guarantees when it comes to marketing, but Ray-Ban Instagram smart glasses would have likely drawn a lot more users than the scant few apparent on the streets of major cities following the device’s initial release.
The new version of Ray-Ban’s smart glasses, now branded with the Meta name, has been met with more positive reviews, even though most of the features and the design are the same or only slightly improved compared to the first version. I still think Meta missed an opportunity to truly become the de facto smart glasses leader by branding the device as an Instagram product. But Meta is better than Facebook if you’re looking to draw in Millennial and Gen Z tastemakers, so that’s progress.
It turns out AR smart glasses will start with audio-enabled AI as the path to immersive computing
We’re still a ways off from the ultimate dream of fully AR-capable smart glasses that can be seamlessly hidden within the frames of a pair of Wayfarers, but up until now, the aforementioned features have been more than enough to make Ray-Ban’s device a sticky part of my mobile gear. The only thing missing was a virtual assistant that was better than Amazon’s Alexa. Pairing the photo, video, and audio features with a truly capable virtual assistant could make the device almost irresistible to all but the most dedicated Facebook/Meta haters.
Last month, Meta gave us exactly that. The new Ray-Ban Meta smart glasses come equipped with the company’s artificial intelligence (AI) assistant, Meta AI. On Oct. 17, the same day the updated device went on sale in Ray-Ban brick-and-mortar stores, Mark Zuckerberg demonstrated the feature. In a personal moment with his child, he showed how the voice-activated Meta AI could walk a user, step by step, through something as simple but meaningful as braiding their child’s hair.
The AI functions in the Ray-Ban Meta smart glasses are, for now, limited, but the potential moving forward is apparent. In my view, this single on-device shift represents a new kind of “augmented reality,” wherein your smart glasses help you navigate the real world via an ever-present, invisible assistant that puts nearly all available information at your disposal, accessed with just a verbal command.
AI is the killer app that will take smart glasses from curiosity to mainstream must-have
The question isn’t whether this is the next phase of mobile computing; the only question now is whether this hardware implementation (glasses) will be the preferred choice of mainstream consumers. Currently, several startups, including Rewind (with its Pendant), Humane, and Tab, are focused on having users either wear a device around their neck or pin one to their chest to interact with AI assistants. But this approach faces an immediate roadblock to adoption: pendants and neck medallions are not a mainstream mobile device trend (yet).
Conversely, wearing shades, or even non-prescription eyeglass frames, is something nearly everyone does or has done. According to 2016 figures from Statista, 87% of consumers aged 18 to 34 wear sunglasses, a number that rises to 90% in the 35 to 44 age range. Additionally, roughly 64% of U.S. consumers wear or have worn prescription eyeglasses. Eyeglasses, prescription or not, are a massive wearable technology opportunity yet to be fully leveraged.
The Rewind and Tab AI assistant devices don’t feature cameras, so they are limited compared to smart glasses that allow video and photo capture. And while Humane’s device reportedly has a camera, its full functionality remains unclear. Even if that camera turns out to be broadly controllable, there’s still the matter of figuring out where to clip the device onto the wearer’s body for a proper photo/video angle. There’s no such design hurdle with smart glasses, which naturally point toward whatever you’re looking at.
Ergonomics aside, let’s put all this another way: if Ray-Ban had partnered with Apple instead of Facebook/Meta, Ray-Ban smart glasses would already be nearly ubiquitous around the world. The branding issue simply means it has taken Meta a little longer to start making headway in the market. The obvious ease of use and familiarity of glasses make smart glasses, for now, the clear vector for mobile AI assistants beyond the smartphone.
Of course, when Apple finally gets around to launching its own Hermès and Gucci smart glasses partnership, Ray-Ban and Meta will have a tough fight. The good news for Meta is that Apple has been so wrapped up in its VR/AR and iPhone pursuits that it appears to be, for the first time in a long time, a bit behind the innovation curve when it comes to AI.
That won’t last for long. In the meantime, Meta has the opportunity to shed the creaky, lingering image of its Facebook past and potentially become the face of mobile AI. The copycats are coming; now it’s on Zuckerberg to make sure they have trouble catching up.