
Meta’s smart glasses are doling out style advice

Vogue Business got early access to Meta’s updated AI-enabled smart glasses, which can react to what users see — including selfies.
Photos: Maghan McDowell, using the Ray-Ban Meta glasses


Would you trust Meta AI to style your outfit? As the parent company of Facebook, Instagram, Messenger and WhatsApp goes all in on generative artificial intelligence, it’s hoping that fashion advice will be one reason for wearing its AI assistant on your face.

Meta launched its latest generation of smart glasses, made in partnership with eyewear brand Ray-Ban, in September. The Ray-Ban Meta smart glasses connect to a smartphone app, and can answer queries using Meta AI, similar to how people interact with Amazon’s Alexa, Google Assistant and Apple’s Siri. The glasses can also capture images and videos, live stream to Instagram and act as a Bluetooth speaker for music and phone calls.

Starting today, for people in the US and UK, the glasses will also be able to ‘see’ and analyse what their cameras capture, including what you wear. Called ‘Meta AI with Vision’, the feature means that people wearing the glasses can look at a piece of clothing and ask Meta what goes with it, or look in the mirror and ask for feedback on their outfit. (The functionality is a multimodal version of Meta AI, meaning it can understand both images and text.)

In a blog post announcing the updates, Meta said it would offer new frame shapes and colour variations, still in partnership with Ray-Ban parent company EssilorLuxottica.

Fashion and shopping are also seen as ripe for mixed reality and smart eyewear use cases more broadly, with Gucci and Mytheresa among the first to create immersive experiences for the Apple Vision Pro. Meta’s glasses can only provide audio content for now, with the potential to support augmented reality and on-glass AI in the future. But can they give good style advice? I decided to find out.

Testing the tech

It’s unclear where Meta AI gets its style advice, and the company didn’t respond to multiple requests for information on its training data. In autumn, the company said that, broadly, Meta AI was trained at least in part on public posts across Instagram and Facebook. When queried via Meta AI’s new desktop site, the assistant said it was trained on sources such as web pages, online forums, product reviews and Wikipedia.

In 2019, Facebook (as it was called at the time) shared details of a project it had developed with university researchers to use AI to recommend how to make outfits more stylish. Called Fashion++, it could read outfit images and then make suggestions. Researchers trained the model on 10,000 user-submitted images from Chictopia — a now-defunct outfit-posting site founded in 2008 — and then asked people to rate its advice. Meta AI currently leans more heavily on describing what it sees, as opposed to offering in-depth styling advice.

The Ray-Ban Meta glasses come with a leather charging case, and the frames themselves house speakers and two cameras.

Photo: Meta

The first time I tried it, I was wearing a blue blazer, a black T-shirt and denim shorts. I asked Meta, “What goes with this blazer?” It answered, “The blazer looks great with the pair of distressed denim shorts and a white graphic tee for a casual, stylish look.”

The response was pretty simple, and didn’t offer much in the way of extra style advice; instead, it seemed to lightly validate what I was already wearing. But the fact that I could put on a pair of glasses, look in the mirror, and have the AI ‘see’ what I was wearing and offer any feedback at all felt like a powerful moment. After years of reporting on, and considering, how smart glasses could be embedded into daily life, it felt like an important milestone.

For more than a decade, the tech world has salivated over the concept of smart, ‘normal-looking’ eyewear that can layer computing power onto the world around you. While Google Glass and Snapchat’s Spectacles were among the most prominent previous candidates, this felt like the most practical, useful and impressive iteration yet. It’s ‘just’ computer vision, the same tech that Pinterest uses to surface similar images, but the execution, and the fact that it was a stylish-looking device I was wearing on my face, made for a powerful first try.

I think that’s what stands out with these glasses. Even without the (likely imminent) augmented reality features, they offer practical uses that are particularly compelling for fashion. Already, I have enjoyed wearing the Ray-Ban Meta smart glasses to fashion week, where I can capture content in a more hands-free, less intrusive way than watching everything through a smartphone. While I’m used to covering and testing technologies during fashion weeks, such as NFTs, digital fashion, live streaming and social commerce, this is the first tech that the wider fashion community seems to actually lean into and ask to learn more about.

I’ve also become more likely to capture and post candid, personal content using the glasses, such as unboxing videos, because I can use both hands. I’m curious whether influencers in particular will pick up on this.

The subjective nature of style

After the first test, I was curious to push the style advice a bit further. While attending a work conference wearing a pleated skirt, a striped top and a blazer, I asked Meta, “What goes with this outfit?” It responded, “The outfit appears to be a stylish and professional ensemble, well suited for a business or formal setting. It includes a black blazer with white stripes, a matching striped shirt, and a pink skirt with a floral pattern. A pair of heeled ankle boots complements the overall look.” Not perfect, but not bad. Since my shoes weren’t showing in the image, I could see why it suggested footwear.

I noticed that my surroundings seemed to play a role in how Meta AI responded. Back at home, I wore something as simple as possible (a white T-shirt and jeans) to see if Meta could push me towards more stylish recommendations. When asked what went with that, Meta responded, “The white T-shirt and black framed mirror complement the outfit nicely.” I hadn’t exactly planned to wear the mirror for running errands.

Photos: Maghan McDowell, using the Ray-Ban Meta glasses

Similarly, when I wore a black tank and black jeans, and asked it what to wear, it said the all-black outfit looked great with the mirror’s black frame and the white walls, adding “a touch of elegance to the overall interior design”. In the Meta View app (the companion app for the glasses), I rated the response with a thumbs down, and when prompted, selected further feedback that said the advice wasn’t accurate.

Because Meta didn’t suggest anything to wear with the outfit when I asked an open-ended question, I got more specific by asking what jacket to wear. I also put on a pair of boots. “The outfit looks great on its own,” it said, “but if you’d like to add a jacket, consider a lightweight white or cream-coloured denim jacket to complement the overall look.” I added a cream blazer and asked, “Tell me if this outfit is better.” It replied: “The outfit looks great on you! The blazer adds a touch of sophistication while the ripped jeans give it a casual, effortless vibe.” Nice affirmation, but was it just telling me what I wanted to hear?

I turned up the heat, deliberately wearing pieces that didn’t mix, even on the most maximalist of days: a pink toile bomber jacket, a long navy floral skirt, a black tank and lilac slides. The Ray-Ban Metas said: “The outfit looks great together! The colourful floral skirt pairs well with the neutral-toned blazer and shoes.” Interesting. Maybe I wasn’t up close enough? I tried again. No dice; Meta still thought the clearly clashing look was “cohesive”.

Photos: Maghan McDowell, using the Ray-Ban Meta glasses

Next challenge: suitability for various occasions. Wearing the tee and jeans, I asked Meta if the outfit was good for work. “The outfit appears to be casual and may not be suitable for work, depending on the dress code of your workplace. It’s important to consider the industry, role and work environment when deciding what to wear to work.” This got a thumbs up.

Wearing a diaphanous cocktail dress, I asked if it was appropriate for a ski trip. Meta said that the lightweight, thin material didn’t seem suitable. What about for the Met Gala? “The outfit appears to be stylish and well put together, but it may not be the most appropriate choice for the Met Gala, which is known for its formal and extravagant fashion. Consider the theme of the event and whether your outfit aligns with it. Additionally, consider the dress code specified by the invitation or host, if any. Ultimately, choose an outfit that makes you feel confident and comfortable while respecting the occasion’s formality.” Apparently, the annual conversation around adherence to the Met Gala theme (or not) had made it into the model.

I was curious how it would handle shopping and brand-related queries, given they are the natural next step. Similar to what Pinterest has already done, it would make sense for Meta to ultimately sell ads that provide relevant, shoppable recommendations. Wearing the same dress, I asked Meta if I should buy it. Meta responded, “The dress looks great on you! The print and fit are flattering, and the colour is lovely. If it’s within your budget and makes you feel confident and comfortable, I think you should consider purchasing it.” I appreciated the nuance, but I didn’t have full confidence in the tool’s assessment of style or fit. I asked who designed the dress, which was from Saloni; Meta said it was an “unknown fashion designer”. (In all fairness, it wasn’t a current-season item.)

Although the humans I normally talk to at Meta were mum about future plans for the glasses, my Meta virtual assistant gave me a valuable clue: “Hey Meta, look and tell me where I can buy a dress like this,” I said. “I can’t help with requests about product pricing or availability yet,” it responded. “But I’ll be able to soon.” Noted. Thumbs up.

Update: This article has been updated to include details on the timing of the multimodal AI update. 24 April 2024

