What’s the future of retail shopping? Snap bets on virtual try-on tech.
Apr 21, 2023

The company that gave us the Dog Ears and Rainbow Vomit filters is selling augmented-reality technology to businesses, enabling users to virtually try on clothes, makeup and jewelry.

Snapchat made its name with silly augmented-reality filters, or lenses, as it calls them. In recent years, it has expanded into shopping, enabling users to try on clothing, jewelry and makeup in the app.

The company, now called Snap, has begun selling this technology to other businesses. Snap announced this week that it’s pushing AR tools into the real world, bringing AR mirrors to some Men’s Wearhouse and Nike stores in the U.S.

Carolina Arguelles Navas (Courtesy Snap)

Marketplace’s Meghan McCarty Carino went to the company’s headquarters in Santa Monica, California, to try the tech out. There, Carolina Arguelles Navas, Snap’s head of AR product strategy, told her that try-on technology has really caught on with users.

Carolina Arguelles Navas: The idea of virtually trying on sunglasses and thinking, “Oh yeah, that makes sense. If I could try it before I buy it, all from home, it makes sense.” We actually already have over 250 million people [who] have virtually tried on products 5 billion times on Snap in the past year. This isn’t the future of shopping; technically it’s not even the present, because that was the past. And so this is really about: How does the industry evolve to really start to incorporate this type of technology, because it’s helpful for people?

Meghan McCarty Carino: There are various systems that different retailers are using for virtual try-on. I’ve seen that Walmart has its own version, where you can create your own avatar or pick from a lineup of different models. There are also mirrors enabled with different kinds of visualizations. What makes augmented reality, and the mobile context in particular, beneficial for trying things on?

Arguelles Navas: I think, at its most basic, everyone has a smartphone in the palm of their hand. So being able to tap into a device you’re already using is hugely important. Being able to access these types of technologies and experiences with something you already have, and even on the platforms or applications you already have downloaded to your phone, is also very helpful, because shopping is more than just, “Let me get the confidence of ‘Do I like how it suits me?’” It’s also, “Let me ask my friends for their advice.” Having these types of experiences to virtually try on products, and being able to very easily share that with friends, lets you ask for that advice and make that decision. But there have been a lot of challenges for people, and retailers in particular, in actually incorporating this technology. On one side, it’s because they oftentimes don’t have the existing assets or the skills to know how to build these types of experiences. And that’s where it’s really important that we aren’t just building technology and a vision of the future, but that we’re actually investing in an entire industry to support the adoption of this.

McCarty Carino: When it comes to try-on tech, there are kind of two difficult variables. There’s the body, the face, the complexion, all of the complexities of the human that’s trying it on. And then there are all the dimensions of the garment. I mean, can it approximate the actual fit of a garment?

Arguelles Navas: For apparel, we’ve been taking many different approaches to make sure that it’s an accurate reflection. We actually handle that with a separate solution, which we call Fit Finder. And what this does is it takes information about the garment and its exact dimensions at every single size. And it also combines that with information that you have provided, such as your height, weight, your preference — I like a loose style, I like a fitted style — because this is also …

McCarty Carino: And I imagine the fabric type also affects those things. There’s a whole lot of variables.

Arguelles Navas: Exactly. It’s taking all of that into account, and then it is going to give you a size recommendation. How does it do this? Well, it actually asks: For all the people we have information about who have a similar body-weight distribution and a similar fit preference, what have they purchased? More importantly, what did they return? And based on that information, what size should you buy? And so today, what we’re doing is we’re really using the camera to visualize: What does the item look like? Do I like the style of the item? And then we kind of add this fit and AI solution to say, OK, now what size should I buy? Now more and more of these things are coming together into the same experience, but for clothing in particular right now, we really want to make sure it’s very accurate. And so we’re using both technologies to really bring that together.
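Snap hasn’t published how Fit Finder works under the hood, but the logic Arguelles Navas describes (match a shopper against people with a similar body profile and fit preference, then weigh what those people kept against what they returned) can be sketched in a few lines. Everything below, including the Shopper and Purchase types and the similarity cutoffs, is a hypothetical illustration of that idea, not Snap’s code.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Shopper:
    """Hypothetical shopper profile: the inputs Arguelles Navas mentions."""
    height_cm: float
    weight_kg: float
    fit_pref: str  # e.g. "fitted" or "loose"


@dataclass
class Purchase:
    """One past purchase of this garment, including whether it came back."""
    shopper: Shopper
    size: str
    returned: bool


def is_similar(a: Shopper, b: Shopper, height_tol=5.0, weight_tol=5.0) -> bool:
    """Crude similarity test: close height and weight, same fit preference."""
    return (
        abs(a.height_cm - b.height_cm) <= height_tol
        and abs(a.weight_kg - b.weight_kg) <= weight_tol
        and a.fit_pref == b.fit_pref
    )


def recommend_size(user: Shopper, history: list[Purchase]) -> str | None:
    """Recommend the size that similar shoppers most often bought and kept."""
    kept, returned = Counter(), Counter()
    for p in history:
        if is_similar(user, p.shopper):
            (returned if p.returned else kept)[p.size] += 1
    if not kept:
        return None  # no comparable shoppers; fall back to a size chart
    # Returns carry the strongest signal about what *not* to buy,
    # so score each size by keeps minus returns.
    return max(kept, key=lambda size: kept[size] - returned[size])
```

A production system would use far richer signals and a learned model, but the kept-versus-returned comparison she describes is the core of the recommendation.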

Arguelles Navas guided me to a room that looked sort of like a photo booth. The entrance was in the shape of the Snapchat ghost logo. It was pretty bright in there, and there was a big, vertical screen, sort of like a giant phone, where you can play with goofy Snapchat filters, like the viral “shook” filter that gives you big, wild eyes or one that puts you in a giant, puffy jacket like the one in that AI pope picture.

It also lets you virtually try on real clothes and accessories from different brands. As Arguelles Navas cycled through these filters, she put some Zenni eyewear on my face. Then she virtually hung a Prada bag on my arm and changed its colors on the AR mirror. Unfortunately, that Prada bag remained virtual for me. I didn’t actually buy it.

All this technology is pretty cool to play with as a user. And it’s increasingly available outside the Snapchat app through the company’s AR Lens Studio, which brings the AR experience to other businesses.

Brian Cavanaugh is the director of project management and business development at Fishermen Labs, a partner that builds AR experiences with Snap’s Lens Studio. There, Cavanaugh helps develop virtual try-on features for brands like Louis Vuitton and Signet Jewelers.

He said retailers want in on this technology, but they often don’t know where to start. That’s where he comes in.

Brian Cavanaugh (Courtesy Fishermen Labs)

Brian Cavanaugh: We work with the client team. We figure out what their goals are, figuring out what products they want to feature. And then we’re off to the races. And from there, we’re able to leverage some of the technology that’s built into the Lens Studio platform that makes it extremely easy to do. There’s a lot of complexities that can be built into it and customizations that can be done. But out of the box, Lens Studio and Snap really provide templates and frameworks to build on top of that, make all of the trial aspects that the brand wants to target extremely easy to do. So we’re just kind of a plug-and-play at that point. So for us, we’re kind of an end-to-end solution from strategy to actual design and development of that, all the way through to when that user gets to see it live in their hands through the camera.

Cavanaugh added that as AR technology continues to improve, he expects to see this being used outside the entertainment and e-commerce spaces.

Cavanaugh: Now that we can recognize surfaces, vertical surfaces, rooms, environments, objects, all of those different things, and they’re easy to plug into, especially using LIDAR [light detection and ranging] technology, you can get real-time measurements of some of these spaces. And, again, you can drop or apply anything that you want. So you can do, like, floor segmentation to see: I’m doing a kitchen renovation, figure out what we’re going to switch the floor to. What color am I going to paint the walls? And you can combine all that into a very real tool to figure out what action I want to take next as a consumer. With AR, you’re not just actually watching a story unfold. Now you get to be a part of the story. And to me, it’s the fact that now I can immerse myself into this world that I have never been able to see or even fathom before. And now I get to live this story. That’s really cool. And so I think, again, combining all of these new technologies to get bigger, faster and more readily available, we’re going to come up with some really cool experiences.
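Cavanaugh’s point about getting real-time measurements from LiDAR comes down to basic camera geometry: a depth map plus the camera’s intrinsics lets you turn two tapped pixels into 3D points and measure the distance between them. The sketch below assumes a hypothetical depth map and pinhole intrinsics (fx, fy, cx, cy) and isn’t tied to any particular AR framework.

```python
import numpy as np


def backproject(u: int, v: int, depth_m: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Turn pixel (u, v) with a known depth into a 3D point in camera space,
    using the standard pinhole camera model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])


def measure_between_taps(depth_map: np.ndarray,
                         p1: tuple[int, int], p2: tuple[int, int],
                         fx: float, fy: float, cx: float, cy: float) -> float:
    """Metric distance between two tapped pixels, e.g. two corners of a wall
    you want to paint or a stretch of floor you want to retile."""
    a = backproject(*p1, depth_map[p1[1], p1[0]], fx, fy, cx, cy)
    b = backproject(*p2, depth_map[p2[1], p2[0]], fx, fy, cx, cy)
    return float(np.linalg.norm(a - b))
```

Floor or wall segmentation, the other piece Cavanaugh mentions, then decides which pixels a new flooring texture or paint color gets drawn onto.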

But the augmented reality tech behind all this comes with privacy concerns. To make an accurate digital overlay, these tools often scan users’ appearance: head shape, facial features, things we can’t really change if there’s a security breach. So I asked Arguelles Navas what happens to that data.

Arguelles Navas: Augmented reality, and using the camera, doesn’t have to actually be something that’s storing biometric data that makes you personally identifiable. And that’s the approach that we’ve taken, which is to make sure that, yes, the camera can in the instant track that’s a face and eyes, but you cannot track “that is that person’s face.” None of that information is being stored, and it can’t be used for identifiable information. And so this becomes a topic where, especially if you’re a company who wants to start to invest in this space or you’re looking for a vendor, you really do need to ask questions, because depending on the vendor or the solution, there might be different ways that they’re tracking this information that might not be as privacy safe as it should be. But for us at Snap, it’s a priority because we agree it’s really important. And that obfuscation of identifiable information is really important to protect the future and the safety of how this technology is used.

McCarty Carino: So it’s being stored locally and not going into a database somewhere?

Arguelles Navas: Nothing’s being stored. The camera can identify that’s just a face in real time to place an item, but nothing’s being stored in terms of “it’s this face” or “this is the image of the face” or “I’m capturing an image of what’s on screen.” All we can come back and say is that this AR experience went live and a face was identified within it, because it was activated. But no content was actually captured, stored and then used for any of that identifiable information.
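The distinction Arguelles Navas draws, using face geometry in the moment to place an item while persisting nothing identifiable, can be made concrete with a small sketch. The landmark detector, renderer and event logger here are hypothetical stand-ins, not Snap’s actual pipeline.

```python
from typing import Callable, Optional


def process_try_on_frame(frame,
                         detect_landmarks: Callable[[object], Optional[list]],
                         render_overlay: Callable[[object, list], None],
                         log_event: Callable[[dict], None]) -> None:
    """Handle one camera frame for a try-on lens.

    The face landmarks exist only in local variables for the duration of this
    call; the only thing reported anywhere is an anonymous "the lens ran and a
    face was detected" event, mirroring the approach described above.
    """
    landmarks = detect_landmarks(frame)   # in-memory only, recomputed per frame
    if landmarks is None:
        return
    render_overlay(frame, landmarks)      # draw the glasses, bag or jewelry
    log_event({"event": "lens_activated", "face_detected": True})
    # Deliberately NOT logged: the frame, the landmark coordinates, or any
    # identifier that would tie this activation to a specific person's face.
```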

McCarty Carino: So beyond try-on tech, where do you see AR going in the future?

Arguelles Navas: I mean, we really believe AR is going to disrupt and improve almost every major industry. You have this integration of AR in areas like entertainment and live events. We also see an incredible opportunity within the fitness space. You have leading companies like Nike, which does a lot more than sell products. They really have a mission about movement and getting the world moving. And they’re using these AR experiences to actually help people move in a really fun way. It’s almost like a game, but it’s an active game. It has you stand up in order to play. You have to jump. You have to do certain arm movements or exercises. It really, truly is that expansive in what it can do.

More on this

Here is more on Snap’s latest plans for AR and AI.

This week, Snapchat’s artificial intelligence chatbot feature, known as My AI, became available to all users. At the company’s annual Partner Summit, Snap said the chatbot can suggest filters that would work best for users’ images, recommend destinations on maps or generate images from prompts.

And here’s more on the data privacy issues we talked about with Arguelles Navas. Last year, Snap faced a class-action lawsuit in Illinois, the only state whose biometric privacy law lets individuals sue companies directly. The company agreed to settle for $35 million, though it maintains it didn’t violate the law. Snap also added in-app consent notices for Illinois users. Last year on the show, we talked about that Illinois law, called the Biometric Information Privacy Act, as well as a lawsuit brought by truck drivers who said their fingerprints were collected in the workplace without their consent.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer