3D Fashion Deep Dive 07: Rethinking Virtual Try-On
Why Real-Time Fashion Is Bigger Than the Fitting Room
I hope you all had a wonderful holiday, but it is time to get back into it!
3D Fashion Deep Dive…Let’s gooooooo!!!!
We started with the Start Here guide to frame why 3D fashion matters right now. Then we went historical, looking at how the industry moved from pencils and physical samples into digital patterns and simulation. From there, we broke down the modern software stack, talked to people doing this work day to day, dug into draping and cloth simulation, explored where AI actually fits, and most recently looked at how personalization is turning fashion into a systems problem instead of a static design problem.
All of that groundwork leads us here.
Because once you understand the tools, workflows, and constraints fashion teams work under, and combine that with what you know from other industries, one idea always seems to pop up.
Virtual try-on.
Why Virtual Try-On Felt Like the Holy Grail
When I first started digging into 3D fashion, virtual try-on felt like the obvious endgame.
Brands were already designing garments in 3D using tools like CLO and Browzwear. I was talking to companies where reducing returns was a massive priority. Customer satisfaction, fit confidence, fewer refunds…all of it tied directly to better representation of products online.
So the logic felt airtight.
If garments already exist as accurate 3D assets, why not put them directly onto the customer? Let people see the clothing on themselves instead of on a model. Replace static photos with something interactive and personal. That felt like the breakthrough.
We’d already seen hints of this working elsewhere. Eyewear had made real progress. Snapchat filters showed that people were comfortable seeing digital objects layered onto their bodies. Even lawyers were turning themselves into cats!
It felt like a classic “only a matter of time” problem. And that was my mistake.
A Shift in Perspective
After spending more time inside real fashion workflows and getting much deeper into real-time rendering, VR systems, and what it actually takes to make interactive 3D work at scale, I’ve changed my mind.
I don’t think virtual try-on is the end-all, be-all application of 3D in fashion. At least not in the way most people imagine it.
That doesn’t mean it’s useless. It doesn’t mean brands should abandon it. But it does mean that a lot of expectations around virtual try-on are misaligned with both the technology and the business realities of fashion.
And importantly, the biggest limitations aren’t just about waiting for faster GPUs or better shaders. Some of the friction is structural.
Why I’ve Rethought Virtual Try-On
The first reason is simply the reality of real-time performance.
Virtual try-on almost always lives on a mobile device. A phone. Maybe a tablet. Occasionally a headset. And while those devices are getting better every year, they are still incredibly constrained compared to offline rendering pipelines.
Fashion garments are some of the hardest things to render convincingly. You’re dealing with layered fabrics, translucency, subsurface scattering, complex folds, micro-wrinkles, stitching, and materials that behave very differently depending on lighting and motion. You can do complex 3D on a phone, but doing it well, accurately, and in real time is a very different problem.
The gap between “this looks okay” and “I trust this representation enough to make a purchase decision” is still massive.
I spent time looking at high-end systems too. Even something like Apple Vision Pro, which is about as powerful as portable real-time hardware gets right now, struggles to deliver truly photoreal cloth behavior in a way that feels effortless and immediate. And if that level of hardware can’t quite get us there, it’s hard to imagine mainstream consumers doing it on a phone without compromise.
The second reason is more subtle, but arguably more important: returns aren’t purely a visualization problem.
Someone in the fashion space put this very bluntly to me. As long as free returns exist (and especially with companies like Amazon setting consumer expectations), there’s a hard ceiling on how much return rates can be reduced. A certain percentage of shoppers will always overbuy, knowing they can send things back. No amount of realism fully solves that behavior.
Virtual try-on can help. It can reduce uncertainty. But it doesn’t magically erase the economics of modern e-commerce.
And then there’s the biggest one, the one we don’t talk about enough.
Virtual try-on doesn’t solve the tactile problem.
You can show how a garment fits. You can show how it moves. You can show how it looks under different lighting conditions. But you cannot show how it feels. Fabric hand, softness, stiffness, weight, stretch…those are incredibly hard to communicate visually, even with perfect simulation.
I can tell you all day that a heather gray t-shirt is softer than a pure white one. I can show you photos. I can describe it in words. But until you actually touch it, you don’t know it. Even though we all know heather gray t-shirts are the softest!
Fashion is deeply physical in a way that 3D still struggles to replace.
Taken together, these issues forced me to rethink what virtual try-on is actually good at and what it isn’t.
Where Virtual Try-On Does Make Sense
This is where the conversation gets more nuanced.
I don’t think virtual try-on disappears. I think it finds more specific, constrained roles.
Eyewear makes sense. Accessories make sense. Certain footwear use cases make sense. Styling and outfit assembly make sense, especially when the goal is inspiration rather than precision.
I’ve also seen interesting approaches where virtual try-on isn’t about perfect realism, but about helping someone assemble a look. “I have a wedding. I have a budget. Build me an outfit.” That’s less about replacing a fitting room and more about guiding decision-making.
Those use cases are valuable. They just aren’t the silver bullet we once thought they were.
Re-Framing Real-Time in Fashion
Here’s the realization I landed on after all of this:
The problem isn’t real-time.
The problem is that we’ve been overly fixated on virtual try-on as the only real-time outcome that matters.
When we talk about real-time in fashion, we almost always jump straight to the customer-facing mirror moment. Me, my phone, a garment overlaid on my body. Does it fit? Does it look right? Will I buy it?
But real-time fashion experiences don’t have to be limited to that use case at all.
In fact, some of the most compelling real-time applications I’ve seen have nothing to do with replacing the fitting room. They’re about interaction, presence, identity, and expression. They’re about garments as digital objects that can exist, move, and respond inside real-time environments.
Once you let go of the idea that real-time must equal virtual try-on, the space opens up dramatically.
Where This Starts to Get Really Interesting
This is where my thinking has shifted the most.
If you look outside of traditional fashion tech and start looking at adjacent industries, there’s one place where real-time digital identity is already deeply understood, widely adopted, and emotionally resonant:
Games.
In gaming, people already care deeply about digital clothing. They spend real money on skins, outfits, accessories, and cosmetics. They use clothing to signal status, taste, affiliation, and personality. They accept stylization. They embrace performance. And they interact with garments in real-time environments constantly.
That overlap between fashion and gaming is not theoretical. It's already happening, and we will dive into that in next week's Deep Dive!
The 3D Artist Community
We are pumped to have Javier Perez joining us this week for an AMA!
Javier Pérez is a Lead Material Artist at PlayStation Studios Visual Arts, where he spends his days figuring out how to make game worlds look believable up close and still hold together at scale. He’s been working in games for well over a decade, with a deep focus on materials, surfaces, and procedural workflows, especially in Substance Designer.
If you’ve ever wondered how AAA teams actually build materials that survive production constraints, engine limits, and art direction changes, this is very much his lane. Beyond studio work, Javier has a long track record of teaching, sharing breakdowns, and helping other artists level up their material workflows.
For the 3DAC AMA, he brings a very grounded, no-BS perspective on what really works in production, what doesn’t, and how to think about materials as systems rather than one-off assets.
3D Merch is here and we have a new hoodie!
3D Tutorial
3D Job Spreadsheet
Link to Google Doc With A TON of Jobs in Animation (not operated by me)
Hello! Michael Tanzillo here. I am the Head of Technical Artists with the Substance 3D team at Adobe. Previously, I was a Senior Artist on animated films at Blue Sky Studios/Disney with credits including three Ice Age movies, two Rios, Peanuts, Ferdinand, Spies in Disguise, and Epic.
In addition to my work as an artist, I am the Co-Author of the book Lighting for Animation: The Visual Art of Storytelling and the Co-Founder of The Academy of Animated Art, an online school that has helped hundreds of artists around the world begin careers in Animation, Visual Effects, and Digital Imaging. I also created The 3D Artist Community on Skool and this newsletter.
www.michaeltanzillo.com
Free 3D Tutorials on the Michael Tanzillo YouTube Channel
Thanks for reading The 3D Artist! Subscribe for free to receive new posts and support my work. All views and opinions are my own!