LOOPED IN with Carl Warkentin

The Future of Product Creation with Mode Maison Founder & CEO Steven Gay

Carl Warkentin Season 1 Episode 18


Imagine typing, “Warmest low‑profile jacket for Siberia, size M, modern look,” and seconds later seeing a faithful simulation on your avatar—plus the exact factory files needed to make it. That’s the leap from pretty pixels to physics‑native product creation we dig into with Steven, founder and CEO of Mode Maison.

We trace Steven’s path from Ralph Lauren’s Purple Label to building TMAC, a material scanning system that captures not only how fabrics look but how they behave—torsion, tension, compression, and thermal performance. Those measurements fuel a physical world model that knows where a sneaker will bend, how a linen will drape, and when a suede will fail, producing simulations you can trust and manufacturing files you can ship. The result: fewer samples, faster cycles, and designs that move from intent to production without the usual guesswork.

This shift unlocks on‑demand, onshore microfactories and a new kind of commerce where the minimum order quantity is one. We talk hyper‑personalization at scale—garments tailored to your climate, taste, and movement—while cutting overproduction, warehousing, and global shipping. We also explore circularity with digital product passports and embedded IDs, making resale, repair, and recycling smarter and simpler. Along the way, we weigh real challenges: today’s siloed toolchains, the risk of brand sameness in “ghost manufacturing,” and how brands can stay distinct by treating AI as a physics‑aware co‑designer rather than a pixel pusher.

Looking ahead, Steven outlines a five‑year horizon where brands specify outcomes in plain language and receive validated designs plus tech packs in moments, and a ten‑year horizon where brands act as creative directors while systems generate products within their guardrails. If you care about sustainable fashion, advanced manufacturing, or AI that grasps the real world, this conversation connects the dots from factory floor data to build‑ready files. Enjoy the episode, then share it with a friend, subscribe, and leave a review telling us the first product you’d generate.

Contact Us

This is interactive content: send us your questions for the guests, and we'll record another session focused just on your questions!

Do you have suggestions for new guests, or want to sponsor the show?

Thanks for listening and keep podcasting!

Framing The Vision Of Physical AI

SPEAKER_01

Steven, welcome to my show. You're the founder and CEO of Mode Maison. I'm glad to have you here.

SPEAKER_03

Yeah, thanks so much for having me, Carl. I'm excited to be here.

SPEAKER_01

When one goes to your website, it's still very incognito, in stealth mode, if you will. So it's very mystical and very exciting. We had some calls before. Let me try to describe what you're doing, and then I'll let you take it from there and help us understand what you're doing and how you're changing the world for the better with this. So my understanding is that Mode Maison builds a digital infrastructure that turns physical products into physically accurate, reusable digital assets so that AI can design, simulate, and generate manufacturable products. So basically, it's a layer deeper than a digital twin and one layer before manufacturing automation.

Beyond Digital Twins To Manufacturable Assets

SPEAKER_03

That's great. Our view of the future is pretty simple: we think the future of product creation will be digital and will leverage AI, full stop. And so, in order to realize this future vision, we take a physics-based approach to digital product creation. What I mean by that is, today you might have a digital twin of a sofa, for instance, but it really isn't connected to any of the manufacturing processes, which, when we talk to brands and manufacturers, is really where a lot of the value creation is for them. It's one thing to show an image of a product. It's another thing to be able to leverage AI to actually manufacture that product at the end of the day. So if I wanted to create a sneaker, for instance, and I'm Nike, it's one thing that I can integrate my sneaker into images now with Nano Banana and all these other really cool tools. But these AI models today stop at just mashing 2D pixels together. What we're doing is a layer deeper. We're integrating physics to create these physical world models, where in effect you could think of it like ChatGPT. I could be Nike, and I could say, hey, I want to create an ultra marathon running sneaker where somebody can run on average over 50 miles. That's definitely not for me, but it's for somebody. And essentially, the model, what we call A La Mode, will be able to generate and create that shoe: both the simulated version, a physically correct simulation, and the manufacturing files. And that can cut the lead times required to actually create that product in the real world, and the iteration cycles, down quite substantially.

SPEAKER_01

There are a lot of digital product creation tools out there right now, like in fashion, obviously, CLO and Style3D, Browzwear, and you have Arcade AI. How do you differ?

Why AI Needs Physics And Data

SPEAKER_03

Yeah, so we're AI native, whereas those, in CLO or Browzwear's case, are really great companies with really great products, but they're driven more by classical algorithms. They're physics engines, really great at what they do. But when we're looking at being able to digitize the precise drape of a specific linen, or a material with a different structure, at least in our estimation it requires a more AI- or data-driven approach just to be able to generalize to broader categories. And so that's where we differ. We differ at the approach level, but also, our company basically revolves around data collection and creation, around physical data creation.

SPEAKER_01

Before we go into the details on how you do it and how you approach this, how did you get to start that company, and when?

SPEAKER_03

Yeah, so I come from a bit of a non-traditional background, because I was pulled out of school a little early to go become the youngest-ever concept designer for Ralph Lauren Purple Label. So I was designing the Purple Label concepts. Incredible experience. I loved it. But when I was there, I just started to see a lot of the inefficiencies. Within fashion, one collection can take 12 months to create, from ideation of the product all the way to final delivery. And I just thought that surely there's got to be a more efficient or optimal way to do this. And so I left Ralph Lauren to start Mode Maison. That was about six years ago. We started by creating scanning devices that we've since placed on factory floors to basically enable manufacturers and brands to create digital materials, physically accurate digital materials, at scale. And building on top of that, we actually use a lot of that data to train these physical world models.

SPEAKER_01

Okay, tell me a bit more about your initial product. Who was your typical customer, and how did it help them?

Material Scanning And TMAC Explained

Cutting Samples With True Simulation

SPEAKER_03

Yeah, so it's funny, because when we talk to really anybody, they always ask what industry we're in. And I'll say a lot of things, but it's really hard, because if you can digitize a sofa, you can really digitize anything. So if your product is made of materials, you're our customer, essentially. That first product was called the TMAC, the Total Material Appearance Capture system. And so we work with anybody from the largest blinds company in the world to the automotive space to the aerospace side of things. It really spans the gamut across a whole lot of industries. But what we do for them, the real value add at the moment, is that they can essentially plop these big devices on their factory floors, bring a textile, for instance, into the device, press the big green button, as I like to call it, and it automatically digitizes that material into what's called a PBR material, a physically based rendering material, which is then used to either render that product or design that product, whether it's in CLO3D, as you've mentioned, or elsewhere, like Unreal Engine. But again, our future state, and what we've always been keen on, is being able to leverage that data for something a bit bigger, just because the processes within the digital product creation pipeline at the moment are really not scalable. And I think brands are starting to realize that. These more traditional pipelines have been great for ideating, in some sense. But when it comes to actually scaling digital product creation across the entire org, where they can truly make a car inside of a computer, and then simulate that car, and then make an optimal car for whatever given scenario, or, let's say again, a sneaker?
That is still very physical. Right now, it requires a whole lot of iterations, a whole lot of samples, a whole lot of testing. But if we can make a true physical simulation, then you can cut out a lot of, I would actually say over 80% of, what I call the physical bulk internally within companies and brands.

SPEAKER_01

So let's take the t-shirt, right? I could design a t-shirt in CLO and see how it would feel depending on the fabric I use, et cetera, and the size, where it stretches, et cetera. I could already have that kind of visualization and testing.

Hyper‑Personalization And Generative Commerce

SPEAKER_03

Where is it different? Yeah, so it requires a lot of manual processes right now. Imagine, again, I just give it an image of a really cool 1920s Canadian Spireman jacket, and I say, hey, I love this image. I love the structure of the shoulders, I love everything about the jacket, but I'm 5'9", I'm a little bit short, and basically I want it to fit my body perfectly. And I also want it to be a little bit more of a modern take on that. Then you press enter, and voila, you have the simulated version of that actual garment, plus its tech pack, its actual manufacturing files. And then you can do all the iterations from there, based on what you want. Now, we don't think that in the next two, two and a half years it will be 100%, meaning there will still be some hand-holding by the designer or the design team. But going forward, in the next, I would say, four years, I think we can automate a lot of the actual designing process itself.

SPEAKER_01

And that would obviously lead to fewer iterations with samples, et cetera, because I do that entirely digitally.

SPEAKER_03

Yeah, I mean, if you walk it forward even a little bit more, you start to get into the question of: could we create a different model of commerce, what I call generative commerce, or generative manufacturing? So, Carl, if I know you like this certain shirt, or if I know your style and your aesthetic and your taste and your size and your previous purchase history, what if I could basically just generate a product for you? It's absolutely photoreal, it's all physically based, and then you either like it or you don't. Let's say you do like it; you press buy now. Well, then, it's never been made before. I just take that manufacturing file, because now I have it, and I send it down the road to a 3D print shop, essentially, or any local tailor or manufacturing shop. I think that's really where the world of commerce and retail is gonna go. It's gonna be hyper-personalized around individual tastes, needs, wants, functionalities. And again, when I was at Ralph Lauren, we were designing products based around: what's the best average product for the most people, right? I think that feature will go away. I don't think we need to design the best average product. I think we can design the absolute best product, the most optimal product, for each and every person based on their needs at that time.

Circularity And On‑Demand Microfactories

SPEAKER_01

So theoretically, you suggest to the user or consumer what they could or should wear, but the user could also design a product themselves very easily with the help of AI and start producing it.

SPEAKER_03

Yeah, definitely. And again, I don't want to get ahead of myself, because I think we're still quite a ways away from that. And even in that future, I still think brands play a huge part, because I want to buy Ralph Lauren because I like Ralph Lauren. I like everything they stand for, and I like their aesthetic. I just think brand is gonna play a different role in the future, where they're gonna set the parameters or the guidelines, if you will, for what their products look and feel like, the whole ethos behind their brand. And they're gonna be truly what they are. They're gonna be a brand. They're not gonna be a manufacturer at that point. But I'm still gonna buy Ralph Lauren. So I think brand will always be there. I'm just thinking that this future state is gonna be a lot more optimized and a lot more efficient and allow for better products at the end of the day, whether it's a sneaker, whether it's a garment, or really anything, honestly.

Feeding AI With Physical World Data

SPEAKER_01

So everybody who listens to this podcast knows that we're covering the entire value chain of the textile industry here, and we've talked a lot about recycling and sorting for recycling, recommerce, et cetera. For about a year now, I've been really excited about the opportunities that onshore, on-demand production brings: that you can have a global infrastructure of microfactories, and within hours, maximum one or two days, you can produce almost any kind of garment. And there are other factories for glasses, for footwear, et cetera. So let's assume you can produce a lot of garments in these microfactories all over the world. That would imply that you can offer hyper-customized products in terms of size, style, et cetera, to every single person, because the MOQs, the minimum order quantities, go down from hundreds and thousands somewhere in Asia to maybe one. And thinking that through, that means obviously the whole infrastructure of the fashion industry is gonna change, right? How do you see that?

Brands, Uniqueness, And The Risk Of Sameness

Linking Design Files To Automated Production

SPEAKER_03

I totally agree. I mean, I think a circular economy is better for everybody. The amount of waste that is generated... it's kind of a trope at this point that the fashion industry accounts for a lot of the waste. But it's true, honestly. So I think that we have to change the model, whether it's a more generative model, like I was talking about, or, even in the interim, more of a circular economy, where we're able to utilize textiles that maybe somebody else didn't need or want, and recycle them back into the flow of things. I still think that in that future state, you still need to be able to create the product in a digital space first. And so again, I think it does rely on physical AI. A lot of what I talk about right now is that there really isn't any physical data at the moment. And that's what our main mission is around: setting up these processes and these workflows to be able to create and collect that physical data. Because, as I guess everybody knows, we're kind of tapped out on web-scale data, which is text and pixels, essentially. But when it comes to actually integrating AI into our physical world and our physical needs, being products and garments and things of that nature, AI is just really unintelligent at the moment. It just really doesn't understand anything physical. And that's why you see it's a big challenge in the robotics space, but it's also a big challenge in just creating products.

SPEAKER_01

So you're saying we need to feed the AI with physical data, and that is basically what you're doing.

SPEAKER_03

We need to create a model that understands physics much better, because, again, how can it understand how to create a shirt if it doesn't understand the drape of that material, or that this one suede, for instance, you can't stick this thread through, because it would just fall apart? I think there are just so many things. I feel like we're maybe in the first inning of AI; we're just at the very, very infancy. Right now, we're just starting out. But I think where the real prize is, and again, I'm biased here, is in physical AI, when we can start tying AI to the physical world and our physical needs.

SPEAKER_01

Where does it lead in terms of the industry and the consumption?

SPEAKER_00

What are the consumer interests? You said they're still gonna buy brands, but now theoretically everybody can wear a totally unique garment. How does that work together?

From Intent To Manufacturing With Less Friction

SPEAKER_03

Yeah, I mean, I think it's better. I think it produces better products. I think it eliminates a lot of the waste. And you know, the death of retail, it's always been said, but the death of retail is just stock, things that are just left on the sale rack. And so you can eliminate a lot of that by producing things for each and every person individually. Again, I want to caution that I'm not saying that's gonna happen tomorrow, but as we start to integrate physical data more into these AI models, that is a true capability that will be realized, I think, pretty soon thereafter.

Five‑Year And Ten‑Year Product Creation Futures

SPEAKER_01

I'm trying to picture something with that opportunity of onshore, on-demand production, microfactories, minimum order quantities of one. What are the biggest benefits? Nearshoring and low MOQs mean no more overproduction, which I think is roughly around 30% that you can save, no more global shipping, no more warehousing needed, so you can build a brand without any inventory.

SPEAKER_03

It also means better products. That's a thing I always like to hammer home, because you can basically create a very intelligent model that understands the material level, so it understands the compression of a material, for instance, the torsion, the tension. It also understands performance metrics of these things, like the thermal insulation of a given fabric. It understands shape, and it understands how that shape deforms. And then you give it your requirements. Again, I always use the example of the ultra marathon runner, but functionality doesn't just have to mean comfort. It can also mean look. Look is a functionality. I want the prettiest XYZ. I want the most comfortable XYZ. So functionality can be around a few different categories; actually, more like infinite categories. I want the warmest winter jacket with the lowest profile. Now, a lot of these world models, like Genie, are only really trained on video data, which definitely helps a lot, but it's really surface-level data that all these models are training on. When you start to really feed a model with a lot of that structured, mechanical, manufacturing data, I think the sky's the limit. And I think it would be foolish for me to sit here and say I know exactly everything we can do with it. I'm very excited by what we can do with it. But just like people come up with the crazy things that ChatGPT can do every day, I think that's gonna happen tenfold or a hundredfold with physical AI.

Avatars, Fit, And Digital Product Passports

SPEAKER_01

Right. And you're making a good point. I'm thinking about this a lot lately, and I believe this will change how brands operate. I think we will have an increase of small brands, micro brands, influencer brands, creator brands, right? They can all now design their own clothes and sell them without any inventory, without any design knowledge, without any access or network in the industry when it comes to production, warehousing, shipping. You can literally sell your products without having any inventory. But ultimately, as a consumer, you can also design your own products and then have them produced. I like the comparison to Vistaprint, right? You can design your own business cards, maybe only 200 of them, and for $10 you print your own business cards somewhere. That would not have been possible many years ago. But now, if you have a centralized system and a design tool that allows you to do so and finds the right printer for you, then all of these things are possible. And this is where the fashion industry is going as well. And I already see a lot of startups going into that design space. Do you see that as a threat? Is that exactly what you're doing, or do you have the competitive advantage of having this real physical data of the products?

Who Buys: Brands, Mills, And Manufacturers

Traction, Team, And Fundraising Update

SPEAKER_03

So I see a lot of the startups in the design space as more or less pushing pixels, to put it bluntly. Again, they're doing it really intelligently and smartly. But ultimately, we're trying to solve the hardest problem, which is closing the loop on the physical production side of things. It's one thing to make an image of a pretty garment; it's another thing entirely to know, 100%, that you can actually make that in the physical world, and to also have the manufacturing files to be able to hand off to the manufacturer to make it. Right now, there's an absolute disconnect between what's capable from the smartest frontier models and where we need to be. And that's where we're playing: in closing that loop. So I actually see them as beneficial, and as partners. Because at the end of the day, they're gonna need and want to rely on the most accurate simulations possible. For instance, if you take a Nike sneaker, and we know exactly how that sneaker bends, with zero hallucinations, because we know the boundary cases for how much torsion you can put on that shoe, then we can integrate that into an image. We still need their imagery capabilities. So we're working on the simulation and manufacturing level, and they're really working more on the prettiness at the 2D pixel level. But one thing you just mentioned that I wanted to highlight: this doesn't come without potential pitfalls.
I look at this future state that we're working towards as analogous to the ghost kitchen concept, where you can create basically anything for anybody at any time and then make it with these ghost kitchens, these ghost manufacturers, essentially. But what has happened with ghost kitchens is that all the food starts to taste the same, right? Because they only have so many ingredients. And as a brand guy, I love brands. I came from a great brand. I do see that as potentially an issue that we'd have to overcome. Because if brands all start to become kind of taupe, that's not great. We want to enable brands to push the boundaries even further and in more creative ways, not all blend together.

Closing Thoughts And Next Steps

SPEAKER_01

Right. I'm the chairman of Rodinia Generation. It's one of these microfactory concepts, maybe the most advanced one. And they basically take a DXF file that you can produce in CLO. They could tell me what they are able to produce in terms of garments, fabrics, styles, sizes, et cetera. And with that information, I could feed CLO or another design tool, and then I could design kind of anything, right? Would that already be the first step of integrating design and manufacturing? Or is there still something missing? Because I think that would be the dream case, but a lot of microfactory concepts are not connected to the design tools yet, and the design tools are not yet connected to manufacturing.

SPEAKER_03

Exactly. Yeah, so you're saying that this file format goes into CLO? Or how does that actually work?

SPEAKER_01

The machines in Rodinia's factory can read a DXF file. And when you load it in there, it can basically produce automatically, with almost no human interaction needed. So as a consumer-designer, I could basically create a DXF file with CLO, or, since CLO is for experts, with one of the other tools that now exist through AI, like one in Australia called Fabra, which basically make design a bit more accessible and easier. It will still produce a tech pack, similar to a DXF file, that I could then just send over to a factory that can produce it for me, as long as the designer or the design tool knows what capacities that factory has, for example. Is there still something missing, or would that be the ideal scenario?

SPEAKER_03

That's definitely going in the right direction. I still think there's a lot of overhead in terms of the skill lift, both from the manufacturing perspective and from the designer perspective. At Ralph Lauren, my design friends were not the most tech-savvy, to put it nicely. And so CLO or Browzwear weren't necessarily that easy for them; you kind of needed a technical 3D team for that. But then my 3D or technical friends maybe weren't the most creative, or didn't have the best eye, and so you need a designer for that. And then you start throwing in the manufacturer as well, and you need a whole different skill set and understanding, and that muddies the waters a little bit further. So you have this very, very muddy water, and really, you need all of those kinds of understanding to be able to do it in a much more efficient way. So I think it's definitely a good step in the right direction, but it still requires a whole lot of what I call the physical bulk, just a lot of human iterations. And my question would always be: does that actually eliminate any of the sampling or iterations? I would say yes, definitely, compared to not having that capability. But at the end of the day, it still requires a whole lot of disparate expertise that doesn't really talk well together and doesn't really understand the others that well. So it's better, but I still think we're very, very far away from just being able to go from intent, design intent, to manufacturing, which is what we're after.

SPEAKER_01

So take me to the world where you will be in five or ten years. How do you imagine it when your solution is fully deployed?

SPEAKER_03

I think five years and ten years are a very big difference, but in five years, I think we'll be in a place where you're Ralph Lauren or Nike, or even a manufacturer, and you have some idea or some category of users or consumers that you're trying to target. And you can just bluntly state what functionality you want. Again, it can be looks, like, I want the prettiest jacket for XYZ, or I want the most performant tech outerwear for Telluride, and you give it the temperature and tell it, I want a super slim profile. And essentially 10 or 20 seconds later, it generates both that simulated product or garment and its manufacturing files. Then, if you don't like what it gave you, you can say, okay, I like it, but I want the shoulders to come in a bit, maybe two centimeters. I also want it to be a little bit more futuristic; look at this inspiration from this other person. And instead of mid-winter in Telluride, let's now go to Siberia; it needs to basically hold up in Siberia. So in a couple of seconds, 10 or 20 seconds, it would generate the new updated version, maybe with a stronger GORE-TEX insulated lining and those other updates that you gave it. And again, it would produce both the simulated version and the manufacturing files. With the simulated version, I could essentially say, I want to see myself in there, I want to see myself running with this jacket on, and tell it my sizing. Or you could even start asking questions like, okay, what's the durability of this jacket? How many seasons do you think it will last if I run this many miles every week?
You can start asking it really intelligent questions like that, and it can start spinning out answers based on its understanding of the physical nature of that jacket. So on one side it's about creation, but it's also about intelligence. You can ask it questions like you'd ask ChatGPT, but with ChatGPT right now, it probably wouldn't really know. And if it acted like it knew, it would be a sycophant; it wouldn't be accurate. This model would be accurate. And then in 10 years, we move towards a model that's less about the manual back-and-forth iterations that a brand might have to do, and more about the brand playing creative director, setting the guidelines and parameters, and then utilizing, say, Google Shopping or Google Analytics data from that customer and generating garments or products for that specific user or customer within the guidelines of the brand.

SPEAKER_01

And they will be fully integrated into the manufacturing.

SPEAKER_03

Yeah, and as we see the trend line with robotics in manufacturing plants: there's been a whole lot of hype around robotic and automated factories, but what we really need is automated product creation. That's where we fit in. We're hoping to hit right at the same spot in around ten years, where we can fully automate the product-creation side of things and also fit that right into the physical manufacturing stack. We stop just shy of the actual machining, essentially.

SPEAKER_01

And in order to know whether that jacket you described is too short for me, or whether I'd like to alter it a bit, does that mean I'm uploading a 3D avatar of myself and can virtually try it on? How would that work?

SPEAKER_03

Yeah, you can. Even with SAM 3, Meta's 3D reconstruction model, it does a pretty good job of reconstructing avatars, and that's from just a single image you upload. So imagine how good it'll be five or ten years from now. You can either give it parameters and say, "Hey, I'm this size," or take an image of yourself, or scan yourself.

SPEAKER_01

In terms of the circular economy, this would allow me to create a digital product passport for every single item, keep it in a digital wardrobe, and at any time either resell it or know how to recycle it, right?

SPEAKER_03

100%. When we're embedding these sensors on-prem with these brands to create their digital materials, something we're looking into heavily right now is imprinting a digital ID with each fabric we digitize. Going forward, if you can take your phone and just wave it over the garment, and it already understands the physical nature of that fabric, we think that's a pretty unique and neat way to trace that fabric all the way back to its origin: where it was made, when it was made, what the composition of the material was. The case I like to look at is Lego, which did this a couple of years ago, actually with one of our scientists: you take an image of a big box of Lego bricks and it tells you all the different things you can make out of it. That's how I see it with fabric. If you take a picture of a fabric, LaMode, our physical AI model, could effectively tell you everything you could create out of it.
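The digital product passport idea reduces to a small record attached to each fabric ID. The sketch below is an assumption about what such a record might contain, based only on the fields mentioned in the conversation (origin, dates, composition, measured physical properties); the class name, field names, and sample values are all hypothetical.

```python
# Hypothetical digital product passport record; field names and values
# are illustrative, inferred from the properties discussed in the episode.
from dataclasses import dataclass


@dataclass(frozen=True)
class DigitalProductPassport:
    fabric_id: str          # digital ID imprinted at the mill
    mill: str               # where the fabric was made
    made_on: str            # when it was made (ISO date)
    composition: dict       # fibre -> share, e.g. {"linen": 0.7}
    physical_profile: dict  # measured properties: torsion, tension, ...

    def dominant_fibres(self, min_share=0.5):
        """Fibres above a share threshold, e.g. for recycling decisions."""
        return [f for f, s in self.composition.items() if s >= min_share]


dpp = DigitalProductPassport(
    fabric_id="MM-LINEN-0042",
    mill="Example Mill, Biella",
    made_on="2025-03-01",
    composition={"linen": 0.7, "cotton": 0.3},
    physical_profile={"torsion": 1.2, "tension": 3.4},
)
```

Because the record is immutable (`frozen=True`) and keyed by the embedded fabric ID, resale, repair, and recycling services could all look up the same provenance and physical data without re-measuring the garment.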

SPEAKER_01

Beautiful. Who is your direct customer? Is it the brand or is it at one point the end user as well?

unknown

Right.

SPEAKER_03

It's definitely the brand and the manufacturer. I still think they have to play a large part in the future of production, because people want to buy from brands. Nobody wants to buy from me as much as I want them to. I still think Ralph Lauren will always play a big, big part in this, and they should be able to do that.

SPEAKER_01

When you say brand and manufacturer, is it a platform model where you have to get both sides on board at the same time?

SPEAKER_03

No. When I say brand and manufacturer, I blur the lines, because some people would assume that Ralph Lauren is both brand and manufacturer, or Nike, for instance, even though technically it's third party. But they play such a big role in the machines that get bought and everything else that they're effectively a brand-manufacturer. So I really mean brand.

SPEAKER_01

Brand, okay. So for the whole product creation process.

SPEAKER_03

Yeah, and also textile mills, for instance; I know you're very familiar with those. We place some of these devices on premises with these textile manufacturers and mills, because if they have digital assets and digital twins of their materials, their sell-through rates will increase. If Ralph Lauren, or name any brand, is creating products digitally, and they can take that asset from a mill that's already digitized the material and just drop it onto their 3D models, it goes a long way. We're already seeing that with the textile mills we're working with, so they actually do it for free for their customers.

SPEAKER_01

As an entrepreneur and investor, I'm always interested: where do you stand today with your solution and your technology readiness level, and also in terms of funding and team? We'd love to get a better understanding of how far along you are.

SPEAKER_03

Yeah, so we've commercialized TMAC, the optical scanning device. We're on three continents with those at the moment; we started commercializing around 12 months ago, so we're in some of the biggest factories and manufacturers. We're working on the next version, TMAC V2, a smaller device that also captures the physical properties: torsion, tension, all of those. And we're currently training a model we call LaMode 0.5, a beta of this foundation model, this physical AI model, that will let brands take a video of any product under deformation and simulate that product in a much more physically accurate way, whether that's integrating it into their imagery or draping it onto different fit models. That's still common in fashion: they have fit models coming to the office, draping the garment, checking whether it's sized correctly. We're not too far from that. All the signals we're getting back from our benchmarking look great; at this point it's a just-add-more-compute kind of thing, where more compute makes it generalize a bit better. In terms of fundraising, we just opened a round last week and we're hoping to close in the next two months and build out the team a bit more. Around 80% of our team are PhDs; we have around 13 people at the moment, plus around four part-time academics who are both professors and work with Mode Maison. So we're just looking at rounding out the team a bit more.

SPEAKER_02

Is it a Series A?

SPEAKER_03

No, it's seed. I don't even know what the names mean anymore, honestly; they're kind of all over the map. We've been called anything from pre-seed to seed, so I'm not really sure, but I would probably say it's seed.

SPEAKER_01

You started six years ago already; were you bootstrapped until now?

SPEAKER_03

No, we raised a little bit of money from family and friends, but we've been pretty lean. We've raised around 1.3 million so far to date. This round is going to be a good amount more than that; we're looking at raising 12 million at the moment. But we've been pretty scrappy.

SPEAKER_01

Wow, okay. It's impressive that you guys have been out and about for six years already and, in today's world, have only raised 1.3 million.

SPEAKER_03

We've had clients, so that's always helped. We've had some very, very large brands, retailers, and manufacturers over the past several years that have definitely kept us afloat. That's helpful, and it also gives us a good signal, because they like what we're doing, I guess, and we're providing value.

SPEAKER_01

I think that's amazing. Arcade AI, also an AI-based design tool, I think they've already raised $42 million or something, and they haven't been around for too long. So it's impressive.

SPEAKER_03

I know, World Labs just raised like seven bazillion dollars two days ago, and they've been around for like five minutes. If I ever want to feel bad about myself, I just look at World Labs and say: they just raised literally like a billion dollars and they've been around for six months.

unknown

Yeah.

SPEAKER_01

Well, let's see how long they all last. You already have traction, you're already out there, and you've really built a more sophisticated model with real physical data. That's what I haven't seen from other design tools yet, so it's really exciting. I'm happy to keep monitoring what's happening, and I wish you all the best for your fundraising. Thanks for being here, Steven.

SPEAKER_03

Thank you so much, Carl. It's been fun. Thanks.