8+ years programming for fashion
Mon, 01 Jan 2024 12:37:55 +0000
A visual summary of some of the stuff I've worked on at Metail

A retrospective

My time at Metail has come to an end, and it felt like a good moment to write a longer retrospective than usual. It’s been an interesting ride, so I think it’s worth looking back and reviewing the things I learned.

Eight and a half years is the longest I’ve ever stayed at a company. The main reason for staying this long has been the incredibly welcoming atmosphere and the friendly people who worked there, but there were also very interesting technical challenges that aligned well with my technical background. Let me give you a bit more detail.

How I ended up here

I’m a graphics programmer, and before Metail I had always worked in the games industry. The games industry is full of amazing, passionate people. I suppose passion can sometimes turn into angry faces and needless shouting. The year I left the games industry I wasn’t going through the best of times. My mum was fighting cancer, and then she suffered a stroke, so coping with the stress of work was hard. My counsellor suggested that a change might do me good.

I got a call out of the blue from a recruiter who told me about Metail. I wasn’t very interested in fashion at the time, but the technology they described sounded quite interesting. They needed someone with knowledge of Computer Graphics, but also with a good understanding of Computer Vision and Image Processing. Although professionally I had mostly worked on graphics and optimization, my PhD was on Computer Vision and Image Processing, so this sounded like a nice combo! I passed the interviews and got in.

The MeModel era

The main product when I joined Metail was called the MeModel (see figure on top). It was a virtual try-on system where users would enter their measurements to generate a virtual avatar, and then try clothes on it. It was a web application that retailers could embed in their websites. The technology was a mixture of 2D (photographs of clothes and faces) and 3D (the body shapes). The garment physics were done in 2D.

The technology I was maintaining was a server-side renderer built with DirectX 10, C#, and C++. After getting familiar with the pipeline and the asset publishing, I started optimizing performance by removing redundant textures and unnecessary processing. Sometimes graphics code becomes spaghetti 🍝, but a simple PIX GPU frame capture can reveal very interesting things quite easily.

I also worked on improving the visuals. I introduced a new avatar with more joints and I contacted an ex-colleague to help us author more poses. I changed the skin shaders, and I wrote a WebGL tool to help us tweak the skin to match the photographic heads (see Skin colour authoring using WebGL).

I also did some server-side work. Because I had some previous experience with NodeJS, I suggested building a small server in NodeJS for scaling and monitoring. This sat on top of AWS services, but it let us implement more complex logic suited to our renderer. The new bottleneck was the startup time of the renderer service, which took several minutes to boot. I distilled some old spaghetti code into its underlying maths, and then rewrote the whole thing as a few simple matrix multiplications. I also turned most of the asset loading into lazy initializations, bringing the final startup time under 2 seconds.
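The renderer itself was C# and C++, but the lazy-initialization idea is easy to illustrate. Here is a minimal Python sketch (the class, asset names, and timings are made up, not Metail's actual code) of deferring expensive loading until first use:

```python
import time
from functools import cached_property

class AvatarAssets:
    """Illustrative sketch only: the real renderer was C#/C++."""

    def __init__(self, asset_dir: str):
        # Construction stays cheap: we only record where assets live.
        self.asset_dir = asset_dir

    @cached_property
    def poses(self):
        # Runs once, on first access; the result is cached afterwards.
        time.sleep(2)  # stand-in for parsing pose files from disk
        return {"T-pose": [], "A-pose": []}

assets = AvatarAssets("/path/to/assets")  # fast: nothing is loaded yet
print(assets.poses)  # first access pays the loading cost
print(assets.poses)  # second access is instantaneous (cached)
```

The trick is that constructing the service stays cheap; each asset only pays its loading cost if and when a render actually needs it, which is what takes a multi-minute boot down to seconds.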

I built several internal visualization tools, as well as tools for the outsourcing teams to help them see what they were creating, cutting iteration times from days to hours. I also became Engineering Manager of a team of 7 and mentored other developers. I did lots of interesting things.

Unfortunately, the MeModel didn’t quite take off and the company struggled financially until we were acquired by one of our investors.

The EcoShot era

From left to right: a scan of myself, my scanatar superimposed on a photograph, and a couple of EcoShot renders

When we shut down the MeModel service, I was working on an idea from our CTO. He thought that in order to strive for realism, we needed to do the garment simulation in 3D. We were experimenting with some 3D CAD cloth-authoring software at the time, and I thought it would be relatively simple to reuse the technology we already had to create something for that software.

Unfortunately, all the client-side developers had to go, so I had to build everything on my own. But the CAD software let you write plugins in Python, so it was quick to get started. I like C++, but Python let us build things faster in this scenario.

I started by getting a body scan of myself and using our software to automatically rig it, add some poses, and import it into the CAD software. That's what we call a “scanatar”, i.e. an avatar originating from a scan. When I saw the draping of a single garment in different sizes on an accurate model of my body, I thought this would be a game changer.

I built a beta of the software in a couple of months, all self-contained, since there was no service at the time. After the beta, I worked with the network architect to build a service. I built a renderer that used V-Ray to raytrace the garments. For the 2D composition I mainly used ImageMagick, plus some OpenCV scripts written by our R&D team.
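I can't reproduce the exact pipeline here, but the heart of that 2D composition step is alpha-blending the raytraced garment render over a background plate. A rough Python sketch shelling out to the ImageMagick CLI (the file names and offsets are hypothetical):

```python
import subprocess

def composite_over(base: str, overlay: str, out: str, x: int = 0, y: int = 0) -> None:
    """Alpha-composite `overlay` onto `base` at offset (x, y) via ImageMagick."""
    subprocess.run(
        ["magick", base, overlay,
         "-geometry", f"+{x}+{y}",  # where to place the overlay on the base
         "-compose", "over",        # the standard alpha 'over' operator
         "-composite", out],
        check=True,  # raise if ImageMagick reports an error
    )

# Hypothetical file names; the real pipeline and assets were Metail's own.
composite_over("model_photo.png", "garment_render.png", "ecoshot_out.png", x=120, y=40)
```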

Apart from EcoShot, we worked on other projects, such as the European eTryOn project (see my XR4Fashion talk from 2:00:00, From 3D designs to Lens Studio: Challenges in faithful garment representation), and some other AR collaborations with Snap (see me wearing a virtual Puma tracksuit in the figure on top, using one of my Snapchat filters). So I got to touch some game engines as well, such as Unity and Lumberyard, as well as Lens Studio (see some mentions in Reasons for a solo dev to love Godot Engine).

2023 has been an interesting year as well with the boom of Generative AI (GenAI for short). I worked on releasing new features and new GenAI avatars for the EcoShot plugin at a very fast pace. Many customers were impressed by the results, and we've been getting requests for new imagery.

The End & The Future

Unfortunately, we ran out of time. EcoShot will continue to exist in the hands of Tronog (see the announcement: Metail and Tronog enter into a strategic partnership to transfer EcoShot and make AI-generated fashion accessible to all). By the way, can you tell which models are real and which are GenAI?

Image from Metail website showing some EcoShot & GenAI models

There are still many exciting things to come for EcoShot in 2024, but I will be moving on. At the time of writing, I don’t yet know where to, though. It seems it’s still early days for many apparel companies to adopt 3D, so I may not be working in fashion again. Who knows.

I was attracted to the idea of doing something good for the planet. The fashion industry, especially fast fashion, is a machine for creating waste. Creating virtual samples before garments are manufactured should help reduce some of that waste, and showing customers how a garment fits different body shapes should help reduce returns. But adoption of these technologies is still slow. I hope that GenAI will revolutionize that.

While I look for my next adventure, I will be working on some side projects. I recently released an image-diff app called Mantis Shrimp 🦐. I borrowed the name from a JavaScript web tool made by my team lead when I joined Metail. He loves mantis shrimps because of the 16 or so types of photoreceptors in their eyes. I thought it was a nice way of coming full circle.

So long and thanks for all the fish 🐋🌈

Happy New Year 2024 🐲

