Visualizing a New Future for Fashion's Online Experience


PhD student Kedan Li introduced a new app, Style Space, which features cutting-edge AI visualization technology designed to increase consumer engagement and adoption.

Written by Aaron Seidlitz, Illinois CS

Now in his third year of the PhD program at Illinois CS, Kedan Li is beginning to make a long-standing goal come true – to use his interest and growing knowledge in computer vision to create something new. Li believes that his research has the potential to help transform the e-commerce fashion domain.

Kedan Li

While AI has led to many impressive applications in fashion retail – shop-the-look tools that let people retrieve a fashion product from casual photos, or complete-the-look features that offer style and compatibility recommendations – shoppers have not yet been able to visualize the look.

Comparatively, other domains have produced exciting new offerings. Grammarly, for example, brought AI and natural language processing to everyday consumer writing, proving that digital writing assistance can succeed at scale. Another resource, Otter, takes notes from meetings while also providing insightful analysis.

So Li’s challenge was to develop something with no track record of success yet in computer vision or fashion.

But that was also part of the appeal of recently launching a startup company and app called Style Space, which showcases those visualization enhancements through technology Li calls Outfit Visualization Net (OVNet). Available on the App Store, or on the website for non-iOS devices, Style Space renders realistic and accurate displays of garments as worn by models – an especially tedious and difficult task.

Thanks to these technology enhancements, the app lets users mix and match garments of interest, producing a more interactive experience designed to increase consumer adoption of the featured products.

Taking the tech a step further, Li saw it as natural to offer the virtual try-on experience to online fashion retailers as a service.

Style Space allows users to visualize any garment combination on a fashion model.

“What I realized at the beginning of this project is that people want to see what they’re buying,” Li said. “No matter what kind of recommendation or description of clothing you can provide, it’s kind of ambiguous. Also, past visualization attempts couldn’t capture the look of the clothing.

“Style Space offers technology that fashion consumers actually want to interact with.”

Li approached this first by rendering quality images that generate a high level of consumer engagement and adoption. This work dovetailed with the interests of his adviser, David A. Forsyth, Fulton Watson Copp Chair in Computer Science.

The two began working together when Li entered the PhD program. Forsyth described this as an “organic” process that grew out of their mutual interest in the impact this technology could have on the fashion industry.

Forsyth’s previous interest in the topic led to a collaboration with faculty colleague Ranjitha Kumar on an outfit-compatibility prediction system, though their attempts to secure research funding came up empty after they were told that fashion was too frivolous.

Beyond the potential technological impact, Forsyth and Li saw an opportunity for something more – an opportunity to create an entirely different digital experience.

David Forsyth
David A. Forsyth

“A significant part of the online shopping experience comes down to entertainment and engagement,” Forsyth said. “People who spend a day going to the mall to buy clothes aren’t going simply to take something off the shelf and pay for it. People spend those eight hours trying the clothes on, thinking about how they go together, how it makes them feel.

“With Style Space, you can look at clothing, pore over other options, see what feels right and what doesn’t. That builds engagement.”

To get to this point, Li and his collaborators had to get several important and difficult aspects of the technology right.

Their work on OVNet produced several improvements that capture important details, yielding high-quality virtual try-on images for multiple garments. That meant rendering the difficult aspects of clothing – e.g., buttons, shading, textures, realistic hemlines, and interactions between garments.

Their method features three important steps to generate an image of a model wearing a set of garments. First, a neural network determines where the different garments will sit in the new picture. Second, a warping procedure adjusts the garments to lie in the right places in the image. Finally, the adjusted garments are passed through an image generation procedure that creates shading detail, garment folds, and so on.
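For readers who want a concrete picture of how such a pipeline fits together, here is a minimal PyTorch sketch. It is illustrative only: the module names, architectures, and tensor shapes are assumptions made for the example, not OVNet’s actual design. What it shows is how the three stages the article describes – layout prediction, warping, and image generation – compose into one try-on function.

# Hypothetical three-stage try-on pipeline, loosely following the steps
# described above. Module names and shapes are assumptions; this is NOT
# the OVNet implementation, just a minimal sketch of the idea.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayoutNet(nn.Module):
    """Stage 1: predict a soft placement mask for each garment."""
    def __init__(self, num_garments: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 * (1 + num_garments), 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, num_garments, 1),
        )

    def forward(self, person, garments):
        x = torch.cat([person, *garments], dim=1)
        return torch.sigmoid(self.net(x))  # (B, num_garments, H, W)

class Warper(nn.Module):
    """Stage 2: warp each garment toward its predicted placement.

    A tiny regressor predicts a 2x3 affine per garment; real systems
    typically use richer warps (e.g., thin-plate splines).
    """
    def __init__(self):
        super().__init__()
        self.regressor = nn.Sequential(
            nn.Conv2d(4, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 6),
        )

    def forward(self, garment, mask):
        theta = self.regressor(torch.cat([garment, mask], dim=1))
        grid = F.affine_grid(theta.view(-1, 2, 3), garment.shape,
                             align_corners=False)
        return F.grid_sample(garment, grid, align_corners=False)

class Generator(nn.Module):
    """Stage 3: fuse the person and warped garments into a final image,
    adding shading, folds, and garment interactions."""
    def __init__(self, num_garments: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 * (1 + num_garments), 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, person, warped):
        return self.net(torch.cat([person, *warped], dim=1))

def try_on(person, garments, layout, warper, generator):
    masks = layout(person, garments)                  # step 1: placement
    warped = [warper(g, masks[:, i:i + 1])            # step 2: warping
              for i, g in enumerate(garments)]
    return generator(person, warped)                  # step 3: rendering

if __name__ == "__main__":
    B, H, W = 1, 256, 192
    person = torch.rand(B, 3, H, W)
    garments = [torch.rand(B, 3, H, W) for _ in range(2)]  # e.g., top + skirt
    out = try_on(person, garments, LayoutNet(2), Warper(), Generator(2))
    print(out.shape)  # torch.Size([1, 3, 256, 192])

Untrained, the script just demonstrates the data flow and prints the output shape; a production system would train each stage on paired imagery and use far richer warping and generation networks to capture the details described above.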

With these developments featured in Style Space, the technology behind the app quickly gained attention from established fashion brands.

“What’s become clear to me, from this kind of reaction, is that we are solving the right problem,” Li said. “It’s also clear that there is a need in this specific marketplace for this kind of technology. What lies ahead is that we need to continue to decide on how best to deliver on this need by turning it into the right product.”

According to Forsyth, this need has only grown in recent years. Brick-and-mortar retail continues to shift to online offerings, boosting efficiency and reducing costs. On top of that trend, COVID-19 mitigation practices pushed an even larger audience online.

“The vast majority of money in the fashion market is not fantastically expensive objects; it’s mildly expensive objects. So, the margins on those objects are tiny,” Forsyth said. “It’s not efficient to place three of those items in the brick-and-mortar stores across the country in hopes that somebody will buy it. You want to ship those objects from central locations because it’s cheaper.”

This makes Li’s efforts to help people find and interact with these items online especially relevant right now.



This story was published December 16, 2020.