DeepDraper: A technique that predicts what clothes would look like on different people

RGB image of a customer. Credit: Tiwari & Bhowmick.

In recent years, some computer scientists have explored the potential of deep learning techniques to virtually dress digital 3D versions of humans. Such techniques could have many interesting applications, especially for online shopping, gaming, and 3D content generation.

Two researchers from TCS Research in India recently created a deep learning technique that can predict how clothes will adapt to a given body shape and, therefore, what they will look like on specific people. This technique, presented at an ICCV workshop, proved to be more efficient than existing virtual garment draping methods.

“Online clothing shopping allows consumers to access and purchase a wide range of products from the comfort of their homes, without going to physical stores,” Brojeshwar Bhowmick, one of the researchers who conducted the study, told TechXplore. “However, it has a major limitation: it does not allow buyers to physically try on the clothes, resulting in a high return/exchange rate due to clothing fit issues. The concept of virtual try-on helps resolve this limitation.”

Virtual fitting tools allow people who buy clothes online to get an idea of the size and appearance of a garment by visualizing it on a 3D avatar (i.e., a digital version of themselves). The potential buyer can infer how the item they are planning to buy fits by looking at its folds and wrinkles in different poses or from different angles, as well as the gap between the avatar's body and the garment in the rendered image or video.






RGB image and draped result with a T-shirt and pants. Credit: Tiwari & Bhowmick.

Virtual try-on allows buyers to view any garment on a 3D avatar, as if they were wearing it. Two important factors that a buyer takes into account when deciding whether to buy a particular garment are fit and appearance, both of which can be judged from the folds, wrinkles, and body-garment gap visible in the rendered image or video.

“Previous work in this area, such as the development of the TailorNet technique, does not take into account the underlying measurements of the human body; thus, its visual predictions are not very accurate in terms of fit,” Bhowmick said. “On top of that, due to its design, TailorNet’s memory footprint is huge, which restricts its use in real-time applications with less computing power.”

The main goal of the recent study by Bhowmick and colleagues was to create a lightweight system that takes a human’s measurements into account and drapes 3D clothing over an avatar that matches those measurements. Ideally, they wanted this system to require low memory and low computing power, so that it could be run in real time, for example on online clothing websites.

The estimated 3D body of the same customer shown in the image above, derived from the RGB image. Credit: Tiwari & Bhowmick.

“DeepDraper is a deep learning-based garment draping system that allows customers to virtually try on clothes from a digital wardrobe on their own body in 3D,” Bhowmick explained. “Essentially, it takes an image or a short video clip of the customer, and an item of clothing from a digital wardrobe provided by the vendor as inputs.”

Initially, DeepDraper analyzes a user’s images or videos to estimate their 3D body shape, pose, and measurements. It then passes these estimates to a garment draping neural network that predicts how the chosen garment would look on the user’s body and drapes it over the corresponding virtual avatar.
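To make that pipeline concrete, below is a minimal, hypothetical sketch (not the authors’ released code) of the kind of draping network the article describes: body shape, pose, and measurement features go in, and per-vertex garment geometry comes out. The class name DrapingNet, the feature dimensions, and the measurement vector are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch, assuming a PyTorch-style setup, of a garment draping
# network of the kind described in the article. All names and dimensions
# here are hypothetical, not the authors' implementation.

import torch
import torch.nn as nn


class DrapingNet(nn.Module):
    """Predicts per-vertex 3D positions/offsets for a garment template mesh."""

    def __init__(self, n_garment_verts, shape_dim=10, pose_dim=72, meas_dim=8):
        super().__init__()
        in_dim = shape_dim + pose_dim + meas_dim
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 512), nn.ReLU(),
            nn.Linear(512, n_garment_verts * 3),  # x, y, z per garment vertex
        )

    def forward(self, shape, pose, measurements):
        # Concatenate body shape, pose, and measurement features
        feats = torch.cat([shape, pose, measurements], dim=-1)
        offsets = self.mlp(feats)
        return offsets.view(-1, offsets.shape[-1] // 3, 3)


# Hypothetical usage: shape and pose would come from an off-the-shelf
# image-based body estimator (e.g., an SMPL-style regressor), and the
# measurement vector (chest, waist, hip, ...) from the estimated 3D body.
net = DrapingNet(n_garment_verts=4000)
shape = torch.zeros(1, 10)        # body shape coefficients (assumed SMPL-like)
pose = torch.zeros(1, 72)         # body pose parameters
measurements = torch.zeros(1, 8)  # assumed body measurements
draped_vertices = net(shape, pose, measurements)
print(draped_vertices.shape)      # (1, 4000, 3): predicted garment geometry
```

The sketch only illustrates the overall data flow (body parameters in, garment geometry out); the actual DeepDraper architecture, loss functions, and feature encodings are described in the paper cited below.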

The researchers evaluated their technique in a series of tests and found that it outperformed other state-of-the-art approaches, predicting garment fit more accurately and realistically. In addition, their system was able to drape clothing of all sizes over human bodies of varying shapes and characteristics.

  • DeepDraper result, where the team draped the estimated 3D human body with a white T-shirt and pink pants. Credit: Tiwari & Bhowmick.

  • Result of draping a fixed-size T-shirt on two people with varying overall body fat. This image shows the person with higher body fat; see the following image to observe the differences in wrinkles and creases. Credit: Tiwari & Bhowmick.

  • Result of draping a fixed-size T-shirt on two people with varying overall body fat. This image shows the person with lower body fat; see the previous image to observe the differences in wrinkles and creases. Credit: Tiwari & Bhowmick.

“Another important feature of DeepDraper is that it is very fast and can be supported by low-end devices such as cell phones or tablets,” Bhowmick said. “More specifically, DeepDraper is nearly 23 times faster and nearly 10 times smaller in memory footprint than its closest competitor, TailorNet.”

In the future, the virtual clothing draping technique created by this team of researchers could enable clothing and fashion companies to improve their users’ online shopping experience. By allowing potential buyers to get a better idea of how clothes will look before purchasing them, it could also reduce requests for refunds or product exchanges. Additionally, DeepDraper could be used by game developers or creators of 3D media content to dress characters more effectively and realistically.

“In our next studies, we plan to expand DeepDraper to virtually try on other challenging, loose, and multi-layered garments, such as dresses, gowns, and T-shirts with jackets,” Bhowmick said. “Currently, DeepDraper drapes the garment over a static human body, but eventually we plan to drape and animate the garment consistently as the person moves.”


More information:
DeepDraper: Fast and accurate 3D garment draping over a 3D human body. The Computer Vision Foundation (2021).

© 2021 Science X Network

Citation: DeepDraper: A technique that predicts what clothes would look like on different people (2021, October 26) retrieved October 26, 2021 from https://techxplore.com/news/2021-10-deepdraper-technique-people.html

This document is subject to copyright. Other than fair use for private study or research purposes, no part may be reproduced without written permission. The content is provided for information only.

