
Research Introduction

Dr. Jung aims to integrate his expertise in virtual avatar and AI technologies into the field of Clothing & Textiles. His research explores methodologies such as virtual reality, 3D scanning, motion capture, deep learning, and robotics, focusing on individuals' psychophysiological responses to virtual experiences. His future work aims to develop interactive virtual fitting applications based on personalized virtual bodies and to create style-recommendation AI systems tailored to both physical and psychological traits, driving innovation in the fashion industry by enhancing the fitting experience and promoting sustainability.

01
Research on Interactive Virtual Fitting
We aim to develop an immersive, interactive virtual fitting system that lets users try on clothes at home before purchasing an actual garment. The system incorporates realistic garment simulation, 3D scanning, motion capture, and mixed-reality technologies.
02
Research on Style Recommendation AI
We aim to propose an AI that can understand and evaluate human 'style', so that users can easily select which clothes to wear in their everyday lives. To understand this contextual human information (fashion-item matching, body shape, identity, weather, or daily schedule), a large language model (LLM) is a suitable candidate, along with image-processing models such as CNNs, GANs, or diffusion models for extracting features and visualizing results.
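One way to picture how such a recommender could combine context and garment features is sketched below. This is a minimal illustration, not the proposed system: it assumes the contextual information (e.g. from an LLM encoder) and each garment's image features (e.g. from a CNN backbone) have already been embedded as vectors, and simply ranks wardrobe items by cosine similarity to the context.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(context_vec, item_vecs, top_k=1):
    """Rank wardrobe items by similarity to the user's context embedding.

    context_vec: embedding of contextual information (body shape, weather,
                 daily schedule); assumed to come from an LLM-based encoder.
    item_vecs:   dict of item name -> image feature vector; assumed to come
                 from a CNN feature extractor.
    """
    scored = sorted(item_vecs.items(),
                    key=lambda kv: cosine(context_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_k]]

# Toy usage with hand-made 2-D "embeddings"
context = np.array([1.0, 0.0])                       # e.g. "cold, formal day"
wardrobe = {"coat": np.array([1.0, 0.1]),
            "shorts": np.array([0.0, 1.0])}
best = recommend(context, wardrobe, top_k=1)         # the closest-matching item
```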

Previous Works

01
Research on Emotion Recognition AI using Body Motion Data
This study proposed a method to visualize emotional body motions as a Bodily Motion Map (BMM), and showed that emotional states can be distinguished from body motion data using a deep learning model (CNN).
For more details, see "Bodily Sensation Map vs. Bodily Motion Map: Visualizing and Analyzing Emotional Body Motions (IEEE Transactions on Affective Computing, 2024)."
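The core idea of a motion map can be illustrated with a small sketch. Assuming motion-capture frames give 3D joint positions, per-joint displacement magnitudes are accumulated onto a 2D body-shaped grid, producing an image a CNN classifier could consume (the joint-to-pixel layout below is hypothetical, and the classifier itself is omitted):

```python
import numpy as np

# Hypothetical (row, col) grid positions for a few joints on an 8x4 body map.
JOINT_PIXELS = {"head": (0, 1), "l_hand": (3, 0), "r_hand": (3, 3), "torso": (2, 1)}

def bodily_motion_map(frames, shape=(8, 4)):
    """Accumulate per-joint displacement magnitudes into a 2D motion map.

    frames: list of dicts mapping joint name -> (x, y, z) position,
            one dict per motion-capture frame.
    Returns a 2D array normalized to [0, 1]; brighter pixels indicate
    joints that moved more across the sequence.
    """
    bmm = np.zeros(shape)
    for prev, cur in zip(frames, frames[1:]):
        for joint, (r, c) in JOINT_PIXELS.items():
            disp = np.linalg.norm(np.subtract(cur[joint], prev[joint]))
            bmm[r, c] += disp
    return bmm / bmm.max() if bmm.max() > 0 else bmm
```

Feeding maps like this (one per recorded clip, labeled by emotion) to an image classifier is the general pattern the study's CNN approach follows.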
02
Research on Individualized Rendering Method for Virtual Reality
This study proposed a VR rendering method (foveated rendering) that uses an individual's central and peripheral vision boundaries, which effectively reduces computing load while preserving the quality of the VR experience.
For more details, see “Individualized foveated rendering with eye-tracking head-mounted display (Virtual Reality, 2024).”
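The principle behind foveated rendering can be sketched in a few lines: render at full detail only near the gaze point and coarsen toward the periphery. In this toy version the two radius thresholds stand in for the individually calibrated central/peripheral vision boundaries (the actual calibration via eye tracking is not shown):

```python
import numpy as np

def shading_level(width, height, gaze, fovea_px, periphery_px):
    """Assign each pixel a detail level based on distance from the gaze point.

    Level 0 = full detail (foveal region), level 1 = intermediate band,
    level 2 = coarsest detail (periphery). The radii fovea_px and
    periphery_px would be calibrated per individual with an eye-tracking
    headset; the values used here are illustrative.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.hypot(xs - gaze[0], ys - gaze[1])          # pixel distance from gaze
    level = np.full((height, width), 2, dtype=int)       # default: coarse periphery
    level[dist <= periphery_px] = 1                      # intermediate band
    level[dist <= fovea_px] = 0                          # full-detail fovea
    return level
```

A renderer would then shade level-2 regions at reduced resolution, which is where the computational savings come from.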


03
Research on Personalized Avatar
This study proposed a method to create a personalized avatar that resembles the user's facial appearance and body dimensions using a single smartphone, which can provide high-quality virtual experiences.
For more details, see “Impact of Personalized Avatars and Motion Synchrony on Embodiment and Users’ Subjective Experience: Empirical Study (JMIR Serious Games, 2022).”
04
Research on Measuring Body Size Change Perception
This study proposed a VR method to measure when an individual recognizes a change in the size of their body, using a size-matched virtual avatar. We speculate that the proposed method may relate to one's body image flexibility.
For more details, see “Measuring recognition of body changes over time: A human-computer interaction tool using dynamic morphing and body ownership illusion (PLoS One, 2020).”
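The dynamic-morphing idea reduces to gradually interpolating a body measurement and noting the step at which the participant first reports a change. A minimal sketch, assuming a single scalar measurement and illustrative values (response handling and the body-ownership-illusion setup are omitted):

```python
def morph_widths(original, target, steps):
    """Linearly morph a body measurement from original toward target.

    Returns the sequence of intermediate values shown to the participant;
    the step index at which they first report noticing a change gives
    their detection threshold for that body dimension.
    """
    return [original + (target - original) * i / steps for i in range(steps + 1)]

# e.g. widen a 90 cm hip measurement to 99 cm over 3 morph steps
sequence = morph_widths(90.0, 99.0, 3)
```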

05
Research on Interaction between Embodiment Subcomponents
This study investigated the relationship between the embodiment subcomponents of body ownership, agency, and self-location using a point-light avatar. We found that these components can be leveraged stepwise to enhance the user's subjective virtual experience.
For more details, see “Controlling the sense of embodiment for virtual avatar applications: methods and empirical study (JMIR Serious Games, 2020).”
06
Research on Full-Body Ownership Illusion
Recent advances in technology have allowed users to experience an illusory feeling of full body ownership of a virtual avatar. Such virtual embodiment has the power to elicit perceptual, behavioral, cognitive, and emotional changes related to oneself.
For more details, see “Full-body ownership illusion can change our emotion (ACM CHI, 2018).”