Manipulation of Virtual Self-Perception through Visuo-Haptic Avatar Parameters
User representations within interactive systems are essential for creating effective, efficient, and satisfying tools. In mixed reality, this representation is called an avatar: a virtual character through which users interact in virtual spaces. Combining avatars with visual-motor synchrony creates such strong cues that users feel they embody these avatars and accept them as their virtual alter ego. This makes avatars a promising tool for many applications, such as fear therapy, the treatment of body image disturbances, and firefighter training, as people can be confronted with mental and physical challenges in a simulation without putting themselves or others in danger. Previous work shows that an avatar's characteristics can change users' behavior, attitude, and perception. Users, for example, display higher cognitive performance when embodying an avatar that looks like Albert Einstein and perceive the world through the eyes of a child when seeing it from a lower perspective. While a large body of research has focused on the effects of avatars' audio-visual representation, haptics has only recently gained more interest in the investigation of avatar effects. It has, for example, been shown that lifting weights is perceived as easier when embodying a muscular avatar. While a better understanding of haptics would increase immersion and benefit psychotherapy as well as manual and motor skill learning, we still lack fundamental knowledge of how visuo-haptic avatar design changes the perception of our virtual alter ego (or self-perception).
This project will systematically investigate the combined effect of avatars' visual presentation and haptic stimuli on users' self-perception. We thereby aim to enable users to "slip into others' skin" and not only see the world from a different perspective but also feel it accordingly. This increases the immersion of virtual environments, helps users empathize with avatars, enables further virtual reality applications, and ultimately deepens our understanding of the underlying phenomena.
In empirical studies, we systematically investigate the effects of avatars' visuo-haptic design on users' self-perception. We will determine how an avatar's visual appearance affects the user's haptic body perception, such as perceived strength, endurance, and body shape. A muscular avatar, for example, might make lifted weights feel lighter. In addition, we will integrate haptic feedback devices (such as exoskeletons and weight-changing devices) that dynamically manipulate actual physical performance, e.g., by altering the physical effort required to lift objects. Afterward, we will combine both research directions and investigate the interaction of avatar appearance and haptic devices. The developed avatars and haptic stimuli will be combined to reveal systematic effects, resulting in a model that describes how visual and haptic rendering interact in shaping virtual self-perception.
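To illustrate the kind of model the studies aim for, the following minimal sketch (in Python) expresses perceived weight as a function of the physically rendered load and the avatar's visual muscularity, including an interaction term. All parameter names and coefficient values are purely hypothetical placeholders for illustration; the actual model will be derived from the empirical studies.

```python
# Illustrative sketch only: a simple linear model with an interaction term,
# describing how perceived weight might depend on the physically rendered load
# and the avatar's visual muscularity. Coefficients are hypothetical placeholders,
# not empirical results of this project.

from dataclasses import dataclass


@dataclass
class VisuoHapticModel:
    bias: float = 0.0             # baseline offset (kg)
    w_load: float = 1.0           # main effect of the physically rendered load
    w_muscle: float = -0.8        # main effect of avatar muscularity (0..1)
    w_interaction: float = -0.3   # interaction: muscular avatars attenuate heavy loads more

    def perceived_weight(self, physical_load_kg: float, muscularity: float) -> float:
        """Predict the subjectively perceived weight in kg."""
        return (
            self.bias
            + self.w_load * physical_load_kg
            + self.w_muscle * muscularity
            + self.w_interaction * physical_load_kg * muscularity
        )


if __name__ == "__main__":
    model = VisuoHapticModel()
    # Same 5 kg load, rendered once with a slender and once with a muscular avatar.
    print(model.perceived_weight(5.0, muscularity=0.1))  # close to the physical 5 kg load
    print(model.perceived_weight(5.0, muscularity=0.9))  # noticeably lighter
```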
Partner
Universität Regensburg, Institut für Information und Medien, Sprache und Kultur, Lehrstuhl für Medieninformatik, Prof. Dr. Niels Henze
Funding
DFG Priority Programme, subproject within SPP 2199: Scalable Interaction Paradigms for Pervasive Computing Environments
Duration
since 2024
Contact
Prof. Dr.-Ing. Katrin Wolf