Real-time Intersensory Discomfort Compensation (DFG)

Sensory illusions are key to creating believable mixed-reality experiences. Visual illusions and the creation of virtual worlds through animated images are well researched. To advance towards a more realistic blending of the physical and the virtual, multimodal illusions are a powerful tool. For many people, however, this does not work well: they experience discomfort in such settings. When two or more sensory modalities provide conflicting information, one modality can override what is sensed through another. So far, there is no systematic understanding of how multimodal illusions can create a convincing mixed-reality environment while avoiding discomfort.

Several research projects, including our own previous work, have demonstrated the feasibility of technically implementing such illusions. In this project, we aim to systematically investigate the foundations of multimodal sensory illusions in mixed reality. A key aspect of our experimental research is the phenomenon of discomfort in mixed reality: when multisensory information is incoherent, humans react to it. If cognition cannot integrate the information of multiple senses, we feel discomfort. Motion sickness and cybersickness are examples in which the mismatch between felt and seen motion is not integrated into a coherent percept; instead, people feel sick as a result of the sensory mismatch.

Our idea is to use physiological sensing to detect the onset of such a mismatch before people feel discomfort. If this is possible, we can correct the mismatch and create a lasting, comprehensive illusion in mixed reality. This builds on research that has investigated under which conditions intersensory integration can be achieved with technology, which helps to systematically build conceptual models of which combinations of sensory information create which sensory illusions. We extend such static models with physiological sensing to account for intra- and interpersonal differences that cognitive models alone cannot capture.

The vision is to establish the scientific foundation for a new generation of mixed-reality devices and applications that can sense when the illusion is about to break and counteract this. If we can measure in real time at what moment an individual user of an interactive system can no longer integrate multisensory information, the system can adapt its multisensory output to avoid discomfort. An example is adapting the visual scene once the onset of the breaking illusion is detected. If successful, this could make MR technologies applicable to a much wider range of people and applications and thus create a novel, efficient, and meaningful interaction paradigm through Real-time Intersensory Discomfort Compensation.
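As an illustration only, the sketch below shows how such a closed loop could be structured in software: a stream of physiological samples is monitored for a rise relative to a recent baseline, and once an onset is suspected, the rendering is adapted (here by reducing the gain of visually induced motion). The signal source, threshold, and adaptation strategy are hypothetical placeholders, not methods or results from the project.

```python
# Hypothetical closed-loop sketch of Real-time Intersensory Discomfort Compensation.
# Signal source, threshold, and adaptation strategy are illustrative assumptions.
from collections import deque
from statistics import mean


class DiscomfortCompensator:
    def __init__(self, window: int = 30, onset_threshold: float = 0.15):
        self.baseline = deque(maxlen=window)     # recent physiological samples
        self.onset_threshold = onset_threshold   # relative rise treated as onset
        self.visual_motion_gain = 1.0            # 1.0 = unmodified visual scene

    def update(self, sample: float) -> float:
        """Feed one physiological sample (e.g., normalized electrodermal
        activity) and return the visual motion gain to render with."""
        if len(self.baseline) == self.baseline.maxlen:
            rise = sample - mean(self.baseline)
            if rise > self.onset_threshold:
                # Mismatch onset suspected: damp visually induced motion.
                self.visual_motion_gain = max(0.5, self.visual_motion_gain - 0.05)
            else:
                # No onset: recover slowly towards the unmodified scene.
                self.visual_motion_gain = min(1.0, self.visual_motion_gain + 0.01)
        self.baseline.append(sample)
        return self.visual_motion_gain


# Example usage with synthetic data standing in for a live sensor stream.
if __name__ == "__main__":
    comp = DiscomfortCompensator()
    stream = [0.2] * 40 + [0.45] * 10   # a sudden rise mimics an onset
    for s in stream:
        gain = comp.update(s)
    print(f"final visual motion gain: {gain:.2f}")
```

In a real system, the per-sample update would run inside the render loop and the adaptation would target whichever modality drives the mismatch; the sketch only illustrates the sense-detect-adapt structure.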

(https://gepris.dfg.de/gepris/projekt/521602817)

Partner

LMU München, Prof. Dr. Albrecht Schmidt, Institut für Informatik, LFE Medieninformatik

Funding body

DFG Priority Programme, subproject within SPP 2199: Scalable Interaction Paradigms for Pervasive Computing Environments

Duration

01.01.2024 - 31.12.2026

Contact

Prof. Dr.-Ing. Katrin Wolf