MVAR 2016 - Multimodal Virtual and Augmented Reality
Topics/Call for Papers
Virtual reality (VR) and augmented reality (AR) are currently two of the "hottest" topics in the IT industry. Many consider them to be the next wave in computing, with an impact similar to the shift from desktop systems to mobile and wearable devices. Yet we are still far from the ultimate goal of creating new virtual environments, or augmentations of existing ones, that feel and react like their real counterparts. Many challenges and open research questions remain, especially in the areas of multimodality and interaction.
The aim of this workshop is therefore to investigate all aspects of multimodality and multimodal interaction in relation to VR and AR. What are the most pressing research questions? What are the most difficult challenges? What opportunities do modalities other than vision offer for VR and AR? What are new and better ways to interact with virtual objects and to improve the experience of VR and AR worlds?
We invite researchers and visionaries to submit their latest results on any aspect relevant to multimodality and interaction in VR and AR. Contributions of a more fundamental nature (e.g., psychophysical studies and empirical research on multimodality) are welcome, as are technical contributions (including use cases, best-practice demonstrations, prototype systems, etc.). Position papers and reviews of the state of the art and ongoing research are also invited. Submissions do not necessarily have to address multiple modalities; work focusing on a single modality that goes beyond the state of the art of “purely visual” systems (e.g., papers on smell, taste, and haptics) is equally suitable.