EISE 2016 - 1st Workshop on Embodied Interaction with Smart Environments
Topics/Call for Papers
Our homes are becoming increasingly smart through modular hardware and software apps that control home automation functions such as setting the room temperature or starting the washing machine. Mobile robots, too, are entering our homes as vacuum cleaners, mobile phone platforms, or toys. Each of these comes with its own interface, resulting in a multitude of interaction devices with differing interaction philosophies. In addition, the increasingly embodied capabilities of smart devices, ranging from ambient actions (light, sound, ...) to moving objects (robots, furniture, ...), lead to an overwhelming amount of information and control that must be mastered within such a convoluted environment. Yet, despite large research efforts, the main modality of interaction with smart home devices is often still a challenging graphical interface.
Such a complex situation opens up a range of new research questions pertaining to interaction with smart environments. How can they be made more intuitive and adaptive? And how should agency, or the explicit lack of it, be handled, i.e. whom should one address when specifying a command or a goal situation? In this workshop we want to address the question of how the various installations inside a smart environment can be used as intuitive means of interaction.
On the one hand, this entails questions regarding the human partner: to what extent do users profit from embodied interaction partners (e.g. virtual agents or robots) as opposed to non-embodied devices? Do the different devices and agents have to provide a coherent interaction? On the other hand, it entails questions of situation awareness with respect to the interaction partner: how can the environment be attentive to the interaction partner's intentions without having to overhear all conversations and interactions that are not addressed to it?
Last modified: 2016-07-27 17:10:51