ERM4HCI 2013 - Workshop on Emotion Representations and Modelling for HCI
Topics/Call for Papers
This workshop complements ICMI 2013 by offering a platform focused on emotion representations and modelling for Human-Computer Interaction (HCI) systems. Emotion representations and models are often both modality- and discipline-specific and are hence hardly interoperable. The workshop encourages the discussion of both technical and theoretical approaches to emotion modelling and representation, in order to aid the development of efficient, verifiable and implementable emotion models for affective systems.
Abstract
To develop user-adaptable HCI, the role of emotions occurring during interaction has become increasingly important in recent years. Emotions, widely accepted as essential to human-human interaction, have become relevant to designers of affective systems aiming to provide a natural, user-centred interaction. However, to adequately incorporate emotions into modern HCI systems, results from diverse research disciplines must be combined.
An affective system must be able to detect, identify, process and respond to emotions occurring during HCI in real time. Hence, the methods involved in emotion processing must be reliable, efficient and well-defined throughout the system's abstraction layers. To fulfil these requirements, conceptual work is needed to define suitable technical models of emotion, disposition and behaviour. To allow technical systems to automatically process and respond to emotions, these models must link machine-detectable physical human characteristics to abstract strategic algorithms, and must be consistent with the predominant emotion theories and observations.
Emotions occurring during HCI are detected on the basis of features extracted from the available modalities, which makes the identification of significant features essential. Furthermore, these features must be processed adequately to allow emotions to be distinguished automatically. In multimodal systems, suitable fusion techniques must be defined to maximize the accuracy of emotion detection and synthesis. To respond to the detected emotions, the system should be equipped with artificial emotional intelligence, which in turn requires a combination of knowledge-base processing, history processing, plan construction, coping strategies, user models, and emotion and behaviour theories.
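The decision-level fusion of per-modality emotion classifiers mentioned above can be sketched minimally as a weighted average of probability distributions over emotion labels. The emotion labels, modality names and reliability weights below are purely illustrative assumptions, not part of any specific system discussed at the workshop:

```python
# Minimal sketch of decision-level (late) fusion for multimodal emotion
# detection. Each modality-specific classifier is assumed to output a
# probability distribution over a shared set of emotion labels; the
# fused distribution is a reliability-weighted average.

EMOTIONS = ["anger", "joy", "neutral", "sadness"]  # illustrative label set

def fuse(modality_scores, weights):
    """Combine per-modality emotion probabilities by weighted averaging."""
    fused = {e: 0.0 for e in EMOTIONS}
    total = sum(weights[m] for m in modality_scores)  # normalize weights
    for modality, scores in modality_scores.items():
        w = weights[modality] / total
        for emotion, p in scores.items():
            fused[emotion] += w * p
    return fused

def classify(modality_scores, weights):
    """Return the emotion label with the highest fused probability."""
    fused = fuse(modality_scores, weights)
    return max(fused, key=fused.get)

# Hypothetical example: the speech classifier is confident about anger,
# the facial-feature classifier leans towards neutral; the weights
# encode an assumed per-modality reliability.
scores = {
    "speech": {"anger": 0.6, "joy": 0.1, "neutral": 0.2, "sadness": 0.1},
    "face":   {"anger": 0.3, "joy": 0.1, "neutral": 0.5, "sadness": 0.1},
}
weights = {"speech": 0.7, "face": 0.3}
print(classify(scores, weights))  # -> anger
```

In practice the fusion weights would be learned or tuned per application, and feature-level (early) fusion is an equally common alternative; this sketch only illustrates why a shared, interoperable emotion representation across modalities is a prerequisite for any such combination step.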
Several solutions for incorporating emotions into technical systems currently exist. However, these solutions depend heavily on the scope, application and modalities used; the applied concepts tend to be highly specific and layer-dependent, often lacking generality and interoperability. To allow cognitive technical systems to become truly affective and user-adaptable, all methods involved must rest on a reliable theoretical foundation and must interoperate.
Last modified: 2013-04-23 07:01:31