CMMR 2012 - The 9th International Symposium on Computer Music Modeling and Retrieval (CMMR) Music and Emotions
Topics/Call for Papers
The 9th International Symposium on Computer Music Modeling and Retrieval (CMMR), themed "Music and Emotions", will take place at Queen Mary University of London on 19-22 June 2012. Jointly organised by the Centre for Digital Music and the CNRS - Laboratoire de Mécanique et d'Acoustique (France), CMMR 2012 welcomes researchers, educators, librarians, (film) music composers, performers, music software developers, members of industry, and others with an interest in computer music modeling, retrieval, analysis, and synthesis to join us for what promises to be a great event.
CMMR is an interdisciplinary conference involving fields such as computer science, engineering, information retrieval, human-computer interaction, digital libraries, hypermedia, artificial intelligence, acoustics, audio and music signal processing, musicology, music perception and cognition, neuroscience, as well as music composition and performance. The first CMMR gatherings (see the CMMR History page) focused mainly on information retrieval, programming, digital libraries, hypermedia, artificial intelligence, acoustics and signal processing. CMMR has gradually moved towards more interdisciplinary aspects: the role of human interaction in musical practice, perceptual and cognitive aspects of sound modelling, and how sense or meaning can be conveyed either by isolated sounds or by musical structure as a whole. All the CMMR gatherings have resulted in post-symposium proceedings containing selected peer-reviewed papers, published by Springer-Verlag in the Lecture Notes in Computer Science series (LNCS 2771, LNCS 3310, LNCS 3902, LNCS 4969, LNCS 5493, LNCS 5954, LNCS 6684), and this is also planned for CMMR 2012.
Music and Emotions
This year, we encourage the submission of contributions on the theme of Music and Emotions. Music can undoubtedly trigger various types of emotions in listeners, and its power to affect our mood may explain why it is such a popular and universal art form: as human listeners, we seem to be hard-wired to enjoy music. Research in cognitive science has shown that, under certain conditions, some pieces of music can enhance our intellectual faculties by changing our mood and inducing positive affect. Music psychology studies have demonstrated our ability to discriminate various types of expressive intentions and emotions in the composer's and performer's musical message. But understanding the genesis of musical emotions, mapping musical variables to emotional responses, and automatically retrieving high-level descriptors that characterise emotions remain complex research problems. Some examples of open research questions are:
Are there any acoustical correlates to emotions which can be used for classification or recommendation purposes, and if so, what are they?
How can we navigate digital music collections using emotional descriptors?
Are there specific cues in the score that can explain our emotional responses?
Which has the greater influence on emotional responses: lyrics or instrumentation?
Can emotions be predicted through the analysis of lyrics?
How do changes of timbre influence the emotional responses of listeners?
How can we synthesise sound and music based on high-level emotion descriptors?
How do performers articulate the emotional intentions of composers?
How do a performer's expressive effects influence the listener's emotions?
Are there invariant emotional responses that transcend cultures, or is the concept of emotions intrinsically linked with our culture and education?
What processes do film and TV music composers use to support (or contrast with) the emotional content expressed by images?
Research questions such as these can be addressed by a multitude of research disciplines, including music information retrieval, musicology, music psychology and cognition, as well as fields like soundtrack composition.
Contributions on other topics, as described in the Call for Contributions, are also welcome.
Last modified: 2011-11-16 19:43:57