CIRSE 2010 - The 2nd International Workshop on Contextual Information Access, Seeking and Retrieval Evaluation
Date2010-03-28
Deadline2010-01-20
VenueMilton Keynes, United Kingdom
Websitehttps://www.irit.fr/CIRSE
Topics/Call fo Papers
The 2nd International Workshop on Contextual Information Access, Seeking and Retrieval Evaluation
in conjunction with ECIR-2010
Milton Keynes, UK, March 28th, 2010
Aims
Since the 1990s, interest in the notion of context in Information Access, Seeking and
Retrieval has increased. Many researchers have been concerned with the use of context in
adaptive, interactive, personalized or collaborative systems, the design of explicit
and implicit feedback techniques, the investigation of relevance, and the application of a notion of
context to problems such as advertising or mobile search.
The previous edition of this workshop, held in Toulouse (CIRSE 2009), and other workshops and
conferences, e.g. IR in Context (IRiX, 2005), Adaptive IR (AIR, 2006, 2008),
Context-based IR (CIR, 2005, 2007) and Information Interaction in Context (IIiX, 2006, 2008),
gathered researchers exploring theoretical frameworks and applications focused
on contextual IR systems.
An important issue which gave rise to discussion has been evaluation. It is commonly accepted
that the traditional evaluation methodologies used in the TREC, CLEF, NTCIR and INEX campaigns
are not always suitable for considering the contextual dimensions of the information
seeking/access process. Indeed, laboratory-based or system-oriented evaluation is challenged by
the presence of contextual dimensions such as user interaction, profile or environment, which
significantly impact the relevance judgments or usefulness ratings made by the end user.
Therefore, new research is needed to understand how to overcome the challenge of user-oriented
evaluation and to design novel evaluation methodologies and criteria for contextual information
retrieval evaluation.
This workshop aims to have a major impact on future research by bringing together IR researchers
working on, or interested in, the evaluation of approaches to contextual information access, seeking
and retrieval, in order to foster discussion, exchange ideas on the related issues, and promote
discussion of future directions for evaluation.
Topics
Both theoretical and practical research papers are welcome from the research and industrial communities,
addressing the main workshop theme.
Original and unpublished papers are welcome on any aspect, including:
User, system, context and task modelling for information access, seeking and retrieval evaluation.
Novel techniques for implicit or explicit feedback evaluation.
Learning algorithms that use non-traditional relevance judgments (click-through data, query streams, user interactions, …).
Novel evaluation measures, test collections and methodologies of operational evaluation, or extensions of traditional ones.
Contextual and user simulation algorithms.
Accuracy evaluation of personal profiles built using implicit set-level responses.
Merging rankings from collaborative system outputs.
Application and evaluation of context-based systems for distributed retrieval, personal search, mobile search,
digital libraries, archives and museums.
Application and evaluation of context-based access to broadcast television recordings, images, videos and music collections.
Last modified: 2010-06-04 19:32:22