BEA 2012 - The 7th Workshop on the Innovative Use of NLP for Building Educational Applications
Topics/Call for Papers
Research in NLP applications for education continues to progress using innovative NLP techniques - statistical, rule-based, or, most commonly, a combination of the two. As a community we continue to improve existing capabilities and to identify and develop innovative ways to use NLP in applications for writing, reading, speaking, critical thinking, curriculum development, and assessment. Steady growth in the development of NLP-based applications for education has prompted an increasing number of workshops, typically focusing on a single subfield.
In this workshop, we solicit papers from all subfields: automated scoring, intelligent tutoring, learner cognition, use of corpora, grammatical error detection, and tools for teachers and test developers. Since the first workshop in 1997, "Innovative Use of NLP in Building Educational Applications" has continued to bring together all NLP subfields to foster interaction and collaboration among researchers in both academic institutions and industry. The workshop offers a venue for researchers to present and discuss their work in these areas. Each year, we see steady growth in workshop submissions and attendance, and the research has become more innovative and advanced. In 2012, we expect that the workshop (consistent with previous workshops at ACL 1997, NAACL/HLT 2003, ACL 2005, ACL 2008, NAACL/HLT 2009, NAACL/HLT 2010, and ACL 2011) will continue to expose the NLP research community to technologies that identify novel opportunities for the use of NLP techniques and tools in educational applications. At ACL 2011, the workshop introduced a poster session that was lively and well attended. We plan to keep poster sessions as a regular feature.
The practical need for language-analysis capabilities has been driven by increased requirements for state and national assessments and by a growing population of foreign and second language learners. There are currently a number of commercial systems that handle automated scoring of free-text and spoken responses in the context of assessment, as well as systems that measure linguistic complexity in text - commonly referred to as readability measures. More recently, the need for language-analysis applications has been underscored by a new influence on the educational landscape in the United States: the Common Core State Standards initiative (http://www.corestandards.org/), coordinated by the National Governors Association Center for Best Practices and the Council of Chief State School Officers. The initiative has now been adopted by 46 states for use in Kindergarten through 12th grade (K-12) classrooms and is likely to have a strong influence on teaching standards in K-12 education. The Common Core standards describe what K-12 students should be learning with regard to Reading, Writing, Speaking, Listening, Language, and Media and Technology. In addition, the Common Core recently released a Publishers Criteria document that describes the array of linguistic elements that learners need to grasp as they progress to the higher grades (http://www.corestandards.org/assets/Publishers_Cri...). The Common Core thereby introduces language-analysis scenarios that align clearly with NLP research and applications.
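As an illustration of the readability measures referred to above, the following minimal sketch computes the classic Flesch-Kincaid Grade Level; the vowel-group syllable counter is a rough heuristic assumed for the example and does not reflect any particular system mentioned in this call.

import re

def count_syllables(word):
    # Approximate syllables as runs of vowels (at least 1 per word); a crude heuristic.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade_level(text):
    # Flesch-Kincaid Grade Level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

print(round(fk_grade_level("The cat sat on the mat. It was happy."), 2))

Longer sentences and longer words drive the score up, which is why such formulas are often used as a first approximation of a text's grade level.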
The workshop will solicit both full papers and short papers for either oral or poster presentation. This year, the Helping Our Own (HOO-2) Shared Task on grammatical error detection will be co-located with the BEA7 workshop.
Given the broad scope of the workshop, we organize it around three central themes in the educational infrastructure:
Development of curriculum and assessment (e.g., applications that help teachers develop reading materials)
Delivery of curriculum and assessments (e.g., applications where the student receives instruction and interacts with the system)
Reporting of assessment outcomes (e.g., automated scoring of free responses)
Topics will include, but will not be limited to, the following:
Automated scoring/evaluation for oral and written student responses
Content analysis for scoring/assessment
Grammatical error detection and correction
Discourse and stylistic analysis
Plagiarism detection
Machine translation for assessment, instruction and curriculum development
Detection of non-literal language (e.g., metaphor)
Sentiment analysis
Intelligent Tutoring (IT) that incorporates state-of-the-art NLP methods
Dialogue systems in education
Hypothesis formation and testing
Multi-modal communication between students and computers
Generation of tutorial responses
Knowledge representation in learning systems
Concept visualization in learning systems
Learner Cognition
Assessment of learners' language and cognitive skill levels
Systems that detect and adapt to learners' cognitive or emotional states
Tools for learners with special needs
Use of corpora in educational tools
Data mining of learner and other corpora for tool building
Annotation standards and schemas / annotator agreement
Tools and applications for classroom teachers and/or test developers
NLP tools for second and foreign language learners
Semantic-based access to instructional materials to identify appropriate texts
Tools that automatically generate test questions
Processing of and access to lecture materials across topics and genres
Adaptation of instructional text to individual learners' grade levels
Tools for text-based curriculum development
E-learning tools for personalized course content
Language-based educational games
Issues concerning the evaluation of NLP-based educational tools
Descriptions of implemented systems
Descriptions and proposals for shared tasks