2011 - Spelling Alteration for Web Search Workshop
Topics/Call for Papers
Spelling Alteration for Web Search Workshop
July 19, 2011
Bellevue, WA, USA
http://spellerchallenge.com/Workshop.aspx
Paper Submission deadline: June 25, 2011
In December 2010, Microsoft Research (MSR) and Bing jointly announced the first MSR-Bing challenge, on the topic of web-scale search query spelling correction. The challenge required participants to submit their entries as publicly accessible web services meeting certain latency requirements, so that all participating systems could be studied and their effectiveness evaluated on any applicable dataset.
We now invite the whole research community to submit papers to the upcoming Spelling Alteration for Web Search Workshop, which will take place on July 19, in Bellevue, WA.
The Spelling Alteration for Web Search workshop addresses the challenges of web-scale Natural Language Processing, with a focus on future research directions in spelling alteration for web search. Participants in the Speller Challenge (details at http://www.spellerchallenge.com) are also encouraged to attend, to exchange ideas and share their experience with the rest of the research community.
Although the workshop is not limited to Speller Challenge participants, we encourage prospective submissions to include the on-demand evaluation web application and the referenced test dataset in their system evaluation, so that different systems and approaches can be compared on the same benchmark. We also encourage submitted systems to be made publicly accessible throughout the workshop to facilitate live demonstrations.
Furthermore, participants will be able to use the resources below to test new ideas against the published benchmarks presented at the workshop.
- On-demand evaluation web application: The challenge makes available a web application that can be invoked on demand to evaluate a spelling correction system conforming to the web service interface defined by the challenge (published at http://www.spellerchallenge.com). The evaluation program uses a standard dataset comprising search queries received by Bing in the EN-US market, manually annotated for typographical errors.
- Web-scale multi-style language models and contextual similarity dataset: Two web services based on Bing's web snapshots of June 2009 and April 2010, containing (1) language models derived from web document body, title, and anchor text, and (2) web document terms appearing in similar lexical contexts, where many spelling errors can be observed.
- Spelling correction development dataset: A query dataset based on the TREC Million Query Track, manually annotated with spelling corrections.
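To make the on-demand evaluation concrete: the challenge evaluates any spelling correction system exposed as a publicly accessible web service. The actual interface is defined at http://www.spellerchallenge.com; the endpoint path, parameter name `q`, JSON response shape, and toy correction table in the sketch below are illustrative assumptions only, not the official specification.

```python
# Hypothetical sketch of a spelling-correction web service in the spirit
# of the Speller Challenge. The real interface (parameters, response
# format) is defined at http://www.spellerchallenge.com; everything
# below is an illustrative assumption.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Toy correction table standing in for a real speller model
# (e.g. one backed by web-scale language models).
CORRECTIONS = {
    "speling": "spelling",
    "serch": "search",
}

def correct(query):
    """Correct each query term, falling back to the term itself."""
    return " ".join(CORRECTIONS.get(t, t) for t in query.lower().split())

class SpellerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Read the query from an assumed ?q=... parameter.
        params = parse_qs(urlparse(self.path).query)
        query = params.get("q", [""])[0]
        body = json.dumps({"query": query,
                           "correction": correct(query)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Serve on port 8080; the challenge additionally imposed latency
    # requirements that a real entry would need to meet.
    HTTPServer(("", 8080), SpellerHandler).serve_forever()
```

A service like this could then be exercised by the on-demand evaluation application against the annotated query datasets described above.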
If you are interested, submit your paper here: https://cmt.research.microsoft.com/SC2011/Default....
Other CFPs
- 23rd IEEE ICTAI - Special Track on Recommender Systems in e-Commerce (RSEC2011)
- 12th International Conference on Parallel Problem Solving From Nature
- First International Online Student Conference on Computer Science
- Ninth International Conference on Simulated Evolution And Learning (SEAL'2012)
- International workshop on Security and Dependability for Resource Constrained Embedded Systems
Last modified: 2011-06-09 22:32:21