CrowdScale 2013 - Crowdsourcing at Scale: A workshop at HCOMP 2013
Date: 2013-11-09
Deadline: 2013-10-14
Venue: California, USA
Website: https://www.crowdscale.org
Call for Papers
CrowdScale 2013: Crowdsourcing at Scale
A workshop at HCOMP 2013: The 1st Conference on Human Computation & Crowdsourcing
November 9, 2013
http://www.crowdscale.org
Overview
--------
Crowdsourcing and human computation at scale raise a variety of open challenges compared with crowdsourcing over smaller workloads and labor pools. We believe focusing on such issues of scale will be key to taking crowdsourcing to the next level: from its uptake by early adopters today to its future as how the world’s work gets done. To advance crowdsourcing at scale, CrowdScale will pursue two thrusts:
*Track 1: Position Papers*. We invite submission of 2-page position
papers which identify and motivate focused, key problems or approaches
for crowdsourcing at scale.
*Track 2: Shared Task Challenge*. We invite submissions to a shared task challenge on computing consensus from crowds: how to generate the best possible answer for each question, based on the judgments of five or more raters per question. Participants will submit 4-page papers describing their systems and preliminary results, with *$1500* in prize money awarded to top performers.
One may participate in either or both tracks. Submitted papers will not be peer-reviewed or archived, so work shared in these papers can be later submitted to peer-reviewed venues. All papers will be posted on the workshop website to promote discussion within and beyond workshop participants. Workshop organizers will review all submissions to ensure quality, with high acceptance expected.
Position Papers
---------------
http://www.crowdscale.org/cfp
We invite submission of short (2-page) position papers which identify
and motivate key problems or potential approaches for crowdsourcing at
scale. We encourage submissions identifying and clearly articulating
problems, even if no satisfactory solution is yet proposed.
Submissions focusing on problems should clearly describe a problem of
scale, why it matters, why it is hard, existing approaches, and desired
properties of effective solutions. We welcome early work, and
particularly encourage submission of visionary position papers that are
forward-looking.
Each submitted paper should focus on one problem. We encourage multiple
submissions per author for articulating distinct problem statements or
methods.
During the workshop, authors will self-organize into break-out groups,
with each group further elaborating upon a particular critical area
meriting further work and study. Each group will summarize and report
its findings at the workshop’s close. In continuing discussion beyond
the workshop, organizers and participants will co-author a summary paper articulating a road map of important challenges and approaches for our community to pursue.
Position paper ideas include (but are not limited to):
http://www.crowdscale.org/position-paper-ideas
Shared Task Challenge
---------------------
http://www.crowdscale.org/shared-task
To help advance research on crowdsourcing at scale, CrowdFlower and
Google are sharing two new, large challenge datasets for multi-class
classification. Both datasets are available for immediate download. To
make it easy to participate, we provide the data in multiple formats,
along with pointers to open-source software that can help you get
started.
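As a concrete (and unofficial) starting point, the sketch below computes a simple per-question majority vote over raw rater judgments. It assumes a hypothetical judgments.csv with question_id, rater_id, and label columns; the actual file names, data formats, and evaluation metrics are those defined on the shared task page.

    # Minimal majority-vote baseline (a sketch, not an official baseline).
    # Assumes a hypothetical judgments.csv with question_id, rater_id, label
    # columns; adapt the loading code to the actual shared task formats.
    import csv
    from collections import Counter, defaultdict

    def load_judgments(path):
        """Group raw rater labels by question."""
        labels_by_question = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                labels_by_question[row["question_id"]].append(row["label"])
        return labels_by_question

    def majority_vote(labels_by_question):
        """Pick the most frequent label per question (ties broken arbitrarily)."""
        return {q: Counter(labels).most_common(1)[0][0]
                for q, labels in labels_by_question.items()}

    if __name__ == "__main__":
        consensus = majority_vote(load_judgments("judgments.csv"))
        for question_id, label in sorted(consensus.items()):
            print(question_id, label)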
All participants are expected to submit a paper (up to 4 pages)
describing their method and preliminary results on the shared task metrics,
and to present a poster at the workshop. Final results will be announced at the workshop, with prize money awarded to the best performer(s), as well as recognition during the workshop and in our workshop report.
Shared task participants are also invited to participate in workshop
discussion throughout the day.
*Important Dates*
-----------------
October 14: Position papers due
October 20: Shared task runs due
October 27: Shared task papers due
November 9: Workshop
Please see the workshop website for additional information on the schedule.
*Questions*: Email the organizers at: crowdscale-organizers-AT-googlegroups.com
Workshop Organizers
-------------------
Tatiana Josephy, CrowdFlower
Matthew Lease, University of Texas at Austin
Praveen Paritosh, Google
Advisory Committee
-------------------
Omar Alonso, Microsoft
Ed Chi, Google
Lydia Chilton, University of Washington
Matt Cooper, oDesk
Peng Dai, Google
Benjamin Goldenberg, Yelp
David Huynh, Google
Panos Ipeirotis, Google/NYU
Chris Lintott, Zooniverse/GalaxyZoo
Greg Little, oDesk
Stuart Lynn, Zooniverse/GalaxyZoo
Stefano Mazzocchi, Google
Rajesh Patel, Microsoft
Mike Shwe, Google
Rion Snow, Twitter
Maria Stone, Microsoft
Alexander Sorokin, CrowdFlower
Jamie Taylor, Google
Tamsyn Waterhouse, Google
Patrick Philips, LinkedIn
Sanga Reddy Peerreddy, SetuServ