WDDD 2011 - WDDD: 4th Workshop on Duplicating, Deconstructing, and Debunking
Date: 2011-06-05
Deadline: 2011-04-11
Venue: San Jose, United States
Keywords:
Website: https://isca2011.umaine.edu
Topics/Call for Papers
WDDD provides the computer systems research community a forum for work that validates or duplicates earlier results; deconstructs prior findings by providing greater, in-depth insight into causal relationships or correlations; or debunks earlier findings by describing precisely how and why proposed techniques fail where earlier successes were claimed, or succeed where failure was reported.
Traditionally, computer systems research conferences have focused almost exclusively on novelty and performance, neglecting an abundance of interesting work that lacks one or both of these attributes. A significant part of research--in fact, the backbone of the scientific method--involves independent validation of existing work and the exploration of strange ideas that never pan out. This workshop provides a venue for disseminating such work in our community. Published validation experiments strengthen existing work, while thorough comparisons provide new dimensions and perspectives. Studies that refute or correct existing work also strengthen the research community, by ensuring that published material is technically correct and has sound assumptions. Publishing negative or strange or unexpected results will allow future researchers to learn the hard lessons of others, without repeating their effort.
This workshop will set a high scientific standard for such experiments, and will require insightful analysis to justify all conclusions. The workshop will favor submissions that provide meaningful insights, and identify underlying root causes for the failure or success of the investigated technique. Acceptable work must thoroughly investigate and communicate why the proposed technique performs as the results indicate. Rebuttals may be invited for debunking submissions.
Submission Topics
Independent validation of earlier results with meaningful analysis
In-depth analysis and sensitivity studies that provide further insight into earlier findings, or identify key parameters or assumptions that affect the results
Studies that refute earlier findings, with clear justification and explanation
Negative results for ideas that intuitively make sense and should work, along with explanations for why they do not
Validation/refutation of controversial advertising claims by industrial competitors
Other CFPs
- MoBS-7: 7th Workshop on Modeling, Benchmarking, and Simulation
- WEED: 3rd Workshop on Energy-Efficient Design
- PESPMA: 4th Workshop on Parallel Execution of Sequential Programs on Multi-core Architectures
- 11th Diverse Annual Conference
- 2nd International Workshop on Multimedia Communications and Networking (MultiCom 2012 )
Last modified: 2011-04-13 14:34:57