DISCML 2013 - 5th Workshop on Discrete and Combinatorial Problems in Machine Learning
Date: 2013-12-09
Deadline: 2013-10-09
Venue: Lake Tahoe, United States
Website: https://www.discml.cc
Topics / Call for Papers
5th Workshop on
Discrete and Combinatorial Problems in Machine Learning (DISCML)
at the
Annual Conference on Neural Information Processing Systems (NIPS 2013)
Lake Tahoe
http://www.discml.cc
Submission Deadline: 9th October 2013
(max. 6 pages, NIPS format)
==============================================================
Combinatorial structures and optimization problems with discrete
solutions are becoming a core component of machine learning. When
aiming to process larger quantities of more complex data, one may
quickly find oneself working with graphs, relations, partitions, or
sparsity structures, or predicting structured, sparse estimators, or
combinatorial objects such as permutations, trees or other graphs,
group structures, and so on.
While complex structures enable much richer applications, they often
come with the caveat that the related learning and inference problems
become computationally very hard. When scaling to large data,
combinatorial problems also add challenges to compact representation,
streaming, and distributed computation.
Despite discouraging theoretical worst-case results, many practically
interesting problems are much better behaved when modeled
appropriately. The beneficial properties are most often structural and
include symmetry, exchangeability, sparsity, and submodularity. The
DISCML workshop revolves around such structures in machine learning,
their theory and applications.
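To make the benefit of such structure concrete, here is a minimal Python
sketch (illustrative only, not part of the original call; the coverage
function and data are hypothetical examples). Set coverage is monotone
submodular, and for such objectives plain greedy selection is guaranteed
to come within a factor of (1 - 1/e) of the best size-k subset
(Nemhauser, Wolsey, and Fisher, 1978), even though the exact problem is
NP-hard in general.

def coverage(selected, sets):
    # f(S) = number of ground-set elements covered by the union of the
    # chosen sets; coverage is monotone and submodular.
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, k):
    # Repeatedly add the set with the largest marginal gain; for monotone
    # submodular f this achieves at least (1 - 1/e) of the optimum.
    selected = []
    for _ in range(k):
        gains = [(coverage(selected + [i], sets) - coverage(selected, sets), i)
                 for i in range(len(sets)) if i not in selected]
        selected.append(max(gains)[1])
    return selected

# Illustrative data: four candidate sets over the ground set {1, ..., 7}.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
print(greedy_max(sets, 2))  # -> [2, 0], covering all seven elements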
The workshop will feature invited keynote lectures as well as contributed
spotlight talks and posters.
We would like to invite high-quality submissions that present
* recent results related to discrete and combinatorial problems in
machine learning, or
* open problems, controversial questions and observations.
Areas of interest include (but are not limited to):
* learning and optimization (combinatorial algorithms, submodular
optimization, discrete convex analysis, pseudo-Boolean optimization,
online learning, structure learning),
* continuous relaxations (sparse reconstruction, regularization),
* combinatorics and big data (streaming, sketching, subset selection,
parallel and distributed combinatorial algorithms),
* random combinatorial structures and combinatorial stochastic processes,
* applications, e.g. combinatorial approaches to information retrieval,
speech and natural language processing, computer vision, or bioinformatics.
Submissions:
NIPS 2013 format (max. 6 pages) to submit@discml.cc.
Deadline: October 9, 2013.
Organizers:
Stefanie Jegelka (UC Berkeley),
Andreas Krause (ETH Zurich, Switzerland),
Jeff A. Bilmes (University of Washington, Seattle),
Pradeep Ravikumar (University of Texas at Austin)