
NIPS 2015 Workshop: Non-convex Optimization for Machine Learning: Theory and Practice

Date: 2015-12-12

Deadline: 2015-10-20

Venue: Montreal, Canada


Website: https://sites.google.com/site/nips2015no...

Topics/Call for Papers

NIPS 2015 Workshop
Non-convex Optimization for Machine Learning: Theory and Practice
December 12th, 2015
Montreal, Canada
https://sites.google.com/site/nips2015nonconvexopt...
Important Dates:
Submission deadline: October 20, 2015
Acceptance notification: October 25, 2015
Workshop: December 12, 2015
###
WORKSHOP OVERVIEW
Non-convex optimization is ubiquitous in machine learning. In general, reaching the global optimum of such a problem is NP-hard, and in practice, local search methods such as gradient descent can get stuck in spurious local optima and suffer from poor convergence. Over the last few years, tremendous progress has been made in establishing theoretical guarantees for many non-convex optimization problems. While there are worst-case instances that are computationally hard to solve, the focus has shifted to characterizing transparent conditions under which these problems become tractable. In many instances, these conditions turn out to be mild and natural for machine learning applications.
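To make the failure mode above concrete, here is a minimal sketch in plain Python (our illustration, not part of the call): gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x converges to either a spurious local minimum or the global minimum depending purely on its starting point.

    # Minimal sketch: gradient descent on a non-convex function can get
    # stuck in a spurious local minimum depending on initialization.
    # f(x) = x^4 - 3x^2 + x has a spurious local minimum near x ~ 1.13
    # and a global minimum near x ~ -1.30. (Hypothetical example chosen
    # for illustration; it does not appear in the workshop call.)

    def grad_f(x):
        # Derivative of f(x) = x^4 - 3x^2 + x.
        return 4 * x**3 - 6 * x + 1

    def gradient_descent(x0, lr=0.01, steps=2000):
        x = x0
        for _ in range(steps):
            x -= lr * grad_f(x)
        return x

    print(gradient_descent(2.0))   # ~  1.13: stuck in the spurious local minimum
    print(gradient_descent(-2.0))  # ~ -1.30: reaches the global minimum

Characterizing when such spurious basins are absent, or how to escape them, is exactly the kind of question the workshop targets.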
Many challenging open problems remain in the area of non-convex optimization. While guarantees have been established in individual instances, there is no unifying theory of what makes a non-convex problem tractable. Challenging instances such as optimization for training multi-layer neural networks, or the analysis of novel regularization techniques such as dropout for non-convex optimization, remain wide open. On the practical side, conversations between theorists and practitioners can help identify which conditions are reasonable for specific applications, and thus lead to the design of practically motivated algorithms for non-convex optimization with rigorous guarantees.
This workshop will fill an important gap by bringing together researchers from disparate communities and bridging the divide between theoreticians and practitioners. To facilitate that discussion, we aim to make the workshop accessible to people currently unfamiliar with the intricate details of these methods.
###
SUBMISSIONS
We welcome submissions to the workshop on topics of interest, which include but are not limited to:
- spectral learning for matrices and tensors
- dictionary learning
- online algorithms and advances in stochastic gradient descent for non-convex optimization
- applications of non-convex optimization to diverse domains such as natural language processing, social networks, health informatics, and biological sequence analysis
Papers submitted to the workshop should be up to four pages long, excluding references, and in NIPS 2015 format. They should be sent by email to nips15nco@gmail.com. Accepted submissions will be presented as posters.
Open Problems:
We also welcome open problems in the general area of non-convex optimization that are of interest to the NIPS community.
###
INVITED SPEAKERS
Sanjeev Arora [Princeton]
Kevin C. Chen [Rutgers]
Yann LeCun [Facebook]
Hossein Mobahi [MIT]
Chris Re [Stanford]
Greg Valiant [Stanford]
WORKSHOP ORGANIZERS
Animashree Anandkumar [UC Irvine]
Kamalika Chaudhuri [UC San Diego]
Percy Liang [Stanford]
Sewoong Oh [UIUC]
U N Niranjan [UC Irvine]
