
Multi-task 2014 - Workshop on Transfer and Multi-task learning: Theory Meets Practice

Date: 2014-12-12

Deadline: 2014-10-10

Venue: Quebec, Canada

Website: http://sites.google.com/site/multitaskwsnips2014

Topics / Call for Papers

NIPS 2014 Workshop
Transfer and Multi-task learning: Theory Meets Practice
December 12, 2014
Montréal, Quebec, Canada
https://sites.google.com/site/multitaskwsnips2014/
Transfer, domain adaptation, and multi-task learning methods were developed to better exploit the data available at training time, originally motivated by the need to cope with a limited amount of information. Nowadays, gathering data is much easier than in the past, thanks to inexpensive acquisition devices (e.g., cameras) and to the World Wide Web, which connects millions of device users. Existing methods must therefore embrace new challenges: large-scale data no longer lack in size, but they may lack in quality or may change continuously over time.
Following the first edition of the workshop, in which we investigated new directions for learning across domains, the NIPS 2014 workshop on Transfer and Multi-task learning focuses on theory and practical applications such as large-scale learning, natural language processing, computational biology, and deep learning. The program will include five invited talks, spotlight presentations of novel contributions, and poster sessions. We call for extended abstracts on the topics of interest for the workshop, namely:
Learning task similarities/dissimilarities from large amounts of data (a baseline sketch of task coupling follows this list)
Large scale domain adaptation and dataset bias: aligning multiple existing but partially related data collections
Limitations of multi-task learning when the number of tasks (T) grows while each task is described by only a small number (n) of samples (“big T, small n”)
Leveraging noisy data gathered from the Internet as a reference for a new task
Life-long learning of a continuous stream of information: incremental, online and active transfer for open-ended learning
Deep Learning in domain adaptation, transfer and multi-task applications
Innovative applications, e.g., natural language processing tasks such as machine translation and syntactic or semantic parsing
Theory meets practice, e.g., in computer vision and computational biology
Multi-task and transfer in reinforcement learning
Innovative regularization strategies
Domain adaptation theory
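Several of the topics above concern coupling related tasks through shared structure. As a concrete baseline illustration, the following minimal Python sketch implements mean-regularized multi-task ridge regression, where each task's weight vector is pulled toward the average of all task weights. This is a didactic sketch under simple assumptions, not material from the workshop; the function multi_task_ridge and all parameter names (lam, gamma, lr) are hypothetical.

import numpy as np

# Minimal sketch, for illustration only: mean-regularized multi-task
# ridge regression. Each task t fits weights w_t to its own data while
# a coupling term pulls w_t toward the mean of all task weights, so
# related tasks share statistical strength. All names are hypothetical.
def multi_task_ridge(Xs, ys, lam=0.1, gamma=0.01, lr=0.05, epochs=500):
    """Xs, ys: lists of per-task design matrices and target vectors.
    lam controls task coupling; gamma is an ordinary ridge penalty."""
    T, d = len(Xs), Xs[0].shape[1]
    W = np.zeros((T, d))                      # one weight vector per task
    for _ in range(epochs):
        w_bar = W.mean(axis=0)                # shared component across tasks
        for t in range(T):
            n = len(ys[t])
            grad = Xs[t].T @ (Xs[t] @ W[t] - ys[t]) / n  # squared-loss gradient
            grad += lam * (W[t] - w_bar)      # pull task t toward the mean
            grad += gamma * W[t]              # standard ridge shrinkage
            W[t] -= lr * grad
    return W

# Toy usage: three related linear tasks whose true weights differ only
# by small task-specific perturbations.
rng = np.random.default_rng(0)
w_shared = rng.normal(size=5)
Xs = [rng.normal(size=(30, 5)) for _ in range(3)]
ys = [X @ (w_shared + 0.1 * rng.normal(size=5)) for X in Xs]
W = multi_task_ridge(Xs, ys)

Setting lam = 0 decouples the tasks into independent ridge regressions, while a very large lam forces a single shared model; methods in the topic areas above aim to learn such coupling structure from the data rather than fixing it a priori, and to scale it to many tasks.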
Preference will be given to manuscripts that propose new principled methods for the listed topics or that are likely to generate debate by raising general issues or suggesting directions for future work. Submissions should be written as extended abstracts, no longer than 4 pages in the NIPS LaTeX style (plus references). The extended abstract may be accompanied by an unlimited appendix and other supplementary material, with the understanding that anything beyond 4 pages may be ignored by the program committee. Style files and formatting instructions can be found at http://nips.cc/PaperInformation/StyleFiles.
Papers should be submitted by October 10 via https://cmt.research.microsoft.com/MLWS2014. Notifications of acceptance will be sent on October 24. Work that was recently published or presented elsewhere is allowed, provided that the extended abstract mentions this explicitly.
Important Dates
Submission deadline: October 10th, 2014
Acceptance decision: October 24th, 2014
Workshop: December 12th or 13th, 2014
Invited Speakers
Yoshua Bengio
Hal Daumé III
Christoph Lampert
Gunnar Rätsch
Mehryar Mohri
Organizers
Urun Dogan - Skype Labs / Microsoft
Marius Kloft - Humboldt University of Berlin
Andres Munoz - Courant Institute of Mathematical Sciences
Francesco Orabona - Toyota Technological Institute, Chicago
Tatiana Tommasi - KU Leuven
