ALK 2016 - Special session on Advances in Learning with Kernels: Theory and Practice in a World of growing Constraints
Topics/Call for Papers
Kernel methods have consistently outperformed previous generations of learning techniques. They provide a flexible and expressive learning framework that has been successfully applied to a wide range of real-world problems; recently, however, novel algorithms such as Deep Neural Networks and Ensemble Methods have become increasingly competitive with them.
Due to the current growth of data in size, heterogeneity and structure, the new generation of algorithms is expected to solve increasingly challenging problems. This must be done under growing constraints on computational resources, memory budget and energy consumption. For these reasons, new ideas must emerge in the field of kernel learning, such as deeper kernels and novel algorithms, to close the gap with the most recent learning paradigms.
The purpose of this special session is to highlight recent advances in learning with kernels. In particular, the session welcomes contributions that address the weaknesses of state-of-the-art kernel methods (e.g. scalability, computational efficiency and overly shallow kernels) and build on their strengths (e.g. the ability to deal with structured data). We also encourage the submission of new theoretical results in the Statistical Learning Theory framework and of innovative solutions to real-world problems.
In particular, topics of interest include, but are not limited to:
Budget (time, memory, energy) Learning
Structured input and output (e.g. graph/tree kernels)
Structural Features and Sparse Feature Spaces
Feature learning, weighting and ranking
Large Scale Kernel Methods
Statistical analysis and generalization bounds
Multiple Kernel Learning
Mixed Hard/Soft Constraints
Kernel complexity
Deeper Kernels
Novel Kernelized Algorithms (e.g. online learning, preference learning)
Applications to relevant Real-World Problems
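To fix ideas on the session's central theme, the following is a minimal, self-contained sketch of one classical kernel method, kernel ridge regression with an RBF (Gaussian) kernel. It is a generic textbook illustration of learning with kernels, not a method proposed by the session; all function names and hyperparameter values are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-2):
    """Solve (K + lam * I) alpha = y for the dual coefficients alpha."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_test, gamma=1.0):
    """Predict f(x) = sum_i alpha_i k(x, x_i) on new points."""
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy regression problem: learn y = sin(x) from 50 noisy-free samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X).ravel()
alpha = kernel_ridge_fit(X, y, gamma=0.5, lam=1e-3)
pred = kernel_ridge_predict(X, alpha, X, gamma=0.5)
```

Note that fitting costs O(n^3) in the number of training points, which is precisely the scalability bottleneck that topics such as "Budget Learning" and "Large Scale Kernel Methods" above aim to overcome.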
Other CFPs
- Special session on Physics and Machine Learning: Emerging Paradigms
- Special session on Incremental learning algorithms and applications
- Special session on Information Visualisation and Machine Learning: Techniques, Validation and Integration
- Special session on Machine learning for medical applications
- 24th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Last modified: 2015-08-14 21:13:59