
2012 - IEEE Trans. Neural Networks Special Issue: Online Learning in Kernel Methods

Date: 2012-03-31

Deadline: 2011-05-01


Topics/Call for Papers

IEEE Trans. Neural Networks Special Issue: Online Learning in Kernel Methods

Online learning is one of the most powerful and commonly used techniques for training adaptive filters and has been used successfully in neural networks. The last decade has also witnessed a flurry of research in Mercer kernel methods, such as the support vector machine (SVM), kernel regression, and kernel principal component analysis. All of these techniques rely on algorithms that operate on a large matrix (the Gram matrix), which makes them computationally and memory intensive. It is theoretically possible to reach the neighborhood of the optimal solution using gradient descent techniques with simpler, less memory-intensive algorithms. The literature already contains important algorithms for online learning with kernels, such as resource-allocating networks, growing and pruning radial basis function networks, kernel recursive least-squares algorithms, kernel least-mean-square algorithms, and kernel affine projection algorithms.

These advances are slowly evolving into a new adaptive system theory that can be encapsulated under the name Online Learning in Kernel Methods (OLKM). OLKM uses Mercer kernels to map the input space nonlinearly into a high-dimensional hidden space and uses linear adaptive structures (filters, regressors, classifiers) to meet the adaptation requirement. In so doing, it preserves the conceptual simplicity of linear adaptive filters (no local minima) and inherits the rich expressiveness of kernel methods (the universal approximation property). Although OLKM has found applications in signal processing, pattern recognition, data mining, information retrieval, and demand forecasting, the theory itself is far from complete. This special issue intends to attract papers that advance the mathematical foundations, applications, and understanding of these methodologies.
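
As a rough illustration of the OLKM structure described above (a Mercer kernel mapping combined with an LMS-style linear update in the feature space), the following Python sketch implements the kernel least-mean-square recursion. It is a minimal sketch under stated assumptions, not a reference implementation: the class and parameter names are hypothetical, a Gaussian kernel is assumed, and the dictionary grows without any pruning or sparsification.

import numpy as np


class KLMS:
    """Online regression in the kernel-induced feature space via
    stochastic gradient descent (no Gram matrix is ever formed)."""

    def __init__(self, step_size=0.5, kernel_width=1.0):
        self.eta = step_size          # LMS step size (assumed value)
        self.sigma = kernel_width     # Gaussian kernel bandwidth (assumed)
        self.centers = []             # stored inputs (the "dictionary")
        self.coeffs = []              # one coefficient per stored input

    def _kernel(self, x, c):
        # Gaussian (Mercer) kernel between two input vectors
        diff = np.asarray(x) - np.asarray(c)
        return np.exp(-np.dot(diff, diff) / (2.0 * self.sigma ** 2))

    def predict(self, x):
        # f(x) = sum_i a_i * k(x, c_i): a linear filter in feature space
        return sum(a * self._kernel(x, c)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        # One online step: predict, compute the error, grow the expansion
        # by one term weighted by eta * error (the KLMS recursion).
        e = d - self.predict(x)
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.eta * e)
        return e


# Toy usage: track a nonlinear target sample by sample.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    klms = KLMS(step_size=0.5, kernel_width=0.7)
    for _ in range(200):
        x = rng.uniform(-1, 1, size=2)
        d = np.sin(3 * x[0]) + 0.5 * x[1] ** 2   # unknown nonlinear mapping
        klms.update(x, d)
    print("prediction:", klms.predict(np.array([0.3, -0.4])))

Because each step only appends one center and one coefficient, the memory and per-step cost grow linearly with the number of samples rather than quadratically with a stored Gram matrix; the growing and pruning techniques listed among the topics below address exactly this growth.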

We invite original and unpublished research contributions in all areas relevant to online learning with kernels. Papers should present original work or review state-of-the-art approaches summarizing recent advances in the following non-exhaustive list of topics:

- Online learning for kernel adaptive systems
- Kernelization of online learning techniques
- Optimization, growing and pruning techniques, and kernel design for online kernel learning
- Information-theoretic learning principles in kernel adaptive systems
- Multidimensional kernel adaptive systems (complex, quaternion, and multichannel)
- Convergence, steady-state, and error-bound analysis of online kernel algorithms
- New applications of online learning with kernels

Prospective authors should visit http://ieee-cis.org/pubs/tnn/papers/
for information on paper submission. Manuscripts should be submitted
using the Manuscript Central system at http://mc.manuscriptcentral.com/tnn.
On the first page of the manuscript as well as on the cover letter,
indicate clearly that the manuscript is submitted to the TNN Special
Issue: Online Kernel Learning. Manuscripts will be peer reviewed
according to the standard IEEE process.

Manuscript submission due: May 1, 2011
First review completed: October 1, 2011
Revised manuscript due: December 1, 2011
Second review completed: March 1, 2012
Final manuscript due: March 31, 2012

Guest editors:
Dr. Jose C. Principe, University of Florida, USA,
principe-AT-cnel.ufl.edu
Dr. Seiichi Ozawa, Kobe University, Japan, ozawasei-AT-kobe-u.ac.jp
Dr. Sergios Theodoridis, University of Athens, Greece,
stheodor-AT-di.uoa.gr
Dr. Tülay Adali, University of Maryland, Baltimore County, USA,
adali-AT-umbc.edu
Dr. Danilo P. Mandic, Imperial College London, UK,
d.mandic-AT-imperial.ac.uk
Dr. Weifeng Liu, Amazon.com, USA, weifeng-AT-ieee.org
