
2021 - Workshop on Computational Aspects of Deep Learning at ICPR 2020

Date: 2021-01-10 - 2021-01-15

Deadline: 2020-10-10

Venue: Milan, Italy


Website: https://ailb-web.ing.unimore.it/cadl2020

Topics/Call for Papers

AIMS AND SCOPE
===
Deep Learning has been the most significant breakthrough of the past 10 years: it has radically shifted research methodology towards a data-oriented approach, in which learning involves all steps of the prediction pipeline. In this context, optimization and careful design of neural architectures play an increasingly important role that directly affects the pace of research, the effectiveness of state-of-the-art models, and their applicability at production scale.
The ICPR workshop on “Computational Aspects of Deep Learning” fosters the submission of research works that focus on the development of optimized deep neural network architectures and on the optimization of existing ones, including their deployment on highly scalable systems. This includes training on large-scale or high-dimensional datasets, the design of novel architectures and operators that increase the efficacy or efficiency of feature extraction and classification, the optimization of hyperparameters to enhance model performance, and solutions for training on multi-node systems such as HPC clusters.
The workshop targets any research field related to pattern recognition, ranging from computer vision to natural language processing and multimedia, in which data- and computationally-intensive architectures are needed to solve key research issues. The workshop also welcomes constructive criticism of current data-intensive trends in machine learning and encourages new perspectives and solutions on the matter. Submissions should address computationally intensive scenarios from the point of view of architectural design, data preparation and processing, operator design, training strategies, and distributed and large-scale training. Quantitative comparisons of existing solutions and datasets are also welcome to raise awareness of the topic.
PRIZES
===
CADL is organized in collaboration with NVIDIA AI Technology Center. The best paper will be awarded a Titan RTX GPU (or equivalent) offered by NVIDIA.
TOPICS
===
Topics of interest include, but are not limited to, the following:
- Design of innovative architectures and operators for data-intensive scenarios
- Video understanding and spatio-temporal feature extraction
- Distributed reinforcement learning algorithms
- Applications of large-scale pre-training techniques
- Distributed training approaches and architectures
- HPC and massively parallel architectures in Deep Learning
- Frameworks and optimization algorithms for training Deep Networks
- Model pruning and gradient compression techniques to reduce computational complexity
- Design, implementation and use of hardware accelerators
