
MPP 2020 - 8th Workshop on Parallel Programming Models - Special Edition on IoT and Machine Learning

Date: 2020-05-18

Deadline: 2020-01-27

Venue: New Orleans, Louisiana, USA

Website: http://www.ipdps.org/ipdps2020

Topics/Call for Papers

Machine learning has become ubiquitous, and recent publications show that research is needed to improve the performance, power consumption and, most recently, the security of current ML algorithms. The performance issues are well understood and have been addressed by multiple researchers, either with new algorithms for model compression/parallelization or with new special-purpose hardware. Roughly speaking, security threats against ML models can be categorized into privacy leakage and model evasion ("fooling" the system into making wrong decisions). This edition of the Workshop on Parallel Programming Models (MPP) focuses on addressing these issues, which are of the utmost importance to the Machine Learning community at this point.
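As an illustration of what model evasion means in practice, a minimal sketch in Python follows. PyTorch and the Fast Gradient Sign Method (FGSM) are assumptions chosen for the example; the workshop does not prescribe any particular attack, framework, or model.

    # Minimal model-evasion sketch: FGSM perturbs an input so that a classifier
    # is "fooled" into a different decision. The model, data and epsilon are toy
    # placeholders, not anything specified by the workshop.
    import torch
    import torch.nn as nn

    def fgsm_perturb(model, x, y, epsilon=0.03):
        # Compute the loss gradient with respect to the input and step in the
        # direction that increases the loss, bounded by epsilon per feature.
        x_adv = x.clone().detach().requires_grad_(True)
        loss = nn.functional.cross_entropy(model(x_adv), y)
        loss.backward()
        return (x_adv + epsilon * x_adv.grad.sign()).detach()

    # Toy usage with an untrained linear classifier on random data; real
    # evaluations use trained networks and clamp inputs to a valid range.
    model = nn.Linear(784, 10)
    x, y = torch.rand(1, 784), torch.tensor([3])
    x_adv = fgsm_perturb(model, x, y)
    print(model(x).argmax().item(), "->", model(x_adv).argmax().item())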
When addressing the performance aspect of Machine Learning, there is also the issue of the amount of data used to train deep-learning models. In the case of Big Data, the application of in-memory computing (the main topic of MPP 2019) can be essential to reduce the latency gap between the data and the ML model.
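As a rough sketch of that latency point, the toy Python comparison below keeps a synthetic training set resident in RAM instead of going back to storage every epoch. The array sizes, file name and use of NumPy are assumptions for illustration only, not a description of any in-memory computing platform.

    # Toy contrast: re-reading the training data from storage each epoch
    # versus loading it once and keeping it resident in memory.
    import numpy as np, tempfile, os, time

    x = np.random.rand(50_000, 128).astype(np.float32)
    path = os.path.join(tempfile.mkdtemp(), "train.npy")
    np.save(path, x)

    def epoch(data):
        total = 0.0
        for i in range(0, len(data), 256):
            total += float(data[i:i + 256].sum())  # stand-in for a training step
        return total

    t0 = time.perf_counter()
    for _ in range(3):
        epoch(np.load(path, mmap_mode="r"))  # re-opens the file every epoch
    t1 = time.perf_counter()

    resident = np.load(path)  # one read; batches are then served from RAM
    for _ in range(3):
        epoch(resident)
    t2 = time.perf_counter()

    print(f"storage-backed: {t1 - t0:.2f}s  in-memory: {t2 - t1:.2f}s")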
MPP aims at bringing together researchers interested in presenting contributions to the evolution of existing models or in proposing novel ones, considering the trends in Machine Learning, In-Memory Computing and Security. MPP 2020 will be held in conjunction with the 34th IEEE International Parallel and Distributed Processing Symposium (IPDPS 2020) in New Orleans, Louisiana, USA, on May 22, 2020.
Submission Guidelines
MPP invites authors to submit unpublished full or short papers on the subject. Submissions must be in English and follow the IEEE formatting guidelines. Page limits are yet to be determined.
List of Topics
Topics of interest include (but are not limited to):
Compression of Deep-Learning Models;
Tools for ML Model design;
Hardware specifically designed for Machine Learning;
In-Memory Computing;
Novel Deep Neural Network architectures;
Error Detection/Recovery in ML systems;
Robust Neural Networks;
Privacy of data in ML systems;
Robustness of decision making ML systems;
Neural network inference and training in IoT, Fog, Edge and Cloud environments;
Machine Learning for Parallel Applications and IoT.
