CloudFlow 2012 - First International Workshop on Workflow Models, Systems, Services and Applications in the Cloud
Topics/Call for Papers
Cloud computing is gaining tremendous momentum in both academia and industry, and more and more people are migrating their data and applications into the Cloud. We have observed wide adoption of the MapReduce computing model and the open-source Hadoop system for large-scale distributed data processing, as well as a variety of ad hoc mashup techniques that weave together Web applications. However, these are only first steps toward managing complex task and data dependencies in the Cloud; more challenging issues remain, such as large parameter-space exploration, data partitioning and distribution, scheduling and optimization, smart reruns, and provenance tracking associated with workflow execution.
The Cloud needs structured and mature workflow technologies to handle such issues, and vice versa: the Cloud offers unprecedented scalability to workflow systems and could potentially change the way we perceive and conduct research and experiments. On the Cloud, the scale and complexity of the science and data-analytics problems that can be handled increase greatly, and the on-demand nature of Cloud resource allocation also helps improve resource utilization and the user experience.
Although Cloud computing provides a paradigm-shifting, utility-oriented computing model, with datacenter-scale resource pools of unprecedented size and on-demand resource provisioning, many challenges remain in bringing the Cloud and workflows together. We need high-level languages and computing models for large-scale workflow specification; we need to adapt existing workflow architectures to the Cloud and integrate workflow systems with Cloud infrastructure and resources; and we need to leverage Cloud data-storage technologies to efficiently distribute data over large numbers of nodes and to exploit data locality during computation. We organize the CloudFlow workshop as a venue for the workflow and Cloud communities to define models and paradigms, present their state-of-the-art work, share their thoughts and experiences, and explore new directions in realizing workflows in the Cloud.
Other CFPs
- Second International Workshop on Accelerators and Hybrid Exascale Systems (AsHES)
- Parallel Computing and Optimization
- The Eighth International Workshop on System Management Techniques, Processes and Services
- 9th High-Performance Grid and Cloud Computing Workshop
- Statistics in Quality Control - Critical decisions-Risks & Basics
Last modified: 2011-09-29 16:38:30