SC-BDA 2013 - 2nd IEEE International Workshop on Scalable Computing for Big Data Analytics (SC-BDA)
Topics/Call for Papers
Datasets are growing ever larger, and the research community and enterprises can easily produce voluminous amounts of data. As the amount of available data keeps increasing, the ability to compute over large datasets and to scale up or down dynamically becomes increasingly important. Scalable computing addresses large-scale, high-throughput, and data-intensive computing. It requires advanced parallel and distributed computing technologies, such as GPGPU, in-memory and in-database processing, Hadoop, and cloud computing, to provide highly scalable and efficient solutions for many scientific and engineering problems. Because the ability to scale computations is directly related to resource constraints on the cloud, performance enhancements in resource provisioning and in process/VM management and migration are important issues. In addition to resource provisioning, workflow scheduling of applications, especially those involving big data, is also key to improving cloud performance. The impact that additional resources have on applications is more complex than simply adding more servers to the cloud. Performance and efficiency are also tied to how well virtualization technology is utilized and how securely information can be delivered on the cloud. Because the cloud can be considered a hostile environment, trust must be established among users and service providers.
Last modified: 2013-07-22 22:27:08