
LTB 2018 - Seventh International Workshop on Load Testing and Benchmarking of Software Systems

Date: 2018-04-09 - 2018-04-12

Deadline: 2018-01-15

Venue: Berlin, Germany


Website: http://ltb2018.eecs.yorku.ca

Topics / Call for Papers

Software systems (e.g., smartphone apps, desktop applications, e-commerce systems, IoT infrastructures, big data systems, and enterprise systems) have strict requirements on software performance. Failure to meet these requirements will cause customer dissatisfaction and negative news coverage. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality of service. Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking evaluates a system's performance and allows practitioners to optimize system configurations or to compare the system with similar systems in the domain.
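For concreteness, the kind of load test described above can be sketched in a few lines of Python: a pool of threads plays the role of simulated concurrent users, each issuing requests against a system under test and recording latencies. This is only an illustrative sketch; the endpoint URL, user count, and request count are hypothetical placeholders and do not describe any particular tool discussed at the workshop.

```python
# Minimal load-test sketch: each thread simulates one user issuing requests
# against a (hypothetical) endpoint, and observed latencies are aggregated.
import concurrent.futures
import statistics
import time
import urllib.request

TARGET_URL = "http://localhost:8080/"   # hypothetical system under test
NUM_USERS = 100                          # simulated concurrent users
REQUESTS_PER_USER = 10

def simulated_user(user_id: int) -> list:
    """Issue a fixed number of requests and return the observed latencies."""
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            urllib.request.urlopen(TARGET_URL, timeout=5).read()
        except Exception:
            continue  # a real harness would record errors rather than drop them
        latencies.append(time.perf_counter() - start)
    return latencies

if __name__ == "__main__":
    with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_USERS) as pool:
        results = list(pool.map(simulated_user, range(NUM_USERS)))
    all_latencies = [lat for user in results for lat in user]
    if all_latencies:
        print(f"requests: {len(all_latencies)}")
        print(f"median latency: {statistics.median(all_latencies):.3f}s")
        print(f"p95 latency: {statistics.quantiles(all_latencies, n=20)[-1]:.3f}s")
```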
Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (setting up the software and hardware), and time (limited time to design, run, and analyze the tests). This one-day workshop brings together software testing researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems.
We solicit two tracks of submissions: research papers (maximum 4 pages) and a presentation track for industry or experience talks (extended abstract of at most 700 words). Technical papers should follow the standard ACM SIG proceedings format and must be submitted electronically via EasyChair. Short abstracts for the presentation track must be submitted as "abstract only" submissions via EasyChair. Accepted technical papers will be published in the ICPE 2018 proceedings. Materials from the presentation track will not be published in the ICPE 2018 proceedings, but will be made available on the workshop website. Submitted papers can be research papers, position papers, case studies, or experience reports addressing issues including, but not limited to, the following:
Efficient and cost-effective test executions
Rapid and scalable analysis of the measurement results
Case studies and experience reports on load testing and benchmarking
Load testing and benchmarking on emerging systems (e.g., adaptive/autonomic systems, big data batch and stream processing systems, and cloud services)
Load testing and benchmarking in the context of agile software development processes
Using performance models to support load testing and benchmarking
Building and maintaining load testing and benchmarking as a service
Efficient test data management for load testing and benchmarking
