[hpc-announce] LTB Deadline extended to Jan 17, 2017 - CFP Load Testing and Benchmarking of Software Systems @ ICPE

Johannes Kroß kross at fortiss.org
Mon Jan 9 10:45:22 CST 2017


=======================================================================
               The Sixth International Workshop on 
      Load Testing and Benchmarking of Software Systems (LTB 2017)     
                     Co-located with ICPE 2017
                         April 23, 2017
                         L'Aquila, Italy
                  http://ltb2017.eecs.yorku.ca/
=======================================================================

Important Dates
- Technical Papers:                   January 17, 2017
- Paper Notification:                 February 1, 2017
- Camera Ready:                       February 19, 2017
- Presentation Track:                 March 20, 2017
- Presentation Notification:          March 24, 2017
- Workshop Date:                      April 23, 2017


Software systems (e.g., smartphone apps, desktop applications, e-commerce 
systems, IoT infrastructures, big data systems, and enterprise systems) 
have strict requirements on software performance. Failure to meet these 
requirements will cause customer dissatisfaction and negative news coverage. In
addition to conventional functional testing, the performance of these systems 
must be verified through load testing or benchmarking to ensure quality 
service. Load testing examines the behavior of a system by simulating hundreds 
or thousands of users performing tasks at the same time. Benchmarking evaluates
a system's performance, making it possible to optimize system configurations or
to compare the system with similar systems in the domain.
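The core idea of load testing described above, many virtual users exercising a system concurrently while latencies are recorded, can be sketched in a few lines of Python. This is an illustrative toy, not a workshop tool: the function names are invented, and the "request" is a placeholder sleep standing in for a call to a real system under test.

```python
import threading
import time


def simulate_user(user_id, results, n_requests=3):
    """One virtual user issuing a fixed number of requests.

    The short sleep is a placeholder; a real load test would
    invoke the actual service and record its response time.
    """
    latencies = []
    for _ in range(n_requests):
        start = time.perf_counter()
        time.sleep(0.001)  # placeholder for a real request
        latencies.append(time.perf_counter() - start)
    results[user_id] = latencies


def run_load_test(n_users=50, n_requests=3):
    """Spawn n_users concurrent virtual users and collect
    the per-request latencies they observe."""
    results = {}
    threads = [
        threading.Thread(target=simulate_user,
                         args=(i, results, n_requests))
        for i in range(n_users)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Flatten per-user latency lists into one sample set.
    return [lat for lats in results.values() for lat in lats]


if __name__ == "__main__":
    latencies = run_load_test()
    print(f"requests: {len(latencies)}, "
          f"mean latency: {sum(latencies) / len(latencies):.4f}s")
```

Real load generators (and the benchmarking side of the workshop's scope) add far more: realistic user think times, workload mixes derived from production traces, and scalable analysis of the resulting measurements.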

Load testing and benchmarking software systems are difficult tasks that 
require a deep understanding of the system under test and of customer behavior.
Practitioners face many challenges, such as tooling (choosing and implementing 
the testing tools), environments (software and hardware setup), and time 
(limited time to design, test, and analyze). This one-day workshop brings 
together software testing researchers, practitioners and tool developers to 
discuss the challenges and opportunities of conducting research on load testing
and benchmarking software systems.

We solicit submissions in two tracks: technical papers (maximum 4 pages) and a 
presentation track for industry or experience talks (extended abstract of at 
most 700 words). Technical papers should follow the standard ACM SIG 
proceedings format and must be submitted electronically via EasyChair. 
Extended abstracts must be submitted as abstract-only submissions via 
EasyChair. Accepted technical papers will be published in the ICPE 2017 
Proceedings. Materials from the presentation track will not be published in the
ICPE 2017 proceedings, but will be made available on the workshop website. 
Submitted papers can be research papers, position papers, case studies or 
experience reports addressing issues including but not limited to the 
following:

- Efficient and cost-effective test executions

- Rapid and scalable analysis of the measurement results

- Case studies and experience reports on load testing and benchmarking

- Load testing and benchmarking on emerging systems (e.g., adaptive/autonomic 
  systems, big data batch and stream processing systems, and cloud services)

- Load testing and benchmarking in the context of agile software development 
  processes

- Using performance models to support load testing and benchmarking

- Building and maintaining load testing and benchmarking as a service

- Efficient test data management for load testing and benchmarking



