[hpc-announce] Call for Contributions for the Data Challenge: The 14th ACM/SPEC International Conference on Performance Engineering (ICPE 2023)

Valeria Cardellini cardellini at ing.uniroma2.it
Sun Nov 6 03:20:35 CST 2022

            Call for Contributions for the Data Challenge
                            ICPE 2023
    14th ACM/SPEC International Conference on Performance Engineering

                      April 15 - 19, 2023  
                      Coimbra, Portugal

            Web: https://icpe2023.spec.org
            Twitter: https://twitter.com/ICPEconf
            Submission: https://easychair.org/conferences/?conf=icpe2023 


Data challenge submission: Jan 15, 2023 (AoE)
Notification to the authors: Feb 24, 2023 (AoE)


Data is the foundation of many important decision-making processes in 
performance engineering tasks of modern systems. Data can tell us about 
the past and present of a system's performance, helping us predict 
future performance or assess the quality of our systems. ICPE 2023 
hosts the second installment of the data challenge track.

In this track, we provide a novel performance dataset of open source 
Java systems, collected by Traini et al. and recently published in the 
Empirical Software Engineering journal. Participants are invited to formulate 
new research questions about the dataset and study them. The challenge 
is open-ended: participants may choose the research questions they find most 
interesting. The proposed approaches and/or tools and their findings are 
described in short papers and presented at the main conference.

How to participate in the challenge:
- Read the data description
- Think of something cool to do with the data. This can be anything you 
  want, including a visualization, an analysis, an approach, or a tool
- Implement your idea, evaluate it, and write down your idea and the 
  results in a short paper 

For more information, including the submission guidelines, visit the 
conference website: https://icpe2023.spec.org

Data description
This year, the challenge dataset is provided by Traini et al., published 
alongside their recent study "Towards effective assessment of steady state 
performance in Java software: Are we there yet?"

The dataset (https://github.com/SEALABQualityGroup/icpe-data-challenge-jmh) 
contains a comprehensive set of performance measurements of 586 microbenchmarks 
from 30 popular Java open source projects (e.g., RxJava, Log4J2, Apache Hive) 
spanning various project domains (e.g., application servers, libraries, 
databases). Microbenchmarks are frequently employed by practitioners to test 
and ensure the adequate performance of their systems. Microbenchmark measurements 
help open source maintainers test performance before landing new system features, 
and identify performance regressions and optimization opportunities. Each 
benchmark was carefully executed using the Java Microbenchmark Harness (JMH) 
framework in a controlled environment to reduce measurement noise. For each 
benchmark, the results contain 3,000 measurement batches (JMH iterations), 
each with a minimum execution time of 100 ms, repeated across 10 runs. This 
amounts to more than 9 billion benchmark invocations for the entire dataset, 
an experiment that lasted ~93 days.
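As a rough sanity check on the dataset's scale, the figures quoted above 
can be combined directly (a sketch using only the numbers in this call; the 
9-billion-invocation total also counts the many invocations packed into each 
100 ms batch, so only a lower bound on measurement time is computed here):

```python
# Back-of-the-envelope scale check using only the figures quoted above.
benchmarks = 586      # microbenchmarks in the dataset
runs = 10             # repeated runs per benchmark
iterations = 3000     # measurement batches (JMH iterations) per run
min_batch_s = 0.100   # minimum execution time per batch, in seconds

batches = benchmarks * runs * iterations
min_total_days = batches * min_batch_s / 86_400  # 86,400 seconds per day

print(f"{batches:,} measurement batches")               # 17,580,000
print(f">= {min_total_days:.1f} days of measurement")   # >= 20.3 days
```

The gap between this ~20-day lower bound and the reported ~93 days of 
wall-clock time is presumably overhead: JVM startup, batches running longer 
than the 100 ms minimum, and other per-run setup.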

General Chairs
    - Marco Vieira, University of Coimbra, Portugal
    - Valeria Cardellini, University of Rome Tor Vergata, Italy

Data Challenge Track Chairs
    - Diego Costa, The University of Quebec in Montreal, Canada
    - Michele Tucci, Charles University, Czech Republic

Publicity & Social-Media Chairs
    - Naghmeh Ivaki, University of Coimbra, Portugal    
    - Gabriele Russo Russo, University of Rome Tor Vergata, Italy

Finance Chair
    - Nuno Laranjeiro, University of Coimbra, Portugal

Publications Chairs
    - Daniel Sadoc Menasche, Federal University of Rio de Janeiro, Brazil
    - Andrea Marin, Ca' Foscari University of Venice, Italy

Web Chair
    - José D’Abruzzo Pereira, University of Coimbra, Portugal
