[hpc-announce] CFP 2nd Practical Reproducible Evaluation of Computer Systems Workshop

Lofstead, Gerald F II gflofst at sandia.gov
Wed Mar 27 07:49:03 CDT 2019

2nd International Workshop on Practical Reproducible Evaluation of Computer Systems

The P-RECS’19 workshop will be held as a full-day meeting at ACM HPDC 2019 in Phoenix, Arizona, USA on June 24th, 2019. This year, HPDC runs under the ACM Federated Computing Research Conference. This large federated conference will assemble 11 affiliated conferences and will provide excellent opportunities for interdisciplinary networking and learning.

The P-RECS workshop focuses on practical, actionable aspects of reproducibility in broad areas of computational science and data exploration, with special emphasis on issues where community collaboration is essential for adopting novel methodologies, techniques, and frameworks that address the challenges we face today. The workshop brings together researchers and experts to share experiences and advance the state of the art in the reproducible evaluation of computer systems, featuring contributed papers and invited talks.

We welcome submissions on topics including, but not limited to:

    Experiment dependency management.
    Software citation and persistence.
    Data versioning and preservation.
    Provenance of data-intensive experiments.
    Tools and techniques for incorporating provenance into publications.
    Automated experiment execution and validation.
    Experiment portability for code, performance, and related metrics.
    Experiment discoverability for re-use.
    Cost-benefit analysis frameworks for reproducibility.
    Usability and adaptability of reproducibility frameworks into already-established domain-specific tools.
    Long-term artifact archiving for future reproducibility.
    Frameworks for sociological constructs to incentivize paradigm shifts.
    Policies around publication of articles/software.
    Blinding and selecting artifacts for review while maintaining history.
    Reproducibility-aware computational infrastructure.


Submit (single-blind) via EasyChair.


We invite two categories of submissions:

    Position papers. This category is for papers whose goal is to propose solutions (or scope the work that needs to be done) to address some of the issues outlined above. We hope that a research agenda comes out of this and that we can create a community that meets yearly to report on our status in addressing these problems.

    Experience papers. This category consists of papers reporting on the authors’ experience automating one or more experimentation pipelines. The committee will look for submissions that answer questions such as: What worked? What aspects of experiment automation and validation are hard in your domain? What can be done to improve the tooling for your domain? As part of the submission, authors must provide a URL to the automation service they use (e.g., TravisCI, GitLab CI, CircleCI, Jenkins) so reviewers can verify that one or more automated pipelines are associated with the submission.


Authors are invited to submit manuscripts in English not exceeding 5 pages of content. The 5-page limit includes figures, tables, and appendices, but does not include references, for which there is no page limit. Submissions must use the ACM Master Template (please use the sigconf format with default options).


The proceedings will be archived in both the ACM Digital Library and IEEE Xplore through SIGHPC. In addition, pre-print versions of the accepted articles will be published on the workshop website (as allowed by ACM’s publishing policy).

The following tools can be used to automate your experiments (not an exhaustive list): CK, CWL, Popper, ReproZip, Sciunit, Sumatra.

Important Dates

    Submissions due: April 9, 2019 (AoE)
    Acceptance notification: April 30, 2019
    Camera-ready paper submission: May 9, 2019
    Workshop: June 24, 2019


Organizers

    Ivo Jimenez, UC Santa Cruz
    Carlos Maltzahn, UC Santa Cruz
    Jay Lofstead, Sandia National Laboratories

Program Committee

    Jay Billings, Oak Ridge National Laboratory
    Ronald Boisvert, NIST
    Bruce R. Childers, University of Pittsburgh
    Fernando Chirigati, New York University
    Neil Chue Hong, Software Sustainability Institute, EPCC, University of Edinburgh
    Robert Clay, Sandia National Labs
    Michael Crusoe, Common Workflow Language
    Dmitry Duplyakin, University of Utah
    Torsten Hoefler, ETH Zurich
    Fatma Imamoglu, University of California, Berkeley
    Daniel S. Katz, University of Illinois Urbana-Champaign
    Arnaud Legrand, CNRS / Inria / University of Grenoble
    Tanu Malik, DePaul University
    Robert Ricci, University of Utah
    Victoria Stodden, University of Illinois at Urbana-Champaign
    Violet Syrotiuk, Arizona State University
    Michela Taufer, University of Tennessee Knoxville


Please address workshop questions to ivo at cs.ucsc.edu.
