[hpc-announce] CFP: 9th Workshop on Scientific Cloud Computing (ScienceCloud)@HPDC 2018

Dmitry Duplyakin dmitry.duplyakin at gmail.com
Thu Feb 22 10:24:05 CST 2018


ScienceCloud'18: 9th Workshop on Scientific Cloud Computing
https://sites.google.com/site/sciencecloudhpdc/

==============================================
                            CALL FOR PAPERS
==============================================

9th Workshop on Scientific Cloud Computing (ScienceCloud)
June 11, 2018. Tempe, Arizona, USA

Co-Located with ACM HPDC 2018 (http://www.hpdc.org/2018/)

-------------------------------------------------------------------------------
IMPORTANT DATES

Paper Submission: April 2, 2018
Acceptance Notification: May 6, 2018
Final Papers: May 11, 2018
Workshop: June 11, 2018

-------------------------------------------------------------------------------
OVERVIEW

The 9th Workshop on Scientific Cloud Computing (ScienceCloud) will provide
the scientific community a dedicated forum for discussing new research,
development, and deployment efforts in running scientific computing
workloads on cloud computing infrastructures. The ScienceCloud workshop
will focus on the use of cloud-based technologies to meet new
compute-intensive and data-intensive scientific challenges that are not
well served by current supercomputers, grids, and HPC clusters.
The workshop will aim to address questions such as: What architectural
changes to current cloud frameworks (hardware, operating systems,
networking, and/or programming models) are needed to support science?
Dynamic information derived from remote instruments, coupled simulations,
and sensor ensembles that stream data for real-time analysis is an
important emerging element of scientific and cyber-physical engineering
systems. How can cloud technologies enable and adapt to these new
scientific approaches that must deal with dynamism? How are scientists
using clouds? Are there scientific HPC/HTC/MTC workloads that are suitable
candidates to take advantage of emerging cloud computing resources with
high efficiency? Commercial public clouds provide easy access to cloud
infrastructure for scientists. What are the gaps in commercial cloud
offerings, and how can they be adapted to run existing and novel eScience
applications? What benefits does adopting the cloud model offer over
clusters, grids, or supercomputers? What factors limit the use of clouds,
and what would make them more usable and efficient?

This workshop encourages interaction and cross-pollination among those
developing applications, algorithms, software, hardware, and networking,
with an emphasis on scientific computing for such cloud platforms.
We believe the workshop will be an excellent venue for the community to
assess the current state, determine future goals, and define
architectures and services for future science clouds.

-------------------------------------------------------------------------------
WORKSHOP SCOPE

We invite the submission of original work related to the topics below.
Papers can be either short (4-page) position papers or long (8-page)
research papers.

Topics of interest include (in the context of Cloud Computing):

- Scientific application case studies on Cloud infrastructures
- Performance evaluation of Cloud environments and technologies
- Fault tolerance and reliability in cloud systems
- Data-intensive workloads and tools on Clouds
- Use of programming models (e.g., Spark, MapReduce) and their implementations
- Storage cloud architectures
- I/O and Big Data management in the Cloud
- Workflow and resource management in the Cloud
- Use of cloud technologies (e.g., NoSQL databases) for scientific applications
- Data streaming and dynamic applications on Clouds
- Heterogeneous resources (network, storage, compute) and edge/fog infrastructure
- Many-Task Computing in the Cloud
- Application of cloud concepts in HPC environments or vice versa
- High performance parallel file systems in virtual environments
- Virtualized high performance I/O network interconnects
- Virtualization, containers, and dynamic provisioning
- Distributed Operating Systems
- Many-core computing and accelerators (e.g. GPUs, MIC) in the Cloud
- Analysis of management complexity, cost, variability, and
  reproducibility of computations in cloud and IoT environments

-------------------------------------------------------------------------------
SUBMISSION INSTRUCTIONS

Authors are invited to submit papers presenting unpublished, original work
of no more than 8 pages of double-column text, using single-spaced
10-point type on 8.5 x 11 inch pages (including all text, figures, and
references), per the ACM 8.5 x 11 manuscript guidelines (document
templates can be found at
http://www.acm.org/sigs/publications/proceedings-templates).
Submission implies the willingness of at least one of the authors
to register and present the paper.

Papers conforming to the above guidelines can be submitted through the
workshop's submission system:
https://easychair.org/conferences/?conf=sciencecloud2018.

-------------------------------------------------------------------------------
GENERAL CHAIRS

- Alexandru Costan, Inria/IRISA, France
- Bogdan Nicolae, Argonne National Laboratory, USA
- Dmitry Duplyakin, University of Utah, USA

-------------------------------------------------------------------------------
PROGRAM COMMITTEE

- Roy Campbell, University of Illinois at Urbana-Champaign
- Simon Caton, National College of Ireland
- Kyle Chard, Argonne National Lab
- Ryan Chard, Victoria University of Wellington
- Eric Eide, University of Utah
- Chathura Herath, Indiana University Bloomington
- Daniel S. Katz, University of Illinois Urbana-Champaign
- Ning Liu, IBM
- Pedro Paulo Silva, INRIA
- Beth Plale, Indiana University Bloomington
- Matei Ripeanu, The University of British Columbia
- Josh Simons, VMware
- Teng Wang, Florida State University
- Zhao Zhang, The University of Texas at Austin
- Dongfang Zhao, University of Nevada