From ketan at ci.uchicago.edu Fri May 1 21:23:02 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Fri, 1 May 2015 21:23:02 -0500 (CDT) Subject: [Swift-commit] r8445 - www/Swift-T Message-ID: <20150502022302.A0BFB1782B1@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-01 21:23:01 -0500 (Fri, 01 May 2015) New Revision: 8445 Modified: www/Swift-T/turbine-sites.html Log: Edison example Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-04-30 17:14:36 UTC (rev 8444) +++ www/Swift-T/turbine-sites.html 2015-05-02 02:23:01 UTC (rev 8445) @@ -2353,7 +2353,7 @@

Edison is a Cray XC30 system at NERSC.

Install setup

-

Load appropriate modules:

+

Load (and unload) appropriate modules:

module unload PrgEnv-intel darshan cray-shmem
@@ -2370,7 +2370,7 @@
 
cd $SCRATCH/exm-0.8.0/c-utils
-./configure --enable-shared --prefix=/scratch2/scratchdirs/ketan/exm-install/c-utils
+./configure --enable-shared --prefix=$SCRATCH/exm-install/c-utils
 make && make install

Install adlb:

@@ -2386,8 +2386,8 @@
cd $SCRATCH/exm-0.8.0/turbine
-./configure --with-adlb=/scratch2/scratchdirs/ketan/exm-install/lb --with-c-utils=/scratch2/scratchdirs/ketan/exm-install/c-utils \
---prefix=/scratch2/scratchdirs/ketan/exm-install/turbine --with-tcl=/global/homes/k/ketan/tcl-install --with-tcl-version=8.6 \
+./configure --with-adlb=$SCRATCH/exm-install/lb --with-c-utils=$SCRATCH/exm-install/c-utils \
+--prefix=$SCRATCH/exm-install/turbine --with-tcl=/global/homes/k/ketan/tcl-install --with-tcl-version=8.6 \
 --with-mpi=/opt/cray/mpt/default/gni/mpich2-gnu/49
 make && make install
@@ -2395,7 +2395,7 @@
cd $SCRATCH/exm-0.8.0/stc
-ant install -Ddist.dir=/scratch2/scratchdirs/ketan/exm-install/stc -Dturbine.home=/scratch2/scratchdirs/ketan/exm-install/turbine
+ant install -Ddist.dir=$SCRATCH/exm-install/stc -Dturbine.home=$SCRATCH/exm-install/turbine
@@ -2406,6 +2406,12 @@
export PATH=$PATH:$SCRATCH/exm-install/stc/bin:$SCRATCH/exm-install/turbine/bin:$SCRATCH/exm-install/turbine/scripts/submit/cray
 source ~/.bash.ext
+

Note that once Swift/T is installed as a module, the above steps will no longer be needed; the only remaining step will be to load the module:

+
+
+
module load swift-t
+module load swift-k
+

A simple script

@@ -2423,16 +2429,14 @@
 printf("Hello world!");
 }
-

Compile the above script using stc:

+

Compile and run the above script using swift-t:

-
stc hello.swift
+
swift-t -m "cray" hello.swift
-

A TCL (.tic) file will be generated on successful compilation. Run the generated TCL file using turbine-cray submit script:

-
-
-
turbine-cray-run.zsh -n 2 hello.tic
-
+

An intermediate TCL (.tic) file will be generated on successful compilation.
+The swift-t command builds a job specification script and submits it to the
+scheduler:

Output from the above command will be similar to the following:

@@ -2469,9 +2473,69 @@
 Application 12141240 resources: utime ~0s, stime ~0s, Rss ~118364, inblocks ~2287, outblocks ~50
+
+

A second example

+

The following example joins multiple files using the Unix cat utility:

+
+
+
import files;
+
+app (file out) cat (file inputs[]) {
+  "/bin/cat" inputs @stdout=out
+}
+
+file joined <"joined.txt"> = cat(glob("*.txt"));
+
+

Save the above script as catsn.swift.
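Outside of Swift/T, the cat app's behavior is just /bin/cat concatenating its input files into the mapped output file. A minimal local sketch (the /tmp path is illustrative, not part of the tutorial):

```shell
# Local illustration of what the cat() app does: /bin/cat joins its inputs.
mkdir -p /tmp/catsn-demo && cd /tmp/catsn-demo
echo "contents of a.txt" > a.txt
echo "contents of b.txt" > b.txt
/bin/cat a.txt b.txt > joined.txt   # what cat() writes to the mapped output file
cat joined.txt
```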

+

Prepare input files as:

+
+
+
echo "contents of a.txt">a.txt
+echo "contents of b.txt">b.txt
+echo "contents of c.txt">c.txt
+
+

Prepare a prerun script as:

+
+
+
#!/bin/bash
+
+cp *.txt $TURBINE_OUTPUT/
+
+

Make sure to set executable permissions for the prerun.sh script:

+
+
+
chmod 755 prerun.sh
+
+

The prerun script makes sure that the input txt files are available in the
+working directory ($TURBINE_OUTPUT) at runtime.
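What the prerun hook accomplishes can be rehearsed locally; the /tmp paths and the staged directory below are illustrative stand-ins for the scheduler-provided $TURBINE_OUTPUT:

```shell
# Sketch: stage the input files into a stand-in for $TURBINE_OUTPUT,
# as prerun.sh does before the Turbine job starts.
mkdir -p /tmp/turbine-demo && cd /tmp/turbine-demo
echo "contents of a.txt" > a.txt
echo "contents of b.txt" > b.txt
echo "contents of c.txt" > c.txt
cat > prerun.sh <<'EOF'
#!/bin/bash
cp *.txt $TURBINE_OUTPUT/
EOF
chmod 755 prerun.sh
export TURBINE_OUTPUT=/tmp/turbine-demo/staged   # set by Turbine in a real run
mkdir -p "$TURBINE_OUTPUT"
./prerun.sh
ls "$TURBINE_OUTPUT"   # the files cat() will see at runtime
```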

+

Run the script as:

+
+
+
swift-t -t i:./prerun.sh -m "cray" catsn.swift
+
+

On successful compilation and job submission, the following output will be produced:

+
+
+
TURBINE_OUTPUT=/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14
+`./swift-t-catsn.Tgs.tic' -> `/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14/swift-t-catsn.Tgs.tic'
+SCRIPT=./swift-t-catsn.Tgs.tic
+PPN=1
+TURBINE_OUTPUT=/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14
+WALLTIME=00:15:00
+PROCS=2
+NODES=2
+wrote: /global/homes/k/ketan/turbine-output/2015/05/01/09/44/14/turbine-cray.sh
+JOB_ID=2821464.edique02
+
+

Inspect the output file joined.txt produced in the $TURBINE_OUTPUT directory:

+
+
+
cat $TURBINE_OUTPUT/joined.txt
+
+

Cloud

@@ -2596,7 +2660,7 @@ From ketan at ci.uchicago.edu Fri May 1 21:37:28 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Fri, 1 May 2015 21:37:28 -0500 (CDT) Subject: [Swift-commit] r8446 - www/Swift-T Message-ID: <20150502023728.3D8931782B1@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-01 21:37:28 -0500 (Fri, 01 May 2015) New Revision: 8446 Modified: www/Swift-T/turbine-sites.html Log: Edison example Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-02 02:23:01 UTC (rev 8445) +++ www/Swift-T/turbine-sites.html 2015-05-02 02:37:28 UTC (rev 8446) @@ -2379,7 +2379,7 @@
cd $SCRATCH/exm-0.8.0/lb
 CFLAGS=-I/opt/cray/mpt/default/gni/mpich2-gnu/49/include
 LDFLAGS="-L/opt/cray/mpt/default/gni/mpich2-gnu/49/lib -lmpich"
-./configure CC=gcc --with-c-utils=/scratch2/scratchdirs/ketan/exm-install/c-utils --prefix=/scratch2/scratchdirs/ketan/exm-install/lb --enable-mpi-2
+./configure CC=gcc --with-c-utils=$SCRATCH/exm-install/c-utils --prefix=$SCRATCH/exm-install/lb --enable-mpi-2
 make && make install

Install turbine:

@@ -2660,7 +2660,7 @@ From ketan at ci.uchicago.edu Mon May 4 11:00:51 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 11:00:51 -0500 (CDT) Subject: [Swift-commit] r8447 - www/Swift-T Message-ID: <20150504160051.184AD9D597@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 11:00:50 -0500 (Mon, 04 May 2015) New Revision: 8447 Modified: www/Swift-T/turbine-sites.html Log: improve documentation Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-02 02:37:28 UTC (rev 8446) +++ www/Swift-T/turbine-sites.html 2015-05-04 16:00:50 UTC (rev 8447) @@ -2352,7 +2352,7 @@

Edison

Edison is a Cray XC30 system at NERSC.

-

Install setup

+

Build Procedure

Load (and unload) appropriate modules:

@@ -2415,7 +2415,7 @@

A simple script

-

Now let us try to compile and run a simple Swift/T script over Edison Compute nodes. Following is a simple "Hello World!" script:

+

To compile and run a simple Swift/T script on Edison compute nodes, start with the following "Hello World!" script:

/**
@@ -2434,7 +2434,15 @@
 
swift-t -m "cray" hello.swift
-

An intermediate TCL (.tic) file will be generated on successful compilation. +

+
+
Note
+
The -m flag determines the machine type: "cray", "pbs", "cobalt", etc.
+
+

A Turbine Intermediate Code (.tic) file will be generated on successful compilation. The swift-t command builds a job specification script and submits it to the scheduler:

Output from the above command will be similar to the following:

@@ -2454,7 +2462,7 @@

Inspect the results with:

-
cat /global/homes/k/ketan/turbine-output/2015/04/30/09/09/53/output.txt.2816478.edique02.out
+
cat $TURBINE_OUTPUT/output.txt.2816478.edique02.out

The following will be the contents:

@@ -2479,12 +2487,15 @@
import files;
+import string;
 
 app (file out) cat (file inputs[]) {
   "/bin/cat" inputs @stdout=out
 }
 
-file joined <"joined.txt"> = cat(glob("*.txt"));
+foreach i in [0:9]{
+  file joined<sprintf("joined%i.txt", i)> = cat(glob("*.txt"));
+}

Save the above script as catsn.swift.

Prepare input files as:

@@ -2494,24 +2505,15 @@
 echo "contents of b.txt">b.txt
 echo "contents of c.txt">c.txt
-

Prepare a prerun script as:

+

Set TURBINE_OUTPUT to current directory:

-
#!/bin/bash
-
-cp *.txt $TURBINE_OUTPUT/
+
export TURBINE_OUTPUT=$PWD
-

Make sure to set executable permissions for the prerun.sh script:

-
-
-
chmod 755 prerun.sh
-
-

The prerun script makes sure that the input txt files are available in the
-working directory ($TURBINE_OUTPUT) at runtime.

Run the script as:

-
swift-t -t i:./prerun.sh -m "cray" catsn.swift
+
swift-t -m "cray" catsn.swift

On successful compilation and job submission, the following output will be produced:

@@ -2527,10 +2529,10 @@
 wrote: /global/homes/k/ketan/turbine-output/2015/05/01/09/44/14/turbine-cray.sh
 JOB_ID=2821464.edique02
-

Inspect the output file joined.txt produced in the $TURBINE_OUTPUT directory:

+

Inspect one of the output files joined<n>.txt produced in the $TURBINE_OUTPUT directory:

-
cat $TURBINE_OUTPUT/joined.txt
+
cat $TURBINE_OUTPUT/joined4.txt
@@ -2660,7 +2662,7 @@ From ketan at ci.uchicago.edu Mon May 4 11:14:09 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 11:14:09 -0500 (CDT) Subject: [Swift-commit] r8448 - www/Swift-T Message-ID: <20150504161409.11D959D597@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 11:14:08 -0500 (Mon, 04 May 2015) New Revision: 8448 Modified: www/Swift-T/turbine-sites.html Log: minor Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-04 16:00:50 UTC (rev 8447) +++ www/Swift-T/turbine-sites.html 2015-05-04 16:14:08 UTC (rev 8448) @@ -2444,7 +2444,7 @@

A Turbine Intermediate Code (.tic) file will be generated on successful compilation.
The swift-t command builds a job specification script and submits it to the
-scheduler:

+scheduler.

Output from the above command will be similar to the following:

@@ -2518,16 +2518,16 @@

On successful compilation and job submission, the following output will be produced:

-
TURBINE_OUTPUT=/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14
-`./swift-t-catsn.Tgs.tic' -> `/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14/swift-t-catsn.Tgs.tic'
-SCRIPT=./swift-t-catsn.Tgs.tic
+
TURBINE_OUTPUT=/scratch2/scratchdirs/ketan/ATPESC_2014-08-14/swift-t/examples/catsn/turbine.work
+`./swift-t-catsn.hzS.tic' -> `/scratch2/scratchdirs/ketan/ATPESC_2014-08-14/swift-t/examples/catsn/turbine.work/swift-t-catsn.hzS.tic'
+SCRIPT=./swift-t-catsn.hzS.tic
 PPN=1
-TURBINE_OUTPUT=/global/homes/k/ketan/turbine-output/2015/05/01/09/44/14
+TURBINE_OUTPUT=/scratch2/scratchdirs/ketan/ATPESC_2014-08-14/swift-t/examples/catsn/turbine.work
 WALLTIME=00:15:00
 PROCS=2
 NODES=2
-wrote: /global/homes/k/ketan/turbine-output/2015/05/01/09/44/14/turbine-cray.sh
-JOB_ID=2821464.edique02
+wrote: /scratch2/scratchdirs/ketan/ATPESC_2014-08-14/swift-t/examples/catsn/turbine.work/turbine-cray.sh
+JOB_ID=2835290.edique02

Inspect one of the output files joined<n>.txt produced in the $TURBINE_OUTPUT directory:

@@ -2662,7 +2662,7 @@ From ketan at ci.uchicago.edu Mon May 4 11:33:46 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 11:33:46 -0500 (CDT) Subject: [Swift-commit] r8449 - www/Swift-T Message-ID: <20150504163346.20FFD9D5A2@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 11:33:46 -0500 (Mon, 04 May 2015) New Revision: 8449 Modified: www/Swift-T/turbine-sites.html Log: minor Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-04 16:14:08 UTC (rev 8448) +++ www/Swift-T/turbine-sites.html 2015-05-04 16:33:46 UTC (rev 8449) @@ -2483,7 +2483,8 @@

A second example

-

The following example joins multiple files using the Unix cat utility:

+

The following example joins multiple files (n times in parallel) using the Unix
+cat utility:

import files;
@@ -2515,7 +2516,7 @@
 
swift-t -m "cray" catsn.swift
-

On successful compilation and job submission, the following output will be produced:

+

On successful compilation and job submission, output similar to the following will be produced:

TURBINE_OUTPUT=/scratch2/scratchdirs/ketan/ATPESC_2014-08-14/swift-t/examples/catsn/turbine.work
@@ -2662,7 +2663,7 @@
 
 



From ketan at ci.uchicago.edu  Mon May  4 13:37:38 2015
From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu)
Date: Mon,  4 May 2015 13:37:38 -0500 (CDT)
Subject: [Swift-commit] r8450 - www/Swift-T
Message-ID: <20150504183738.53BD99D5A2@svn.ci.uchicago.edu>

Author: ketan
Date: 2015-05-04 13:37:38 -0500 (Mon, 04 May 2015)
New Revision: 8450

Modified:
   www/Swift-T/turbine-sites.html
Log:
simplify example

Modified: www/Swift-T/turbine-sites.html
===================================================================
--- www/Swift-T/turbine-sites.html	2015-05-04 16:33:46 UTC (rev 8449)
+++ www/Swift-T/turbine-sites.html	2015-05-04 18:37:38 UTC (rev 8450)
@@ -2490,21 +2490,19 @@
 
import files;
 import string;
 
-app (file out) cat (file inputs[]) {
-  "/bin/cat" inputs @stdout=out
+app (file out) cat (file input) {
+  "/bin/cat" input @stdout=out
 }
 
 foreach i in [0:9]{
-  file joined<sprintf("joined%i.txt", i)> = cat(glob("*.txt"));
+  file joined<sprintf("joined%i.txt", i)> = cat(input_file("data.txt"));
 }

Save the above script as catsn.swift.

-

Prepare input files as:

+

Prepare input file as:

-
echo "contents of a.txt">a.txt
-echo "contents of b.txt">b.txt
-echo "contents of c.txt">c.txt
+
echo "contents of data.txt">data.txt

Set TURBINE_OUTPUT to current directory:

@@ -2663,7 +2661,7 @@ From ketan at ci.uchicago.edu Mon May 4 13:42:40 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 13:42:40 -0500 (CDT) Subject: [Swift-commit] r8451 - www/Swift-T Message-ID: <20150504184240.E1FCC9D5A2@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 13:42:40 -0500 (Mon, 04 May 2015) New Revision: 8451 Modified: www/Swift-T/turbine-sites.html Log: add git Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-04 18:37:38 UTC (rev 8450) +++ www/Swift-T/turbine-sites.html 2015-05-04 18:42:40 UTC (rev 8451) @@ -2359,24 +2359,24 @@
module unload PrgEnv-intel darshan cray-shmem
 module load PrgEnv-gnu java
-

Download the latest exm code:

+

Clone the latest exm code:

cd $SCRATCH
-wget http://swift-lang.org/Swift-T/downloads/exm-0.8.0.tar.gz
-tar zxf exm-0.8.0.tar.gz
+git clone https://github.com/swift-lang/swift-t.git
+cd swift-t

Install c-utils:

-
cd $SCRATCH/exm-0.8.0/c-utils
+
cd $SCRATCH/swift-t/c-utils
 ./configure --enable-shared --prefix=$SCRATCH/exm-install/c-utils
 make && make install

Install adlb:

-
cd $SCRATCH/exm-0.8.0/lb
+
cd $SCRATCH/swift-t/lb
 CFLAGS=-I/opt/cray/mpt/default/gni/mpich2-gnu/49/include
 LDFLAGS="-L/opt/cray/mpt/default/gni/mpich2-gnu/49/lib -lmpich"
 ./configure CC=gcc --with-c-utils=$SCRATCH/exm-install/c-utils --prefix=$SCRATCH/exm-install/lb --enable-mpi-2
@@ -2385,7 +2385,7 @@
 

Install turbine:

-
cd $SCRATCH/exm-0.8.0/turbine
+
cd $SCRATCH/swift-t/turbine
 ./configure --with-adlb=$SCRATCH/exm-install/lb --with-c-utils=$SCRATCH/exm-install/c-utils \
 --prefix=$SCRATCH/exm-install/turbine --with-tcl=/global/homes/k/ketan/tcl-install --with-tcl-version=8.6 \
 --with-mpi=/opt/cray/mpt/default/gni/mpich2-gnu/49
@@ -2394,7 +2394,7 @@
 

Install stc:

-
cd $SCRATCH/exm-0.8.0/stc
+
cd $SCRATCH/swift-t/stc
 ant install -Ddist.dir=$SCRATCH/exm-install/stc -Dturbine.home=$SCRATCH/exm-install/turbine
@@ -2661,7 +2661,7 @@ From ketan at ci.uchicago.edu Mon May 4 13:52:20 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 13:52:20 -0500 (CDT) Subject: [Swift-commit] r8452 - www/Swift-T Message-ID: <20150504185220.2C7E79D5A2@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 13:52:20 -0500 (Mon, 04 May 2015) New Revision: 8452 Modified: www/Swift-T/turbine-sites.html Log: added public location Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-04 18:42:40 UTC (rev 8451) +++ www/Swift-T/turbine-sites.html 2015-05-04 18:52:20 UTC (rev 8452) @@ -2350,7 +2350,16 @@

Edison

+
+

Public Installation

+

A public installation is available at: /scratch2/scratchdirs/ketan/exm-install/stc/bin/swift-t

+

Run with, e.g.:

+
+
+
swift-t -m cray -n 4 program.swift
+

Edison is a Cray XC30 system at NERSC.

+

Build Procedure

Load (and unload) appropriate modules:

@@ -2661,7 +2670,7 @@ From ketan at ci.uchicago.edu Mon May 4 13:53:42 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Mon, 4 May 2015 13:53:42 -0500 (CDT) Subject: [Swift-commit] r8453 - www/Swift-T Message-ID: <20150504185342.4E0499D5A2@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-04 13:53:42 -0500 (Mon, 04 May 2015) New Revision: 8453 Modified: www/Swift-T/turbine-sites.html Log: minor Modified: www/Swift-T/turbine-sites.html =================================================================== --- www/Swift-T/turbine-sites.html 2015-05-04 18:52:20 UTC (rev 8452) +++ www/Swift-T/turbine-sites.html 2015-05-04 18:53:42 UTC (rev 8453) @@ -2350,6 +2350,7 @@

Edison

+

Edison is a Cray XC30 system at NERSC.

Public Installation

A public installation is available at: /scratch2/scratchdirs/ketan/exm-install/stc/bin/swift-t

@@ -2358,7 +2359,6 @@
swift-t -m cray -n 4 program.swift
-

Edison is a Cray XC30 system at NERSC.

Build Procedure

@@ -2670,7 +2670,7 @@ From ketan at ci.uchicago.edu Tue May 5 11:12:11 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Tue, 5 May 2015 11:12:11 -0500 (CDT) Subject: [Swift-commit] r8454 - in SwiftApps: . cetustukey subjobs Message-ID: <20150505161211.602D21782B1@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-05 11:12:11 -0500 (Tue, 05 May 2015) New Revision: 8454 Added: SwiftApps/cetustukey/ SwiftApps/cetustukey/README.txt SwiftApps/cetustukey/avg SwiftApps/cetustukey/cetustukey.png SwiftApps/cetustukey/cetustukeytut.html SwiftApps/cetustukey/ct.conf SwiftApps/cetustukey/docbuild SwiftApps/cetustukey/local.conf SwiftApps/cetustukey/pi SwiftApps/cetustukey/pi.c SwiftApps/cetustukey/pi.swift SwiftApps/cetustukey/runct SwiftApps/cetustukey/runlocal SwiftApps/cetustukey/runpi Modified: SwiftApps/subjobs/bg.sh Log: cetus tukey Added: SwiftApps/cetustukey/README.txt =================================================================== --- SwiftApps/cetustukey/README.txt (rev 0) +++ SwiftApps/cetustukey/README.txt 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,89 @@ +Running Swift on Cetus and Tukey +================================= + +Introduction +------------ + +This tutorial will walk you through running Swift applications over ALCF Cetus and Tukey systems. The motivation behind this tutorial is to enable use-cases on ALCF systems which need systems such as Mira and Cetus for the analysis and processing phase of their application and systems such as Tukey for postprocessing and visualization phase. The tutorial describes a technique to run both these phases across the two (or more) systems in a seamless manner (that is, without user intervention for authentication purposes). + +Copy Tarball +------------- + +Copy the tarball to your home area on Cetus: + +---- +cp ~ketan/public/cetustukeytut.tgz $HOME +---- + +followed by: + +---- +tar -zxf cetustukeytut.tgz +---- + +Create Tunnel +------------- + +Open a new terminal and log into Cetus. 
Create an ssh tunnel between Cetus and Tukey: + +---- +cetus$ ssh -L 52000:127.0.0.1:52000 tukey.alcf.anl.gov -N -f +---- + +Start Service +-------------- + +Open a new terminal and log into Tukey. Start the coaster service on Tukey: + +---- +tukey$ /home/ketan/swift-k/dist/swift-svn/bin/coaster-service -p 52000 -nosec +---- + +Systems Interaction +-------------------- +The following figure gives an overview of systems interactions in the setup: + +image:cetustukey.png[] + +Swift Application +------------------ +The Swift application included in this package computes the value of pi using montecarlo method. An MPI based C code computes the value of pi. This code is invoked multiple times in parallel. Each invocation writes the value it computed in an output file. A next application, implemented as a shell script that computes the average of the values found in the collection of output files generated in previous runs. The Swift source code for the applications is as below: + +[source,C] +---- +include::pi.swift[] +---- + +Configuration +-------------- + +The run is controlled by a configuration file called ct.conf which is as follows: + +[source,json] +---- +include::ct.conf[] +---- + +Run Swift +---------- + +Run Swift on Cetus: + +---- +cetus$ cd $HOME/cetustukeytut +cetus$ ./runct +---- + +Results +------- + +On successful completion, the answer, an approximation of pi should be in the produced file +ans.txt+: + +---- +cetus$ cat ans.txt +---- + +Upcoming Features +------------------ +. Extend to other systems. +. Show use of gridftp and/or Globus.org for file transfer. 
Added: SwiftApps/cetustukey/avg =================================================================== --- SwiftApps/cetustukey/avg (rev 0) +++ SwiftApps/cetustukey/avg 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,16 @@ +#!/bin/bash + +#set -x + +ans=0.0 +count=0.00000000000000001 + +for i in $@ +do + val=$(grep 'pi =' $i | awk 'END{print $3}') + ans=$(echo "scale=20; $val + $ans" | bc) + count=$(echo "scale=20; $count + 1" | bc) +done + +echo "scale=20; $ans / $count" | bc + Property changes on: SwiftApps/cetustukey/avg ___________________________________________________________________ Added: svn:executable + * Added: SwiftApps/cetustukey/cetustukey.png =================================================================== (Binary files differ) Property changes on: SwiftApps/cetustukey/cetustukey.png ___________________________________________________________________ Added: svn:mime-type + application/octet-stream Added: SwiftApps/cetustukey/cetustukeytut.html =================================================================== --- SwiftApps/cetustukey/cetustukeytut.html (rev 0) +++ SwiftApps/cetustukey/cetustukeytut.html 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,945 @@ + + + + + +Running Swift on Cetus and Tukey + + + + + +
+
+

1. Introduction

+
+

This tutorial will walk you through running Swift applications over ALCF Cetus and Tukey systems. The motivation behind this tutorial is to enable use-cases on ALCF systems which need systems such as Mira and Cetus for the analysis and processing phase of their application and systems such as Tukey for postprocessing and visualization phase. The tutorial describes a technique to run both these phases across the two (or more) systems in a seamless manner (that is, without user intervention for authentication purposes).

+
+
+
+

2. Copy Tarball

+
+

Copy the tarball to your home area on Cetus:

+
+
+
cp ~ketan/public/cetustukeytut.tgz $HOME
+
+

followed by:

+
+
+
tar -zxf cetustukeytut.tgz
+
+
+
+
+

3. Create Tunnel

+
+

Open a new terminal and log into Cetus. Create an ssh tunnel between Cetus and Tukey:

+
+
+
cetus$ ssh -L 52000:127.0.0.1:52000 tukey.alcf.anl.gov -N -f
+
+
+
+
+

4. Start Service

+
+

Open a new terminal and log into Tukey. Start the coaster service on Tukey:

+
+
+
tukey$ /home/ketan/swift-k/dist/swift-svn/bin/coaster-service -p 52000 -nosec
+
+
+
+
+

5. Systems Interaction

+
+

The following figure gives an overview of systems interactions in the setup:

+

+[image: cetustukey.png — overview of the Cetus/Tukey systems interaction]

+
+
+
+

6. Swift Application

+
+

The Swift application included in this package computes the value of pi using the Monte Carlo method. An MPI-based C code computes the value of pi; this code is invoked multiple times in parallel, and each invocation writes the value it computed to an output file. A second application, implemented as a shell script, computes the average of the values found in the collection of output files generated by the previous runs. The Swift source code for the applications is as follows:

+
+
+
type file;
+type exe;
+
+app (file _o) compute_pi (exe _pi)
+{
+  runpi @_pi stdout=@_o;
+}
+
+app (file _o) avg_pi (exe _avg, file[] _i)
+{
+  bash @_avg @_i stdout=@_o;
+}
+
+exe avg<"avg">;
+exe pi<"pi">;
+
+file pival<"ans.txt">;
+file out[]<simple_mapper; location="outdir", prefix="pi.",suffix=".out">;
+foreach i in [0:9] {
+    out[i] = compute_pi(pi);
+}
+
+pival=avg_pi(avg, out);
+
+
+
+
+
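The averaging step performed by the avg script can be sketched with awk in place of the bc-based arithmetic it uses; the /tmp paths and sample pi values below are illustrative, not outputs of a real run:

```shell
# Average the "pi = <value>" lines across per-run output files,
# mirroring what the avg post-processing script computes.
mkdir -p /tmp/pi-demo
printf 'pi = 3.140\n' > /tmp/pi-demo/pi.0.out
printf 'pi = 3.143\n' > /tmp/pi-demo/pi.1.out
awk '/pi =/ { s += $3; n++ } END { printf "%.4f\n", s/n }' /tmp/pi-demo/pi.*.out
```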

7. Configuration

+
+

The run is controlled by a configuration file called ct.conf which is as follows:

+
+
+
sites : [cetus, tukey]
+site.cetus {
+    execution {
+        type: "coaster"
+        URL: "localhost"
+        jobManager: "local:cobalt"
+        options {
+            maxNodesPerJob: 32
+            maxJobs: 1
+            tasksPerNode: 1
+            nodeGranularity: 32
+            maxJobTime = "00:60:00"
+        }
+    }
+    filesystem {
+        type: "local"
+        URL: "localhost"
+    }
+    staging : direct
+    workDirectory: "/home/"${env.USER}"/swift.work"
+    maxParallelTasks: 30
+    initialParallelTasks: 29
+    app.runpi {
+        executable: "/home/"${env.USER}"/cetustukeytut/runpi"
+        maxWallTime: "00:04:00"
+    }
+}
+
+site.tukey {
+    execution {
+        type: "coaster-persistent"
+        URL: "http://localhost:52000"
+        jobManager: "local:cobalt"
+        options {
+            maxNodesPerJob: 8
+            maxJobs: 1
+            tasksPerNode: 2
+            nodeGranularity: 1
+            maxJobTime = "00:60:00"
+        }
+    }
+    filesystem {
+        type: "local"
+        URL: "localhost"
+    }
+    staging: direct
+    workDirectory: "/home/"${env.USER}"/swift.work"
+    maxParallelTasks: 3
+    initialParallelTasks: 2
+    app.bash {
+        executable: "/bin/bash"
+        maxWallTime: "00:04:00"
+    }
+}
+
+executionRetries: 0
+keepSiteDir: true
+providerStagingPinSwiftFiles: false
+alwaysTransferWrapperLog: true
+
+
+
+
+

8. Run Swift

+
+

Run Swift on Cetus:

+
+
+
cetus$ cd $HOME/cetustukeytut
+cetus$ ./runct
+
+
+
+
+

9. Results

+
+

On successful completion, the answer, an approximation of pi should be in the produced file ans.txt:

+
+
+
cetus$ cat ans.txt
+
+
+
+
+

10. Upcoming Features

+
+
+  1. Extend to other systems.
+  2. Show use of gridftp and/or Globus.org for file transfer.
+
+
+
+

+ + + Added: SwiftApps/cetustukey/ct.conf =================================================================== --- SwiftApps/cetustukey/ct.conf (rev 0) +++ SwiftApps/cetustukey/ct.conf 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,60 @@ +sites : [cetus, tukey] +site.cetus { + execution { + type: "coaster" + URL: "localhost" + jobManager: "local:cobalt" + options { + maxNodesPerJob: 32 + maxJobs: 1 + tasksPerNode: 1 + nodeGranularity: 32 + maxJobTime = "00:60:00" + } + } + filesystem { + type: "local" + URL: "localhost" + } + staging : direct + workDirectory: "/home/"${env.USER}"/swift.work" + maxParallelTasks: 30 + initialParallelTasks: 29 + app.runpi { + executable: "/home/"${env.USER}"/cetustukeytut/runpi" + maxWallTime: "00:04:00" + } +} + +site.tukey { + execution { + type: "coaster-persistent" + URL: "http://localhost:52000" + jobManager: "local:cobalt" + options { + maxNodesPerJob: 8 + maxJobs: 1 + tasksPerNode: 2 + nodeGranularity: 1 + maxJobTime = "00:60:00" + } + } + filesystem { + type: "local" + URL: "localhost" + } + staging: direct + workDirectory: "/home/"${env.USER}"/swift.work" + maxParallelTasks: 3 + initialParallelTasks: 2 + app.bash { + executable: "/bin/bash" + maxWallTime: "00:04:00" + } +} + +executionRetries: 0 +keepSiteDir: true +providerStagingPinSwiftFiles: false +alwaysTransferWrapperLog: true + Added: SwiftApps/cetustukey/docbuild =================================================================== --- SwiftApps/cetustukey/docbuild (rev 0) +++ SwiftApps/cetustukey/docbuild 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,3 @@ +#!/bin/bash + +asciidoc -a toc -a numbered -a toclevels=2 -a max-width=750px -a textwidth=80 -o cetustukeytut.html README.txt Property changes on: SwiftApps/cetustukey/docbuild ___________________________________________________________________ Added: svn:executable + * Added: SwiftApps/cetustukey/local.conf =================================================================== --- SwiftApps/cetustukey/local.conf (rev 0) 
+++ SwiftApps/cetustukey/local.conf 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,19 @@ +sites: [localhost] + +site.localhost { + execution { + type: "local" + URL : "localhost" + } + staging : direct + workDirectory : "/tmp/"${env.USER}"/swiftwork" + maxParallelTasks : 20 + initialParallelTasks: 20 + app.ALL { executable: "*" } +} + +lazyErrors: false +executionRetries: 0 +keepSiteDir: true +providerStagingPinSwiftFiles: false +alwaysTransferWrapperLog: true Added: SwiftApps/cetustukey/pi =================================================================== (Binary files differ) Property changes on: SwiftApps/cetustukey/pi ___________________________________________________________________ Added: svn:executable + * Added: svn:mime-type + application/octet-stream Added: SwiftApps/cetustukey/pi.c =================================================================== --- SwiftApps/cetustukey/pi.c (rev 0) +++ SwiftApps/cetustukey/pi.c 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,356 @@ +# include +# include +# include + +# include "mpi.h" + + +# define DEBUG 0 +# define CHUNKSIZE 1000 +# define RANDOM_SEED 0 + +/* + Message tags +*/ +# define NEED_NUMBERS 1 +# define RANDOM_NUMBERS 2 + +int main ( int argc, char *argv[] ); +void timestamp ( void ); + +/******************************************************************************/ + +int main ( int argc, char *argv[] ) + +/******************************************************************************/ +/* + Purpose: + + MAIN is the main program for MONTE_CARLO. + + Discussion: + + MONTE_CARLO uses Monte Carlo methods to estimate Pi. + + Generate N random points in the unit square. Count M, the number + of points that are in the quarter circle. Then PI is approximately + equal to the ratio 4 * M / N. + + It's important that each processor use DIFFERENT random numbers. + One way to ensure this is to have a single master processor + generate all the random numbers, and then divide them up. 
+ + (A second way, not explored here, is simply to ensure that each + processor uses a different seed, either chosen by a master processor, + or generated from the processor ID.) + + Licensing: + + This code is distributed under the GNU LGPL license. + + Modified: + + 26 February 2007 + + Author: + + John Burkardt + + Reference: + + William Gropp, Ewing Lusk, Anthony Skjellum, + Using MPI: Portable Parallel Programming with the + Message-Passing Interface, + Second Edition, + MIT Press, 1999, + ISBN: 0262571323. +*/ +{ + double calculatedPi; + int done; + double error; + int i; + int ierr; + int in; + int max; + MPI_Status mesgStatus; + int my_id; + int numprocs; + int out; + int point_max = 1000000; + int randServer; + int randNums[CHUNKSIZE]; + int ranks[1]; + int request; + int temp; + double tolerance; + int totalin; + int totalout; + MPI_Group worker_group; + MPI_Comm workers; + MPI_Group world_group; + double wtime; + double x; + double y; +/* + Initialize MPI. +*/ + ierr = MPI_Init ( &argc, &argv ); +/* + Get the number of processors. +*/ + ierr = MPI_Comm_size ( MPI_COMM_WORLD, &numprocs ); +/* + Get the rank of this processor. +*/ + ierr = MPI_Comm_rank ( MPI_COMM_WORLD, &my_id ); + + if ( my_id == 0 ) + { + timestamp ( ); + printf ( "\n" ); + printf ( "MONTE_CARLO - Master process:\n" ); + printf ( " C version\n" ); + printf ( " An MPI example program.\n" ); + printf ( " Estimate pi by the Monte Carlo method, using MPI.\n" ); + printf ( "\n" ); + printf ( " Compiled on %s at %s.\n", __DATE__, __TIME__ ); + printf ( "\n" ); + printf ( " The number of processes is %d.\n", numprocs ); + printf ( "\n" ); + printf ( " Points in the unit square will be tested\n" ); + printf ( " to see if they lie in the unit quarter circle.\n" ); + } + + if ( my_id == 0 ) + { + wtime = MPI_Wtime ( ); + } +/* + Pretend that the tolerance TOLERANCE is supplied externally + to the master process, which must then broadcast it to all + other processes. 
+*/ + if ( my_id == 0 ) + { + tolerance = 0.0001; + + printf ( "\n" ); + printf ( " The method will continue to improve the estimate until:\n" ); + printf ( " PI is computed to within a tolerance = %f,\n", tolerance ); + printf ( " or the number of points examined reaches %d.\n", point_max ); + } + + ierr = MPI_Bcast ( &tolerance, 1, MPI_DOUBLE_PRECISION, 0, + MPI_COMM_WORLD ); + + printf ( " Process %d is active.\n", my_id ); +/* + Start by getting the group corresponding to the world communicator. +*/ + ierr = MPI_Comm_group ( MPI_COMM_WORLD, &world_group ); +/* + Put SERVER on the list of processes to exclude, and create the new + worker group. +*/ + randServer = numprocs-1; + ranks[0] = randServer; + ierr = MPI_Group_excl ( world_group, 1, ranks, &worker_group ); +/* + Use the worker group to create the new worker communicator. +*/ + ierr = MPI_Comm_create ( MPI_COMM_WORLD, worker_group, &workers ); +/* + Since we only needed the worker group to create the worker + communicator, we can free the worker group now. +*/ + ierr = MPI_Group_free ( &worker_group ); +/* + Here is where the computation is carried out. +*/ + +/* + I am the rand server. +*/ + if ( my_id == randServer ) + { +# if RANDOM_SEED + struct timeval time; + gettimeofday( &time, 0 ); +/* + Initialize the random number generator +*/ + srandom ( (int)(time.tv_usec*1000000+time.tv_sec) ); +# endif + do + { + ierr = MPI_Recv ( &request, 1, MPI_INT, MPI_ANY_SOURCE, NEED_NUMBERS, + MPI_COMM_WORLD, &mesgStatus ); + + if ( request ) + { + for ( i = 0; i < CHUNKSIZE; i++) + { + randNums[i] = random(); + } + ierr = MPI_Send ( randNums, CHUNKSIZE, MPI_INT, + mesgStatus.MPI_SOURCE, RANDOM_NUMBERS, MPI_COMM_WORLD ); + } + } while ( 0 < request ); + + } +/* + I am a worker process. +*/ + else + { + request = 1; + done = in = out = 0; + max = 2147483647; + + ierr = MPI_Send ( &request, 1, MPI_INT, randServer, NEED_NUMBERS, + MPI_COMM_WORLD ); +/* + Request a string of random numbers. 
+*/ + while (!done) + { + request = 1; + ierr = MPI_Recv ( randNums, CHUNKSIZE, MPI_INT, randServer, + RANDOM_NUMBERS, MPI_COMM_WORLD, &mesgStatus ); + + for ( i = 0; i < CHUNKSIZE; ) + { + x = ( ( float ) randNums[i++] ) / max; + y = ( ( float ) randNums[i++] ) / max; + + if ( x * x + y * y < 1.0 ) + { + in++; + } + else + { + out++; + } + + } + + temp = in; + ierr = MPI_Reduce ( &temp, &totalin, 1, MPI_INT, MPI_SUM, 0, workers ); +/* + Count total of ins. +*/ + temp = out; + ierr = MPI_Reduce ( &temp, &totalout, 1, MPI_INT, MPI_SUM, 0, workers ); +/* + Count total of outs. +*/ + if ( my_id == 0 ) + { + calculatedPi = ( 4.0 * totalin ) / ( totalin + totalout ); + error = fabs ( calculatedPi - 3.141592653589793238462643 ); + done = ( error < tolerance ) || point_max <= ( totalin + totalout ); + printf( "pi = %23.20lf\n", calculatedPi ); + + if ( done ) + { + request = 0; + } + else + { + request = 1; + } + + ierr = MPI_Send ( &request, 1, MPI_INT, randServer, NEED_NUMBERS, + MPI_COMM_WORLD ); + + ierr = MPI_Bcast ( &done, 1, MPI_INT, 0, workers ); + } + else + { + ierr = MPI_Bcast ( &done, 1, MPI_INT, 0, workers ); + + if ( !done ) + { + request = 1; + ierr = MPI_Send ( &request, 1, MPI_INT, randServer, NEED_NUMBERS, + MPI_COMM_WORLD ); + } + } + } + } + + if ( my_id == 0 ) + { + printf( "\npoints: %d\nin: %d, out: %d\n", totalin + totalout, totalin, + totalout ); + + wtime = MPI_Wtime ( ) - wtime; + printf ( "\n" ); + printf ( " Elapsed wallclock time = %g seconds.\n", wtime ); + } +/* + Terminate MPI. +*/ + ierr = MPI_Finalize(); +/* + Terminate. 
+*/ + if ( my_id == 0 ) + { + printf ( "\n" ); + printf ( "MONTE_CARLO - Master process:\n" ); + printf ( " Normal end of execution.\n" ); + printf ( "\n" ); + timestamp ( ); + } + return 0; +} +/******************************************************************************/ + +void timestamp ( void ) + +/******************************************************************************/ +/* + Purpose: + + TIMESTAMP prints the current YMDHMS date as a time stamp. + + Example: + + 31 May 2001 09:45:54 AM + + Licensing: + + This code is distributed under the GNU LGPL license. + + Modified: + + 24 September 2003 + + Author: + + John Burkardt + + Parameters: + + None +*/ +{ +# define TIME_SIZE 40 + + static char time_buffer[TIME_SIZE]; + const struct tm *tm; + time_t now; + + now = time ( NULL ); + tm = localtime ( &now ); + + strftime ( time_buffer, TIME_SIZE, "%d %B %Y %I:%M:%S %p", tm ); + + printf ( "%s\n", time_buffer ); + + return; +# undef TIME_SIZE +} + Added: SwiftApps/cetustukey/pi.swift =================================================================== --- SwiftApps/cetustukey/pi.swift (rev 0) +++ SwiftApps/cetustukey/pi.swift 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,24 @@ +type file; +type exe; + +app (file _o) compute_pi (exe _pi) +{ + runpi @_pi stdout=@_o; +} + +app (file _o) avg_pi (exe _avg, file[] _i) +{ + bash @_avg @_i stdout=@_o; +} + +exe avg<"avg">; +exe pi<"pi">; + +file pival<"ans.txt">; +file out[]; +foreach i in [0:9] { + out[i] = compute_pi(pi); +} + +pival=avg_pi(avg, out); + Added: SwiftApps/cetustukey/runct =================================================================== --- SwiftApps/cetustukey/runct (rev 0) +++ SwiftApps/cetustukey/runct 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,4 @@ +#! 
/bin/sh + +/home/ketan/swift-k/dist/swift-svn/bin/swift -config ct.conf -reducedLogging -minimalLogging pi.swift + Property changes on: SwiftApps/cetustukey/runct ___________________________________________________________________ Added: svn:executable + * Added: SwiftApps/cetustukey/runlocal =================================================================== --- SwiftApps/cetustukey/runlocal (rev 0) +++ SwiftApps/cetustukey/runlocal 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,3 @@ +#!/bin/bash + +/homes/ketan/swift-k/dist/swift-svn/bin/swift -config local.conf pi.swift Property changes on: SwiftApps/cetustukey/runlocal ___________________________________________________________________ Added: svn:executable + * Added: SwiftApps/cetustukey/runpi =================================================================== --- SwiftApps/cetustukey/runpi (rev 0) +++ SwiftApps/cetustukey/runpi 2015-05-05 16:12:11 UTC (rev 8454) @@ -0,0 +1,3 @@ +#!/bin/bash + +runjob --block $COBALT_PARTNAME : "$@" Property changes on: SwiftApps/cetustukey/runpi ___________________________________________________________________ Added: svn:executable + * Modified: SwiftApps/subjobs/bg.sh =================================================================== --- SwiftApps/subjobs/bg.sh 2015-05-04 18:53:42 UTC (rev 8453) +++ SwiftApps/subjobs/bg.sh 2015-05-05 16:12:11 UTC (rev 8454) @@ -86,14 +86,14 @@ CORNER=${SWIFT_SUBBLOCK_ARRAY[$SWIFT_JOB_SLOT]} #Some logging - processedargs=$(echo "$@" | cut -d" " -f 1) + #processedargs=$(echo "$@" | cut -d" " -f 1) echo "$0": running BLOCK="$COBALT_PARTNAME" SLOT="$SWIFT_JOB_SLOT" echo "$0": running cmd: "$0" args: "$@" - echo "$0": runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$processedargs" + echo "$0": runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$@" #without timeout #runjob --strace none --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 16 --np "$((16*$SUBBLOCK_SIZE))" : 
"$@" - runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$processedargs" + runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$@" echo "Runjob finished." fi From yadunandb at ci.uchicago.edu Thu May 7 17:39:23 2015 From: yadunandb at ci.uchicago.edu (yadunandb at ci.uchicago.edu) Date: Thu, 7 May 2015 17:39:23 -0500 (CDT) Subject: [Swift-commit] r8455 - www/main Message-ID: <20150507223923.DC0769D0DD@svn.ci.uchicago.edu> Author: yadunandb Date: 2015-05-07 17:39:23 -0500 (Thu, 07 May 2015) New Revision: 8455 Modified: www/main/index.php Log: Updates to main page for 0.96 Modified: www/main/index.php =================================================================== --- www/main/index.php 2015-05-05 16:12:11 UTC (rev 8454) +++ www/main/index.php 2015-05-07 22:39:23 UTC (rev 8455) @@ -60,13 +60,13 @@
- Join the Swift community or contact us at info at swift-lang.org
+ Join the Swift community or contact us at swift-user mailing list
Try Swift code examples right from your browser.
Try the tutorial and start using Swift today!
- 0.94.1 current version 2013/09/30
+ 0.96.0 current version 2015/05/06
From ketan at ci.uchicago.edu Fri May 8 10:07:35 2015 From: ketan at ci.uchicago.edu (ketan at ci.uchicago.edu) Date: Fri, 8 May 2015 10:07:35 -0500 (CDT) Subject: [Swift-commit] r8456 - SwiftApps/subjobs Message-ID: <20150508150735.5FABD1782B1@svn.ci.uchicago.edu> Author: ketan Date: 2015-05-08 10:07:34 -0500 (Fri, 08 May 2015) New Revision: 8456 Added: SwiftApps/subjobs/vaspbg.sh Modified: SwiftApps/subjobs/bg.sh Log: adding vasp related script Modified: SwiftApps/subjobs/bg.sh =================================================================== --- SwiftApps/subjobs/bg.sh 2015-05-07 22:39:23 UTC (rev 8455) +++ SwiftApps/subjobs/bg.sh 2015-05-08 15:07:34 UTC (rev 8456) @@ -89,11 +89,11 @@ #processedargs=$(echo "$@" | cut -d" " -f 1) echo "$0": running BLOCK="$COBALT_PARTNAME" SLOT="$SWIFT_JOB_SLOT" echo "$0": running cmd: "$0" args: "$@" - echo "$0": runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$@" + echo "$0": runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 16 --np "$((16*$SUBBLOCK_SIZE))" : "$@" #without timeout #runjob --strace none --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 16 --np "$((16*$SUBBLOCK_SIZE))" : "$@" - runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 1 : "$@" + runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 16 --np "$((16*$SUBBLOCK_SIZE))" : "$@" echo "Runjob finished." fi Added: SwiftApps/subjobs/vaspbg.sh =================================================================== --- SwiftApps/subjobs/vaspbg.sh (rev 0) +++ SwiftApps/subjobs/vaspbg.sh 2015-05-08 15:07:34 UTC (rev 8456) @@ -0,0 +1,124 @@ +#!/bin/bash + +#set -x + +mname=$(hostname) + +incarfile=$2 +poscarfile=$3 +potcarfile=$4 +kpointsfile=$5 + +outcarfile=$6 +contcarfile=$7 + +cp $incarfile . +cp $poscarfile . +cp $potcarfile . +cp $kpointsfile . 
+ +# vesta and mira has different path than cetus +if [[ $mname == *vesta* || $mname == *mira* ]] +then + export PATH=/soft/cobalt/bgq_hardware_mapper:$PATH +else + export PATH=/soft/cobalt/cetus/bgq_hardware_mapper:$PATH +fi + +#Run the preprocessing script +#/bin/bash $preproc "$@" + +#export SUBBLOCK_SIZE=16 + +# Prepare shape based on subblock size +# provided by user in sites environment +case "$SUBBLOCK_SIZE" in +1) SHAPE="1x1x1x1x1" +;; +8) SHAPE="1x2x2x2x1" +;; +16) SHAPE="2x2x2x2x1" +;; +32) SHAPE="2x2x2x2x2" +;; +64) SHAPE="2x2x4x2x2" +;; +128) SHAPE="2x4x4x2x2" +;; +256) SHAPE="2x4x4x4x2" +;; +512) SHAPE="4x4x4x4x2" +;; +*) echo "SUBBLOCK_SIZE not set or incorrectly set: will not use subblock jobs" +;; +esac + +# If subblock size is provided, do subblock business +if [ "$SUBBLOCK_SIZE"_ != "_" ] +then + # sub-block size larger than 512 nodes, currently untested + if [ "$SUBBLOCK_SIZE" -gt 512 ] + then + export SWIFT_SUBBLOCKS=$(get-bootable-blocks --size $SUBBLOCK_SIZE $COBALT_PARTNAME) + export SWIFT_SUBBLOCK_ARRAY=($SWIFT_SUBBLOCKS) + + if [ "_$SWIFT_SUBBLOCKS" = _ ]; then + echo ERROR: "$0": SWIFT_SUBBLOCKS is null. + exit 1 + fi + BLOCK=${SWIFT_SUBBLOCK_ARRAY[$SWIFT_JOB_SLOT]} + + #Some logging + echo "$0": running BLOCK="$BLOCK" SLOT="$SWIFT_JOB_SLOT" + echo "$0": running cmd: "$0" args: "$@" + echo "$0": running runjob --block "$BLOCK" : "$@" + + boot-block --block $BLOCK + runjob --block $BLOCK : "$@" + boot-block --block $BLOCK --free + + echo "Runjob finished" + + else + export SWIFT_SUBBLOCKS=$(get-corners.py "$COBALT_PARTNAME" $SHAPE) + export SWIFT_SUBBLOCK_ARRAY=($SWIFT_SUBBLOCKS) + + if [ "_$SWIFT_SUBBLOCKS" = _ ]; then + echo ERROR: "$0": SWIFT_SUBBLOCKS is null. 
+ exit 1 + fi + + nsb=${#SWIFT_SUBBLOCK_ARRAY[@]} + + CORNER=${SWIFT_SUBBLOCK_ARRAY[$SWIFT_JOB_SLOT]} + + #Some logging + #processedargs=$(echo "$@" | cut -d" " -f 1) + echo "$0": running BLOCK="$COBALT_PARTNAME" SLOT="$SWIFT_JOB_SLOT" + echo "$0": running cmd: "$0" args: "$@" + echo "$0": runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 8 --np "$((8*$SUBBLOCK_SIZE))" : "$@" + + #without timeout + #runjob --strace none --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 16 --np "$((16*$SUBBLOCK_SIZE))" : "$@" + runjob --block "$COBALT_PARTNAME" --corner "$CORNER" --shape "$SHAPE" -p 8 --np "$((8*$SUBBLOCK_SIZE))" : $1 + + echo "Runjob finished." + fi +else + # run w/o subblocks if no subblock size provided + echo "Running in nonsubblock mode." + echo "$0": running runjob -p 16 --block $COBALT_PARTNAME : "$@" + + #strace -o "$HOME/strace.runjob.out" runjob --strace none -p 16 --block $COBALT_PARTNAME : "$@" + runjob -p 16 --block $COBALT_PARTNAME : "$@" + + echo "Finished Running in nonsubblock mode." 
+fi + +#Run the postprocessing script +#/bin/bash $postproc "$@" +mv OUTCAR $outcarfile +mv CONTCAR $contcarfile + +exit 0 + Property changes on: SwiftApps/subjobs/vaspbg.sh ___________________________________________________________________ Added: svn:executable + * From yadunandb at ci.uchicago.edu Fri May 8 11:01:22 2015 From: yadunandb at ci.uchicago.edu (yadunandb at ci.uchicago.edu) Date: Fri, 8 May 2015 11:01:22 -0500 (CDT) Subject: [Swift-commit] r8457 - in www: docs downloads inc Message-ID: <20150508160122.3E8131782B1@svn.ci.uchicago.edu> Author: yadunandb Date: 2015-05-08 11:01:22 -0500 (Fri, 08 May 2015) New Revision: 8457 Modified: www/docs/index.php www/downloads/index.php www/inc/downloads_sidebar.php Log: Updating docs to reflect 0.96 release Modified: www/docs/index.php =================================================================== --- www/docs/index.php 2015-05-08 15:07:34 UTC (rev 8456) +++ www/docs/index.php 2015-05-08 16:01:22 UTC (rev 8457) @@ -29,7 +29,7 @@ -