[petsc-users] Issue when calling multiple processors from PETSc
Ajit Desai
ajit.ndesai at gmail.com
Thu Dec 24 10:16:40 CST 2015
Hello everyone,
I am a new user of PETSc, trying to get familiar with it for parallel
computing applications.
I wrote a simple hello world code using the PETSc environment and found an
issue with the output. For instance:
When I compile and execute the "PETSc_helloWorld.F90" code with multiple
processors, it prints the following output:
MacUser$ mpiexec -np 4 ./PETSc_helloWorld
 Hello from PETSc World, rank: 0 of total 1 processes.
 Hello from PETSc World, rank: 0 of total 1 processes.
 Hello from PETSc World, rank: 0 of total 1 processes.
 Hello from PETSc World, rank: 0 of total 1 processes.
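The PETSc code is essentially a minimal hello world along the following
lines (a sketch of the scrubbed PETSc_helloWorld.F90 attachment, assuming
PETSc 3.6-style Fortran includes; variable names are illustrative and the
actual attached source may differ slightly):

program PETSc_helloWorld
  implicit none
#include <petsc/finclude/petscsys.h>
  PetscErrorCode :: ierr
  PetscMPIInt    :: rank, nproc

  ! PetscInitialize calls MPI_Init internally unless MPI is already running
  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
  call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(PETSC_COMM_WORLD, nproc, ierr)
  write(*,'(A,I0,A,I0,A)') ' Hello from PETSc World, rank: ', rank, &
       ' of total ', nproc, ' processes.'
  call PetscFinalize(ierr)
end program PETSc_helloWorld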
Similar code written with plain MPI prints the following output:
MacUser$ mpiexec -np 4 ./a.out
 Hello from MPI World, rank: 0 of total 4 processes.
 Hello from MPI World, rank: 1 of total 4 processes.
 Hello from MPI World, rank: 2 of total 4 processes.
 Hello from MPI World, rank: 3 of total 4 processes.
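The plain-MPI version is the usual minimal program (again a sketch of the
scrubbed MPI_helloWorld.f90 attachment; variable names are illustrative):

program MPI_helloWorld
  use mpi
  implicit none
  integer :: ierr, rank, nproc

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nproc, ierr)
  write(*,'(A,I0,A,I0,A)') ' Hello from MPI World, rank: ', rank, &
       ' of total ', nproc, ' processes.'
  call MPI_Finalize(ierr)
end program MPI_helloWorld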
I am not sure whether this is an issue with the PETSc installation on my
machine, or whether I am missing something here.
I have attached both codes and the PETSc log_summary for your convenience.
Thanks & Regards,
Ajit Desai
--
Carleton University
Ottawa, Canada
-------------- next part --------------
************************************************************************************************************************
*** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document ***
************************************************************************************************************************
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
./PETSc_helloWorld on a arch-darwin-c-opt named Ajits-MacBook-Pro.local with 1 processor, by ajit Thu Dec 24 11:13:35 2015
Using Petsc Release Version 3.6.3, Dec, 03, 2015
                          Max       Max/Min      Avg        Total
Time (sec):           1.687e-04    1.00000   1.687e-04
Objects:              1.000e+00    1.00000   1.000e+00
Flops:                0.000e+00    0.00000   0.000e+00   0.000e+00
Flops/sec:            0.000e+00    0.00000   0.000e+00   0.000e+00
MPI Messages:         0.000e+00    0.00000   0.000e+00   0.000e+00
MPI Message Lengths:  0.000e+00    0.00000   0.000e+00   0.000e+00
MPI Reductions:       0.000e+00    0.00000
Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
e.g., VecAXPY() for real vectors of length N --> 2N flops
and VecAXPY() for complex vectors of length N --> 8N flops
Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                       Avg     %Total     Avg     %Total   counts   %Total     Avg        %Total    counts   %Total
 0:      Main Stage: 1.5859e-04  94.0%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00       0.0%  0.000e+00   0.0%
------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
Count: number of times phase was executed
Time and Flops: Max - maximum over all processors
Ratio - ratio of maximum to minimum over all processors
Mess: number of messages sent
Avg. len: average message length (bytes)
Reduct: number of global reductions
Global: entire computation
Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
%T - percent time in this phase %F - percent flops in this phase
%M - percent messages in this phase %L - percent message lengths in this phase
%R - percent reductions in this phase
Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event Count Time (sec) Flops --- Global --- --- Stage --- Total
Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
--- Event Stage 0: Main Stage
------------------------------------------------------------------------------------------------------------------------
Memory usage is given in bytes:
Object Type Creations Destructions Memory Descendants' Mem.
Reports information only for process 0.
--- Event Stage 0: Main Stage
Viewer 1 0 0 0
========================================================================================================================
Average time to get PetscTime(): 3.11e-08
#PETSc Option Table entries:
-log_summary
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --with-debugging=no
-----------------------------------------
Libraries compiled on Mon Dec 14 13:03:33 2015 on Ajits-MacBook-Pro.local
Machine characteristics: Darwin-15.2.0-x86_64-i386-64bit
Using PETSc directory: /Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3
Using PETSc arch: arch-darwin-c-opt
-----------------------------------------
Using C compiler: /Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/bin/mpif90 -fPIC -Wall -Wno-unused-variable -ffree-line-length-0 -Wno-unused-dummy-argument -O ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/include -I/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/include -I/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/include -I/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/include -I/opt/X11/include -I/opt/local/include
-----------------------------------------
Using C linker: /Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/bin/mpicc
Using Fortran linker: /Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/bin/mpif90
Using libraries: -Wl,-rpath,/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -L/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -lpetsc -Wl,-rpath,/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -L/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -lflapack -lfblas -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib -lX11 -Wl,-rpath,/opt/local/lib -L/opt/local/lib -lhwloc -Wl,-rpath,/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/7.0.2/lib/darwin -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/7.0.2/lib/darwin -lmpifort -lgfortran -Wl,-rpath,/usr/local/gfortran/lib/gcc/x86_64-apple-darwin14/4.9.2 -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin14/4.9.2 -Wl,-rpath,/usr/local/gfortran/lib -L/usr/local/gfortran/lib -lgfortran -lgcc_ext.10.5 -lquadmath -lm -lclang_rt.osx -lmpicxx -lc++ -Wl,-rpath,/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/7.0.2/lib/darwin -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/7.0.2/lib/darwin -lclang_rt.osx -Wl,-rpath,/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -L/Users/ajit/Downloads/SOFTWAREs/petsc-3.6.3/arch-darwin-c-opt/lib -ldl -lmpi -lpmpi -lSystem -Wl,-rpath,/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/7.0.2/lib/darwin -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/7.0.2/lib/darwin -lclang_rt.osx -ldl
-----------------------------------------
-------------- next part --------------
A non-text attachment was scrubbed...
Name: makefile
Type: application/octet-stream
Size: 434 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20151224/8844f42a/attachment-0003.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: PETSc_helloWorld.F90
Type: application/octet-stream
Size: 621 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20151224/8844f42a/attachment-0004.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: MPI_helloWorld.f90
Type: application/octet-stream
Size: 571 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20151224/8844f42a/attachment-0005.obj>