[petsc-dev] Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
Satish Balay
balay at mcs.anl.gov
Sun Oct 9 23:35:34 CDT 2016
On Sun, 9 Oct 2016, Satish Balay wrote:
> On Sun, 9 Oct 2016, Antonio Trande wrote:
>
> > There is still a bad exit status of OpenMPI libs on *32bit architectures
> > only*:
> >
> > make: Entering directory
> > '/builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir'
> > Running test examples to verify correct installation
> > Using PETSC_DIR=/builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir and
> > PETSC_ARCH=i386
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI
> > process
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > [0]PETSC ERROR: #1 PetscOptionsInsert() line 717 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/options.c
> > [0]PETSC ERROR: #2 PetscInitialize() line 871 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/pinit.c
> > -------------------------------------------------------
> > Primary job terminated normally, but 1 process returned
> > a non-zero exit code.. Per user-direction, the job has been aborted.
> > -------------------------------------------------------
> > --------------------------------------------------------------------------
> > mpiexec detected that one or more processes exited with non-zero status,
> > thus causing
> > the job to be terminated. The first process to do so was:
> > Process name: [[16865,1],0]
> > Exit code: 1
> > --------------------------------------------------------------------------
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI
> > processes
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > [0]PETSC ERROR: #1 PetscOptionsInsert() line 717 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/options.c
> > [0]PETSC ERROR: #2 PetscInitialize() line 871 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/pinit.c
> > [1]PETSC ERROR: #1 PetscOptionsInsert() line 717 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/options.c
> > [1]PETSC ERROR: #2 PetscInitialize() line 871 in
> > /builddir/build/BUILD/petsc-3.7.4/buildopenmpi_dir/src/sys/objects/pinit.c
> > -------------------------------------------------------
> > Primary job terminated normally, but 1 process returned
> > a non-zero exit code.. Per user-direction, the job has been aborted.
> > -------------------------------------------------------
> > --------------------------------------------------------------------------
> > mpiexec detected that one or more processes exited with non-zero status,
> > thus causing
> > the job to be terminated. The first process to do so was:
> > Process name: [[16891,1],0]
> > Exit code: 1
> > --------------------------------------------------------------------------
> > Fortran example src/snes/examples/tutorials/ex5f run successfully with 1
> > MPI process
> > Completed test examples
> > =========================================
> >
> > Full build log (32bit):
> > https://copr-be.cloud.fedoraproject.org/results/sagitter/petsc/fedora-rawhide-i386/00462843-petsc/build.log.gz
> >
> > Full build log (64bit):
> > https://copr-be.cloud.fedoraproject.org/results/sagitter/petsc/fedora-rawhide-x86_64/00462843-petsc/build.log.gz
> >
> >
>
> >>>>>>>>
> sundials:
> Includes: -I/usr/include/openmpi-i386
> Library: -L/usr/lib/openmpi/lib -lsundials_nvecparallel -L/usr/lib -lsundials_cvode
> <<<<<<<
>
> Don't know if this is the reason - but I would avoid -L/usr/lib in
> here [and let the compiler locate it on its own].
>
> Also, -L/usr/lib/openmpi/lib and -I/usr/include/openmpi-i386 are in the
> default compiler search path for
> /usr/lib/openmpi/bin/mpicc/mpicxx/mpif90 - so I would not specify
> them either.
Oops - looks like -I/usr/include/openmpi-i386 can be skipped - but not -L/usr/lib/openmpi/lib
balay@thrash^~/junk/mpi $ mpicc.mpich -show cpi.c -lm
cc -D_FORTIFY_SOURCE=2 -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security -Wl,-Bsymbolic-functions -Wl,-z,relro cpi.c -lm -I/usr/include/mpich -L/usr/lib/x86_64-linux-gnu -lmpich -lopa -lmpl -lrt -lcr -lpthread
balay@thrash^~/junk/mpi $ mpicc.openmpi -show cpi.c -lm
gcc cpi.c -lm -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi -pthread -L/usr//lib -L/usr/lib/openmpi/lib -lmpi -ldl -lhwloc
balay@thrash^~/junk/mpi $
Oh well..
Satish
>
> >>>>>>>
> BLAS/LAPACK: -lopenblas -lopenblas
> <<<<<<<
>
> BTW: you could replace '--with-blas-lib=libopenblas.so --with-lapack-lib=libopenblas.so' with a single '--with-blas-lapack-lib=libopenblas.so'
> to avoid this duplicate listing.
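>
> For example, the relevant part of the configure command would change roughly as follows [a sketch - the exact openblas library name/path on the build machine may differ]:
>
>   old:  --with-blas-lib=libopenblas.so --with-lapack-lib=libopenblas.so
>   new:  --with-blas-lapack-lib=libopenblas.so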
>
> Also - you have '--with-64-bit-indices=0 ' configure option listed twice. But this should not cause any harm..
>
> And I see the sequential build also crashing. Perhaps you can run this under valgrind again - to see what
> the issue is. [that line corresponds to a call to MPI_Comm_rank() - and I don't see any reason why this should not work]
> A sample valgrind invocation is sketched after the error trace below.
>
> [0]PETSC ERROR: #1 PetscOptionsInsert() line 717 in /builddir/build/BUILD/petsc-3.7.4/petsc-3.7.4/src/sys/objects/options.c
> [0]PETSC ERROR: #2 PetscInitialize() line 871 in /builddir/build/BUILD/petsc-3.7.4/petsc-3.7.4/src/sys/objects/pinit.c
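>
> Something like this should narrow it down [an illustrative invocation - assuming PETSC_DIR/PETSC_ARCH are set as in the test run; for the sequential build run plain './ex19' under valgrind instead]:
>
>   cd src/snes/examples/tutorials
>   make ex19
>   mpiexec -n 1 valgrind --tool=memcheck -q ./ex19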
>
>
>
> Satish
>