[petsc-users] test failed on 2 processes and segmentation fault while initializing in deal.II

Satish Balay balay at mcs.anl.gov
Thu Aug 4 21:27:19 CDT 2011


On Fri, 5 Aug 2011, huyaoyu wrote:

> Thanks to Satish Balay and Jed Brown. Your advice has been very helpful!
> 
> I removed the petsc directory and the deal.II directory, and rebuilt
> PETSc and deal.II against the MPI installed system wide. The
> configuration lines are as follows:
> 
> For PETSc:
> ./config/configure.py --with-cc=/usr/bin/mpicc --with-fc=/usr/bin/mpif90
> --download-f-blas-lapack=1 --with-shared
> 
> For deal.II:
> ./configure --enable-shared --disable-threads --with-petsc=$PETSC_DIR
> --with-petsc-arch=$(PETSC_ARCH) --with-p4est=PATH-TO-P4EST --with-mpi
> 
> I will use p4est for grid distribution, and I checked the configure
> output of deal.II. It says deal.II will use /usr/bin/mpicc for the CC
> variable and /usr/bin/mpiCC for the CXX variable, and reports:
> =================deal.II configure output=========================
> checking for PETSc library
> directory... /home/huyaoyu/Downloads/petsc-3.1-p8
> checking for PETSc version... 3.1.0
> checking for PETSc library architecture... linux-gnu-c-debug
> checking for PETSc libmpiuni library... not found
> checking for consistency of PETSc and deal.II MPI settings... yes
> checking for PETSc scalar complex... no
> ================= end of deal.II configure output=================
> 
> After compiling PETSc and deal.II, I tried to build the deal.II
> example program that uses PETSc, and everything works well. No
> segmentation fault any more! Great!
> 
> However, the PETSc test triggers the same error when it tries to run
> on 2 processes. Is it because I am using PETSc on a single machine
> rather than on a cluster of computers?

You mean the compiler warning for the Fortran example in 'make test'?

You can ignore that.

Satish
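
For anyone who still wants to double-check the parallel build itself on a
single machine, here is a minimal sketch of a 2-process sanity test. It
assumes PETSc 3.1's C interface; the file name check.c, the build flags in
the comment, and the mpiexec line are illustrative, not the official test.

  /* check.c - minimal PETSc program to confirm the parallel build works.
     Build with the same mpicc wrapper used to configure PETSc, e.g.
         mpicc check.c -o check -I$PETSC_DIR/include \
               -I$PETSC_DIR/$PETSC_ARCH/include \
               -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc
     and run with: mpiexec -n 2 ./check                                  */
  #include "petsc.h"

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank;

    ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_SELF, "Hello from rank %d\n", rank);CHKERRQ(ierr);
    ierr = PetscFinalize();CHKERRQ(ierr);
    return 0;
  }

If both ranks print their greeting, the PETSc/MPI installation is working
on the single machine and the 'make test' message really is only the
compiler warning mentioned above.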

> 
> Again, thanks to Satish Balay and Jed Brown, you are great!
> 
> HuYaoyu.
> 
> > > --------------Error detected during compile or
> > > link!-----------------------
> > > See
> > > http://www.mcs.anl.gov/petsc/petsc-2/documentation/troubleshooting.html
> > > /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/bin/mpif90 -c
> > > -fPIC  -Wall -Wno-unused-variable -g
> > > -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> > > -I/home/huyaoyu/Downloads/petsc-3.1-p8/include
> > > -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> > > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/lib
> > > -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> > > -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> > > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/lib    -o ex5f.o ex5f.F
> > > ex5f.F:92.72:
> > >
> > >      call PetscOptionsGetReal(PETSC_NULL_CHARACTER,'-par',lambda,
> > >
> > > 1
> > > Warning: Line truncated at (1)
> > >
> > 
> > This is harmless; the Fortran compiler is emitting a spurious warning. The
> > problem has been fixed in gcc-4.6.
> > 
> > 
> > > huyaoyu at ubuntu:~/Downloads/deal.II-non-threads/examples/step-17
> > > $ ./step-17
> > > [ubuntu:23497] *** Process received signal ***
> > > [ubuntu:23497] Signal: Segmentation fault (11)
> > > [ubuntu:23497] Signal code: Address not mapped (1)
> > > [ubuntu:23497] Failing at address: 0x44000098
> > > [ubuntu:23497] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0xfc60)
> > > [0x7f111b24ac60]
> > > [ubuntu:23497] [ 1] /usr/lib/libmpi.so.0(MPI_Comm_rank+0x5e)
> > > [0x7f111adbf3ce]
> > > [ubuntu:23497]
> > > [ 2]
> > > /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib/libpetsc.so(PetscInitialize+0x5d1)
> > > [0x7f11190b97df]
> > > [ubuntu:23497] [ 3] ./step-17(main+0x3b) [0x42539f]
> > > [ubuntu:23497] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main
> > > +0xff) [0x7f1119793eff]
> > > [ubuntu:23497] [ 5] ./step-17() [0x4252a9]
> > > [ubuntu:23497] *** End of error message ***
> > > Segmentation fault
> > >
> > > After debugging the deal.II program I found that it fails at the
> > > following line
> > >
> > > pinit.c line 578:
> > > ierr = MPI_Comm_rank(MPI_COMM_WORLD,&PetscGlobalRank);CHKERRQ(ierr);
> > >
> > 
> > Can you send the output of "ldd dealii-example"?
> > 
> > This normally happens when the MPI linked at run time is different from
> > the one the code was compiled with (or when it is inconsistent between
> > PETSc and deal.II).
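
To narrow down where such a mismatch is, one option is a plain MPI program
that makes the same MPI_Comm_rank call with no PETSc or deal.II involved.
A minimal sketch (the file name mpicheck.c is illustrative), compiled with
the same /usr/bin/mpicc that PETSc was configured with:

  /* mpicheck.c - bare MPI test, no PETSc or deal.II involved.
     Compile: /usr/bin/mpicc mpicheck.c -o mpicheck
     Run:     mpiexec -n 2 ./mpicheck
     If this also crashes in MPI_Comm_rank, the MPI installation itself
     (or a library mismatch visible in 'ldd ./mpicheck') is the problem. */
  #include <stdio.h>
  #include <mpi.h>

  int main(int argc, char **argv)
  {
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* same call that fails inside PetscInitialize */
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
  }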


