[petsc-users] test failed on 2 processes and segmentation fault while initializing in deal.II

Satish Balay balay at mcs.anl.gov
Thu Aug 4 10:24:20 CDT 2011


On Thu, 4 Aug 2011, huyaoyu wrote:

> Hi everyone,
> 
> I am new to PETSc. I need PETSc because another package, deal.II,
> relies on it. The computer runs Ubuntu 11.04 64-bit, and the deal.II
> version is 7.0.
> 
> So I downloaded the PETSc source code (3.1-p8) and, after setting the
> environment variables, ran the configuration like this:
> ./conf/configure.py --with-cc=gcc --with-fc=gfortran
> -download-f-blas-lapack=1 -download-mpich=1 --with-shared
> 
> The configuration seemed to go well.
> 
> Then I invoked the command:
> make all test
> 
> But the test log printed to the shell showed that a test failed when
> it tried to run with 2 processes.
> =========================================
> Running test examples to verify correct installation
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1
> MPI process
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2
> MPI processes
> --------------Error detected during compile or
> link!-----------------------
> See
> http://www.mcs.anl.gov/petsc/petsc-2/documentation/troubleshooting.html
> /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/bin/mpif90 -c
> -fPIC  -Wall -Wno-unused-variable -g
> -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> -I/home/huyaoyu/Downloads/petsc-3.1-p8/include
> -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> -I/usr/lib/openmpi/include -I/usr/lib/openmpi/lib
> -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> -I/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/include
> -I/usr/lib/openmpi/include -I/usr/lib/openmpi/lib    -o ex5f.o ex5f.F
> ex5f.F:92.72:
> 
>       call PetscOptionsGetReal(PETSC_NULL_CHARACTER,'-par',lambda,      
> 
> 1
> Warning: Line truncated at (1)
> 
> (more warnings like the above omitted)
> 
> Then:
> 
> /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/bin/mpif90 -fPIC
> -Wall -Wno-unused-variable -g  -o ex5f ex5f.o
> -Wl,-rpath,/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib
> -L/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib -lpetsc
> -lX11
> -Wl,-rpath,/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib
> -L/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib -lflapack
> -lfblas -lnsl -lrt
> -Wl,-rpath,/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib
> -L/home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib
> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib
> -Wl,-rpath,/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2
> -L/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2
> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl
> -lmpich -lrt -lmpi -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s -lpthread
> -lmpichf90 -lmpi_f90 -lmpi_f77 -lgfortran -lm -lm -lm -lm -ldl -lmpich
> -lrt -lmpi -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s -lpthread -ldl 
> /bin/rm -f ex5f.o
> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1
> MPI process
> Completed test examples
> ==================================================================
> 
> I just ignored this, because PETSc working with 1 process is all I
> need for now: 1 process is enough for deal.II to run on my computer.
> (deal.II may work on a cluster, but I only have one computer right
> now, and I just want to try some particular deal.II tutorial programs
> that are normally run on a cluster.) Then, after rebuilding deal.II
> with PETSc support, I ran the deal.II program that uses the PETSc
> library, and I got the following segmentation fault:
> 
> huyaoyu at ubuntu:~/Downloads/deal.II-non-threads/examples/step-17
> $ ./step-17 
> [ubuntu:23497] *** Process received signal ***
> [ubuntu:23497] Signal: Segmentation fault (11)
> [ubuntu:23497] Signal code: Address not mapped (1)
> [ubuntu:23497] Failing at address: 0x44000098
> [ubuntu:23497] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0xfc60)
> [0x7f111b24ac60]
> [ubuntu:23497] [ 1] /usr/lib/libmpi.so.0(MPI_Comm_rank+0x5e)

^^^^^^^^^^^^ Looks like this is the system-installed MPI (Open MPI) - not the MPI built by --download-mpich.
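
The failing address is consistent with mixed MPI implementations:
MPICH's mpi.h defines MPI_COMM_WORLD as the integer handle 0x44000000
(the value gdb reports below), while Open MPI's MPI_COMM_WORLD is a
pointer to a structure, so an MPICH handle passed into Open MPI's
MPI_Comm_rank gets dereferenced and faults at an unmapped address like
0x44000098. A quick way to check which MPI each piece actually links
against (assuming ldd is available on the machine) is:

ldd /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib/libpetsc.so | grep -i mpi
ldd ./step-17 | grep -i mpi

If one of them pulls in libmpich and the other libmpi (Open MPI), the
two implementations are being mixed in one executable.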

> [0x7f111adbf3ce]
> [ubuntu:23497]
> [ 2] /home/huyaoyu/Downloads/petsc-3.1-p8/linux-gnu-c-debug/lib/libpetsc.so(PetscInitialize+0x5d1) [0x7f11190b97df]
> [ubuntu:23497] [ 3] ./step-17(main+0x3b) [0x42539f]
> [ubuntu:23497] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main
> +0xff) [0x7f1119793eff]
> [ubuntu:23497] [ 5] ./step-17() [0x4252a9]
> [ubuntu:23497] *** End of error message ***
> Segmentation fault
> 
> After debugging the deal.II program, I found that it failed at the
> following line:
> 
> pinit.c line 578:
> ierr = MPI_Comm_rank(MPI_COMM_WORLD,&PetscGlobalRank);CHKERRQ(ierr);
> 
> gdb tells me that: 
> MPI_COMM_WORLD  = 1140850688 (0x44000000)
> PetscGlobalRank = -1
> And MPI_Comm_rank failed at a point where the disassembled instruction is
> "testb $0x90,0x98(%rbx)"
> 
> The deal.II program triggers the same error every time it runs.
> 
> Then I removed the whole PETSc directory, extracted the source files
> again, and configured PETSc with the following options:
> 
> --with-cc=mpicc --with-fc=mpif90 --download-f-blas-lapack=1
> --download-mpich=1 --with-shared
> 
> Again, the PETSc test failed with 2 processes exactly as before, and
> the deal.II program gave the segmentation fault at the identical
> place (pinit.c, ierr =
> MPI_Comm_rank(MPI_COMM_WORLD,&PetscGlobalRank);CHKERRQ(ierr);).
> 
> I searched the mailing list and found this thread:
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2008-July/003194.html
> It discusses a problem just like mine, and the conclusion was that a
> fresh re-installation of PETSc would solve it. But that solution does
> not work for me.
> 
> How can I fix this? It seems that the installation of PETSc may be
> wrong. I have mpich2 installed on the system; does that matter? I
> also tried to configure PETSc without the argument
> "-download-mpich=1" and then compiled, but the same test error and
> segmentation fault appeared.
> 
> Does it have something to do with a 64-bit issue? Or should I send an
> email to petsc-maint at mcs.anl.gov with the configure.log file
> attached?
> 
> Any help, thanks!

Avoid mixing multiple MPI implementations; use a single MPI for all packages [PETSc, deal.II, etc.].

So remove the current PETSc install that was built with MPICH and start
from scratch with the system-installed MPI.

i.e.
rm -rf linux-gnu-c-debug externalpackages
./configure --with-cc=mpicc --with-fc=mpif90 --download-f-blas-lapack=1 --with-shared
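
After that rebuild, "make all test" should run the examples again, and
(assuming the same directory layout as above) the rebuilt library should
now pick up the Open MPI libraries under /usr/lib/openmpi/lib, which you
can verify with:

ldd linux-gnu-c-debug/lib/libpetsc.so | grep -i mpi

deal.II will then also need to be reconfigured and rebuilt against this
PETSc build, so that the whole stack uses the single system MPI.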

Satish


> 
> Best,
> 
> HuYaoyu
> 
> 


