[petsc-users] petsc 3.2-p5 test error

Barry Smith bsmith at mcs.anl.gov
Fri Nov 11 15:46:26 CST 2011


   The code where it is crashing is very simple and works with every other MPI implementation we've run on. It is:

  /* reorder the ranks of comm into dupcomm */
  ierr = MPI_Comm_split(comm,0,duprank,&dupcomm);CHKERRQ(ierr);

  /* attach PETSc's inner communicator state to the new communicators */
  ierr = PetscCommDuplicate(dupcomm,&psubcomm->dupparent,PETSC_NULL);CHKERRQ(ierr);
  ierr = PetscCommDuplicate(subcomm,&psubcomm->comm,PETSC_NULL);CHKERRQ(ierr);

  /* freeing triggers the attribute delete callback (the Petsc_DelComm in your trace) */
  ierr = MPI_Comm_free(&dupcomm);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
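
  If you want to test this outside of PETSc, the sketch below (my own minimal test program, not code from the PETSc source) exercises the same MPI pattern: a communicator split plus an attribute with a delete callback, which is what the Petsc_DelComm in your trace is. If it misbehaves under IntelMPI but runs cleanly under MPICH, the fault is in the MPI implementation.

/* Minimal sketch: split a communicator, attach an attribute with a
   delete callback, and free the communicator.  The callback must be
   invoked exactly once per rank when MPI_Comm_free is called. */
#include <mpi.h>
#include <stdio.h>

static int delete_fn(MPI_Comm comm,int keyval,void *attr,void *extra)
{
  /* PETSc's Petsc_DelComm does real cleanup here; printing is enough
     to see whether the callback is invoked sanely */
  printf("delete callback invoked\n");
  return MPI_SUCCESS;
}

int main(int argc,char **argv)
{
  MPI_Comm dupcomm;
  int      rank,keyval,val = 42;

  MPI_Init(&argc,&argv);
  MPI_Comm_rank(MPI_COMM_WORLD,&rank);

  /* one color, rank as key: same pattern as the PETSc code above */
  MPI_Comm_split(MPI_COMM_WORLD,0,rank,&dupcomm);

  /* attach an attribute with a delete callback, as PetscCommDuplicate does */
  MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN,delete_fn,&keyval,NULL);
  MPI_Comm_set_attr(dupcomm,keyval,&val);

  /* freeing the communicator must trigger the delete callback */
  MPI_Comm_free(&dupcomm);
  MPI_Comm_free_keyval(&keyval);

  MPI_Finalize();
  return 0;
}

  Compile with mpicc and run with two processes (mpiexec -n 2 ./a.out); each rank should print the delete-callback line exactly once, with no errors.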

  So my first guess is that this is a bug in IntelMPI. What kind of system are you running on? Is the IntelMPI version you are using the most recent, with all patches applied? I would recommend rerunning ./configure with --download-mpich and checking whether the test then runs correctly in parallel; if it does, that points strongly to an IntelMPI problem.
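
  Based on the configure line in your message, that would look something like the following (the separate mpich prefix is just an illustrative choice, not a requirement):

./configure --prefix=/opt/petsc-3.2-p5/mpich --with-shared-libraries=1 \
    --with-blas-lapack-dir=/opt/intel/Compiler/11.0/074/mkl/lib/em64t \
    --download-mpich
make PETSC_DIR=/opt/petsc-3.2-p5/mpich test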

   Please send future emails on this issue to petsc-maint at mcs.anl.gov

   Barry

On Nov 11, 2011, at 3:09 PM, Mohamed Adel wrote:

> Dear all,
> 
> I'm trying to compile PETSc version 3.2-p5 with IntelMPI-3.2.
> The configuration and compilation go fine, but the test crashes with the following error.
> -------------------------------------------------------------------------------------------------
> $ make PETSC_DIR=/opt/petsc-3.2-p5/intel test
> Running test examples to verify correct installation
> Using PETSC_DIR=/opt/petsc-3.2-p5/intel and PETSC_ARCH=arch-linux2-c-debug
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html
> lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> [0]PETSC ERROR: Petsc_DelComm() line 430 in src/sys/objects/pinit.c
> [1]PETSC ERROR: Petsc_DelComm() line 430 in src/sys/objects/pinit.c
> [1]PETSC ERROR: PetscSubcommCreate_interlaced() line 288 in src/sys/objects/subcomm.c
> [1]PETSC ERROR: PetscSubcommSetType() line 71 in src/sys/objects/subcomm.c
> [1]PETSC ERROR: PCSetUp_Redundant() line 77 in src/ksp/pc/impls/redundant/redundant.c
> [1]PETSC ERROR: PCSetUp() line 819 in src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: KSPSetUp() line 260 in src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: PCSetUp_MG() line 678 in src/ksp/pc/impls/mg/mg.c
> [1]PETSC ERROR: PCSetUp() line 819 in src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: KSPSetUp() line 260 in src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: KSPSolve() line 379 in src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: SNES_KSPSolve() line 3396 in src/snes/interface/snes.c
> [1]PETSC ERROR: SNESSolve_LS() line 190 in src/snes/impls/ls/ls.c
> [1]PETSC ERROR: SNESSolve() line 2676 in src/snes/interface/snes.c
> [1]PETSC ERROR: DMMGSolveSNES() line 540 in src/snes/utils/damgsnes.c
> [1]PETSC ERROR: DMMGSolve() line 331 in src/snes/utils/damg.c
> [1]PETSC ERROR: main() line 160 in src/snes/examples/tutorials/ex19.c
> [cli_1]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 805896965) - process 1
> [0]PETSC ERROR: PetscSubcommCreate_interlaced() line 288 in src/sys/objects/subcomm.c
> [0]PETSC ERROR: PetscSubcommSetType() line 71 in src/sys/objects/subcomm.c
> [0]PETSC ERROR: PCSetUp_Redundant() line 77 in src/ksp/pc/impls/redundant/redundant.c
> [0]PETSC ERROR: PCSetUp() line 819 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUp() line 260 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: PCSetUp_MG() line 678 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCSetUp() line 819 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUp() line 260 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: KSPSolve() line 379 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: SNES_KSPSolve() line 3396 in src/snes/interface/snes.c
> [0]PETSC ERROR: SNESSolve_LS() line 190 in src/snes/impls/ls/ls.c
> [0]PETSC ERROR: SNESSolve() line 2676 in src/snes/interface/snes.c
> [0]PETSC ERROR: DMMGSolveSNES() line 540 in src/snes/utils/damgsnes.c
> [0]PETSC ERROR: DMMGSolve() line 331 in src/snes/utils/damg.c
> [0]PETSC ERROR: main() line 160 in src/snes/examples/tutorials/ex19.c
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 590597) - process 0
> rank 1 in job 50  login02.local_33989   caused collective abort of all ranks
>  exit status of rank 1: return code 5
> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
> Completed test example
> -------------------------------------------------------------------------------------------------
> 
> Any idea what might be wrong with the test?
> I configured PETSc with the following options:
> ./configure --prefix=/opt/petsc-3.2-p5/intel --with-shared-libraries=1 --with-blas-lapack-dir=/opt/intel/Compiler/11.0/074/mkl/lib/em64t
> 
> 
> thanks in advance,
> --ma


