[petsc-users] petsc code crash on AIX with POE

Satish Balay balay at mcs.anl.gov
Wed Nov 16 08:58:32 CST 2011


Perhaps you can build with --with-debugging=1 - and get a proper stack
trace with the debugger? [Use a different PETSC_ARCH for this build,
so that the current optimized build is untouched.]
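A minimal sketch of such a debug build (the PETSC_ARCH name here is just
illustrative; reuse the remaining options from your configure line quoted
below):

  ./configure PETSC_ARCH=arch-aix-debug --with-debugging=1 \
    --with-cc="mpcc_r -q64" --with-fc="mpxlf_r -q64" --with-batch=1 \
    [... the rest of your current configure options ...]
  make PETSC_ARCH=arch-aix-debug all

Running with PETSc's -start_in_debugger dbx runtime option may also help
attach the debugger to each task at startup.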

Also - does a sequential run with mpcc_r work?
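For example, assuming the usual POE launch syntax (the binary path is
taken from your dbx session below):

  poe ~/packages/genius/bin/genius.AIX -procs 1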

Satish

On Wed, 16 Nov 2011, Gong Ding wrote:

> Hi,
> I tried to compile my PETSc application (C++) on AIX 6.1 with POE, using IBM xlc (mpCC_r, in fact) on PPC6.
> The serial code (with MPIUNI) runs OK.
> However, the parallel code always crashes with the error message
> ERROR: 0031-250  task 0: IOT/Abort trap
> and a core file is dumped.
> 
> dbx gives little information about the core:
> bash-3.00$ dbx ~/packages/genius/bin/genius.AIX core
> Type 'help' for help.
> Core file "core" is older than current program (ignored)
> reading symbolic information ...
> (dbx) where
> ustart() at 0x9fffffff0000240
> 
> 
> PETSc is configured with:
> CONFIGURE_OPTIONS = \
>   --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 \
>   --known-level1-dcache-assoc=2 --known-memcmp-ok=1 --known-endian=big \
>   --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 \
>   --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 \
>   --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 \
>   --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 \
>   --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 \
>   --download-f-blas-lapack=1 --download-mumps=1 --download-blacs=1 \
>   --download-parmetis=1 --download-scalapack=1 --download-superlu=1 \
>   --with-debugging=0 --with-cc=\"mpcc_r -q64\" --with-fc=\"mpxlf_r -q64\" \
>   --with-batch=1 --with-shared-libraries=1 --known-mpi-shared-libraries=1 \
>   --with-x=0 --with-pic=1
> 
> The PETSc examples seem to work well. Note that PETSc is compiled as a C library; my application is C++ and links against the PETSc library.
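> For reference, a minimal C++ main that only initializes and finalizes
> PETSc (a sketch; built with mpCC_r against the same PETSc libraries)
> exercises exactly this C-library-from-C++ link:
> 
>   #include <petscsys.h>
> 
>   int main(int argc, char **argv)
>   {
>     /* PetscInitialize sets up PETSc (and MPI when run under POE) */
>     PetscErrorCode ierr = PetscInitialize(&argc, &argv, NULL, NULL);
>     if (ierr) return ierr;
>     /* If this prints, PETSc started and the basic link is fine */
>     PetscPrintf(PETSC_COMM_WORLD, "PETSc initialized from C++\n");
>     ierr = PetscFinalize();
>     return ierr;
>   }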
> 
> My code should be stable enough: it works well on Linux/Windows and has no memory problems (checked with valgrind).
> I guess some compile/link issue is causing the problem.
> Does anyone have suggestions?
> 
> Gong Ding
> 


