[petsc-dev] [GPU] SEGV with: mpiexec -np 2 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type bjacobi -sub_pc_type jacobi -mat_type aijcusp -vec_type cusp

Projet_TRIOU triou at cea.fr
Thu Jul 31 01:39:55 CDT 2014


Hello,

Another crash with ex2 (src/ksp/ksp/examples/tutorials), run in parallel on
the GPU while playing with preconditioners:

mpiexec -np 1 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type bjacobi -sub_pc_type jacobi
Norm of error 5.71895e-06 iterations 14
mpiexec -np 1 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type bjacobi -sub_pc_type jacobi -mat_type aijcusp -vec_type cusp
Norm of error 5.71895e-06 iterations 14
mpiexec -np 2 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type bjacobi -sub_pc_type jacobi
Norm of error 5.71895e-06 iterations 14
mpiexec -np 2 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type bjacobi -sub_pc_type jacobi -mat_type aijcusp -vec_type cusp
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------

Yet the last run should behave exactly like:
mpiexec -np 2 ./ex2 -ksp_type cg -m 10 -n 10 -pc_type jacobi -mat_type aijcusp -vec_type cusp
Norm of error 5.71895e-06 iterations 14

Or am I missing something?
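
For reference, the option set that crashes corresponds roughly to the
following programmatic setup (a minimal sketch, assuming the PETSc 3.5 C API;
it is not copied from ex2, and A, b, x stand for the already assembled Mat
and Vecs):

/* CG with block Jacobi and point Jacobi on each block, i.e. the
   programmatic equivalent of -ksp_type cg -pc_type bjacobi -sub_pc_type jacobi */
KSP            ksp, *subksp;
PC             pc, subpc;
PetscInt       nlocal, first, i;
PetscErrorCode ierr;

ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
ierr = KSPSetUp(ksp);CHKERRQ(ierr);                  /* needed before querying the sub-KSPs */
ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
for (i = 0; i < nlocal; i++) {                       /* -sub_pc_type jacobi on every local block */
  ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
  ierr = PCSetType(subpc, PCJACOBI);CHKERRQ(ierr);
}
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);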

Pierre
-- 
*Trio_U support team*
Marthe ROUX (01 69 08 00 02) Saclay
Pierre LEDAC (04 38 78 91 49) Grenoble

