[petsc-users] general problem with application based on PETSc

Karl Rupp rupp at mcs.anl.gov
Sun Jan 19 08:10:02 CST 2014


Hi,

ah, sorry, the matrix view was okay; I was still confused from a recent 
debugging session. If you want to verify that the distribution is okay, 
check the values returned by MatGetOwnershipRange(), which gives the 
first row 'owned' by the respective process and one past the last. In 
your example the first MPI rank should report the range [0,2), i.e. 
rows {0,1}, and the second the range [2,4), i.e. rows {2,3}.
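
If it helps, here is a minimal (untested) sketch of such a check. It 
assumes A is your assembled 4x4 matrix; the helper name is made up:

   #include <petscmat.h>

   /* Print the locally owned row range of A on every rank
      (hypothetical helper; A is assumed to be assembled). */
   PetscErrorCode CheckOwnership(Mat A)
   {
     PetscErrorCode ierr;
     PetscInt       rstart, rend;
     PetscMPIInt    rank;

     PetscFunctionBeginUser;
     ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
     /* rstart is the first locally owned row, rend is one past the last */
     ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
     /* On 2 ranks with a 4x4 matrix this prints rows 0..1 and 2..3 */
     ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
              "[%d] owns rows %D to %D\n",rank,rstart,rend-1);CHKERRQ(ierr);
     ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }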


> Example:
> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex11.c.html
> Run it with
>   mpirun -np 2 ./Release/spin-scenarios-petsc  -n 2
> These are the results:
> 1+1i
> 1+1i
> Norm of error 1.19381e-16 iterations 4
> Vector Object: 2 MPI processes
>    type: mpi
> Process [0]
> 4.16334e-17
> 0 + 1.38778e-17 i
> Process [1]
> 0
> 1.11022e-16
> Matrix Object: 1 MPI processes
>    type: mpiaij
> row 0: (0, -7.11111 + 0.0800036 i) (1, -1)  (2, -1)
> row 1: (0, -1)  (1, -7.11111 + 0.00111359 i) (3, -1)
> row 2: (0, -1)  (2, -7.11111 + 0.0601271 i) (3, -1)
> row 3: (1, -1)  (2, -1)  (3, -7.11111 + 0.0943018 i)
>
> I am not sure about it, but the results are the same as what I got
> before with the C++ class.

The output looks okay to me - sorry for the confusion.

> By the way, I install PETSc using
> --with-mpi-dir=/usr/local --with-scalar-type=complex
> --with-clanguage=cxx --with-fortran-kernels=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2013_sp1.0.080/mkl
> --with-debugging=no

Please perform *all* development using --with-debugging=yes and only 
switch to an optimized build once everything is working. Chances are 
high that this will save you a lot of headaches :-)
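
For example, reusing the options you quoted above with only the 
debugging flag flipped:

   ./configure --with-mpi-dir=/usr/local --with-scalar-type=complex \
     --with-clanguage=cxx --with-fortran-kernels=1 \
     --with-blas-lapack-dir=/opt/intel/composer_xe_2013_sp1.0.080/mkl \
     --with-debugging=yes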

Best regards,
Karli
