[petsc-users] MatMatSolve in sequential call

Hong Zhang hzhang at mcs.anl.gov
Tue Aug 21 11:14:23 CDT 2012


Alexander:

Testing your matrix A.dat
using petsc-3.3/src/ksp/ksp/examples/tutorials/ex10.c,
I find that A is numerically singular:

a) petsc sequential lu:
./ex10 -f0 A.dat -ksp_type preonly -pc_type lu -rhs 0
[0]PETSC ERROR: Detected zero pivot in LU factorization:
see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
[0]PETSC ERROR: Zero pivot row 48 value 0 tolerance 2.22045e-14!

b) mumps lu with np=8:
mpiexec -n 8 ./ex10 -f0 A.dat -ksp_type preonly -pc_type lu -rhs 0
-pc_factor_mat_solver_package mumps
...
[0]PETSC ERROR: Error in external library!
[0]PETSC ERROR: Error reported by MUMPS in numerical factorization phase:
INFO(1)=-1, INFO(2)=1
...

c) superlu_dist lu with np=8:
mpiexec -n 8 ./ex10 -f0 A.dat -ksp_type preonly -pc_type lu -rhs 0
-pc_factor_mat_solver_package superlu_dist
Number of iterations =   1
Residual norm 0.0103982

mumps cholesky with np=8:
mpiexec -n 8 ./ex10 -f0 A.dat -ksp_type preonly -pc_type cholesky -rhs 0
-pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 122744

With such large residual norms, the computed solution cannot be trusted :-(

d) using -ksp_type gmres with
   pc=ilu: zero pivot
   pc=jacobi: the iteration stagnates
   (equivalent command lines are sketched below)
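
For completeness, the gmres runs correspond to command lines of the
following form (a sketch only, following the pattern of a)-c) above; the
exact invocation for d) was not listed):

./ex10 -f0 A.dat -ksp_type gmres -pc_type ilu -rhs 0
./ex10 -f0 A.dat -ksp_type gmres -pc_type jacobi -rhs 0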

I'm convinced that the matrix A is corrupted or obtained from an incorrect
numerical model.
Also note that your matrix A has a size of approx. 60k, which is too large
for a sequential direct solver.

The error from MatMatSolve() when calling
'-pc_type cholesky -pc_factor_mat_solver_package mumps'
is a bug in our petsc-mumps interface. I've pushed a fix to petsc-3.3:
http://petsc.cs.iit.edu/petsc/releases/petsc-3.3/rev/8badc49a596e
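
In case it helps others hitting the same error, below is a minimal sketch of
the factor-interface calling sequence that MatMatSolve() with MUMPS Cholesky
exercises, in the spirit of ex125.c. This is not the attached test.c: the -f
option name, nrhs value, and the zero placeholder right-hand sides are
illustrative assumptions, A is assumed symmetric, and it is written against
the petsc-3.3 API used in this thread.

#include <petscmat.h>

int main(int argc,char **args)
{
  Mat            A,F,B,X;     /* system matrix, factor, RHS block, solution block */
  MatFactorInfo  info;
  IS             rperm,cperm;
  PetscViewer    fd;
  PetscInt       m,n,nrhs = 2;
  PetscBool      flg;
  char           file[PETSC_MAX_PATH_LEN];
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&args,(char*)0,(char*)0);CHKERRQ(ierr);

  /* Load A from a PETSc binary file given by -f (e.g. -f A.dat) */
  ierr = PetscOptionsGetString(NULL,"-f",file,PETSC_MAX_PATH_LEN,&flg);CHKERRQ(ierr);
  if (!flg) SETERRQ(PETSC_COMM_SELF,1,"Must indicate a binary matrix file with -f");
  ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_SELF,&A);CHKERRQ(ierr);
  ierr = MatSetType(A,MATSEQAIJ);CHKERRQ(ierr);
  ierr = MatLoad(A,fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
  ierr = MatGetSize(A,&m,&n);CHKERRQ(ierr);

  /* Dense right-hand-side and solution blocks with nrhs columns
     (placeholder zero RHS; fill B with real data as needed) */
  ierr = MatCreateSeqDense(PETSC_COMM_SELF,m,nrhs,NULL,&B);CHKERRQ(ierr);
  ierr = MatCreateSeqDense(PETSC_COMM_SELF,m,nrhs,NULL,&X);CHKERRQ(ierr);
  ierr = MatZeroEntries(B);CHKERRQ(ierr);

  /* Cholesky factorization through the MUMPS interface, then a multi-RHS solve */
  ierr = MatGetFactor(A,MATSOLVERMUMPS,MAT_FACTOR_CHOLESKY,&F);CHKERRQ(ierr);
  ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
  ierr = MatGetOrdering(A,MATORDERINGNATURAL,&rperm,&cperm);CHKERRQ(ierr);
  ierr = MatCholeskyFactorSymbolic(F,A,rperm,&info);CHKERRQ(ierr);
  ierr = MatCholeskyFactorNumeric(F,A,&info);CHKERRQ(ierr);
  ierr = MatMatSolve(F,B,X);CHKERRQ(ierr);

  /* Clean up */
  ierr = ISDestroy(&rperm);CHKERRQ(ierr);
  ierr = ISDestroy(&cperm);CHKERRQ(ierr);
  ierr = MatDestroy(&X);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = MatDestroy(&F);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}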

Thanks for your report.

Hong


> On 13.08.2012 19:01, Hong Zhang wrote:
>
>> Alexander :
>> I need a simple code from you to reproduce the error for debugging.
>>
>
> Hong,
>
> Attached you will find test.c that should reproduce this behavior with
> petsc-3.3-p2.
> Just run it with 1 and 2 procs as follows:
> ./mpirun -n 1 test -ksp_type preonly -pc_type cholesky
> -pc_factor_mat_solver_package mumps -ksp_view
>
> Here is the matrix I tested it on (15 MB): https://dl.dropbox.com/u/60982984/A.dat
>
>
>> If mumps' solver is activated, then you should get an error (sequential and
>> parallel):
>> [0]PETSC ERROR: No support for this operation for this object type!
>> [0]PETSC ERROR: MatMatSolve_MUMPS() is not implemented yet!
>>
>> If petsc's direct solver is activated, then you should not be able to run
>> it in parallel.
>>
>
> Please note that it has worked correctly for nproc > 1 for a year or so!
> So I don't quite understand why an error should be thrown.
> I just happened to test something in the sequential version and
> this issue appeared.
>
>
>
>> Please check ~petsc/src/mat/examples/tests/ex125.c
>> to see if you used the same calling procedure as this example.
>> Note: I fixed a bug in this example and patched petsc-3.3.
>> The updated copy of ex125.c is attached for your convenience.
>>
>> Hong
>>
>>
>
> --
> Regards,
> Alexander
>
>