[petsc-users] SuperLU_dist computes rubbish
Hong
hzhang at mcs.anl.gov
Sun Nov 15 20:53:29 CST 2015
Jan:
I can reproduce the reported behavior using
petsc/src/ksp/ksp/examples/tutorials/ex10.c with your mat.dat and rhs.dat.
Using PETSc's sequential LU with the default ordering 'nd', I get
./ex10 -f0 mat.dat -rhs rhs.dat -pc_type lu
Number of iterations = 0
Residual norm 0.0220971
Changing to
./ex10 -f0 mat.dat -rhs rhs.dat -pc_type lu -pc_factor_mat_ordering_type natural
Number of iterations = 1
Residual norm < 1.e-12
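In case it is easier to experiment from petsc4py directly, the same sequential test can be written roughly as follows (an untested sketch; the explicit residual check at the end is my own addition, not part of ex10 or your script):

    from petsc4py import PETSc

    # load the matrix and right-hand side from the PETSc binary files
    viewer = PETSc.Viewer().createBinary('mat.dat', 'r')
    A = PETSc.Mat().load(viewer)
    viewer = PETSc.Viewer().createBinary('rhs.dat', 'r')
    b = PETSc.Vec().load(viewer)

    # equivalent of -pc_type lu -pc_factor_mat_ordering_type natural
    opts = PETSc.Options()
    opts['pc_type'] = 'lu'
    opts['pc_factor_mat_ordering_type'] = 'natural'

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setFromOptions()
    x = b.duplicate()
    ksp.solve(b, x)

    # check the true residual ||b - A*x||
    r = b.duplicate()
    A.mult(x, r)          # r = A*x
    r.aypx(-1.0, b)       # r = b - A*x
    PETSc.Sys.Print('true residual norm = %g' % r.norm())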
Back to superlu_dist, I get
mpiexec -n 3 ./ex10 -f0 /homes/hzhang/tmp/mat.dat -rhs /homes/hzhang/tmp/rhs.dat -pc_type lu -pc_factor_mat_solver_package superlu_dist
Number of iterations = 4
Residual norm 25650.8
which uses the default orderings (as shown by -ksp_view):
Row permutation LargeDiag
Column permutation METIS_AT_PLUS_A
Run it with
mpiexec -n 3 ./ex10 -f0 mat.dat -rhs rhs.dat -pc_type lu -pc_factor_mat_solver_package superlu_dist -mat_superlu_dist_rowperm NATURAL -mat_superlu_dist_colperm NATURAL
Number of iterations = 1
Residual norm < 1.e-12
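If you want to try this workaround from your petsc4py script, the same options can be set through the options database before the KSP is configured (a sketch; I am assuming test-solve.py calls setFromOptions() on the KSP, and the option names are exactly the ones above):

    from petsc4py import PETSc

    opts = PETSc.Options()
    opts['pc_type'] = 'lu'
    opts['pc_factor_mat_solver_package'] = 'superlu_dist'
    # replace the default LargeDiag / METIS_AT_PLUS_A permutations
    opts['mat_superlu_dist_rowperm'] = 'NATURAL'
    opts['mat_superlu_dist_colperm'] = 'NATURAL'
    # ... then create the KSP and call ksp.setFromOptions() as before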
i.e., your problem is sensitive to the matrix ordering; I do not know why.
I checked the condition number of your mat.dat using SuperLU:
./ex10 -f0 mat.dat -rhs rhs.dat -pc_type lu -pc_factor_mat_solver_package superlu -mat_superlu_conditionnumber
Recip. condition number = 1.137938e-03
Number of iterations = 1
Residual norm < 1.e-12
As you can see, the matrix is well-conditioned. Why, then, is it so sensitive
to the matrix ordering?
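If you want to repeat the condition-number check from petsc4py, the same flag can be passed through the options database (again a sketch, assuming setFromOptions() is called):

    from petsc4py import PETSc

    opts = PETSc.Options()
    opts['pc_type'] = 'lu'
    opts['pc_factor_mat_solver_package'] = 'superlu'
    # print the reciprocal condition number during factorization
    opts['mat_superlu_conditionnumber'] = 'true'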
Hong
> Using the attached petsc4py code, matrix, and right-hand side, SuperLU_dist
> returns a totally wrong solution for a mixed Laplacian:
>
> $ tar -xzf report.tar.gz
> $ python test-solve.py -pc_factor_mat_solver_package mumps -ksp_final_residual
> KSP final norm of residual 3.81865e-15
> $ python test-solve.py -pc_factor_mat_solver_package umfpack -ksp_final_residual
> KSP final norm of residual 3.68546e-14
> $ python test-solve.py -pc_factor_mat_solver_package superlu_dist -ksp_final_residual
> KSP final norm of residual 1827.72
>
> Moreover, the final residual is random when run using mpirun -np 3. Maybe
> a memory corruption issue? This is reproducible with PETSc 3.6.2 (and
> SuperLU_dist configured by PETSc) as well as much older versions, see
> http://fenicsproject.org/pipermail/fenics-support/2014-March/000439.html
> but it has never been reported upstream.
>
> The code for assembling the matrix and rhs using FEniCS is also
> included for the sake of completeness.
>
> Jan
>