KSP different results with the default and direct LU
Barry Smith
bsmith at mcs.anl.gov
Thu Oct 22 08:36:11 CDT 2009
On Oct 22, 2009, at 6:08 AM, Umut Tabak wrote:
> Dear all,
>
> I am trying to get myself more acquainted with PETSc.
>
> I am trying to solve a linear equation system which is badly
> conditioned. If I use the approach given in ex1.c of the KSP section
> (by default, as I read in the manual, it uses restarted GMRES
> preconditioned with Jacobi; I am not very familiar with these
> numerical procedures in depth), I get some wrong results for the
> solution.
When testing, always run with -ksp_converged_reason or call
KSPGetConvergedReason() after KSPSolve() to determine whether PETSc
thinks it has actually solved the system. Also, since iterative
solvers only solve to some tolerance, the answer may not be "wrong";
it may just be accurate up to that tolerance, and with ill-conditioned
matrices a tight tolerance on the residual norm may still be a loose
tolerance on the norm of the error.
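
   For example, here is a minimal sketch (not from the original mail)
of checking the converged reason after the solve, assuming a KSP
object ksp and vectors b, x set up as in ex1.c:

    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
    if (reason < 0) {
      /* negative reasons mean the solve diverged or was stopped */
      ierr = PetscPrintf(PETSC_COMM_WORLD,
               "Solve failed, reason %d\n", (int)reason);CHKERRQ(ierr);
    }

You can also tighten the relative tolerance with -ksp_rtol <tol> and
watch the true residual with -ksp_monitor_true_residual to see how
far the iterative solve really gets.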
>
> I checked my results with Matlab. If I use the command line options
> suggested on page 68 of the manual to solve the system directly,
> -ksp_type preonly -pc_type lu, I get the right solution and it is
> also faster than the iterative solution used by default. As far as I
> can follow from the mailing list, iterative methods and
> preconditioners are problem dependent, so my question would be:
> should I find the right approach/solver combination by trial and
> error, using the different combinations outlined in the manual?
> (This can also depend on the problem types I would like to solve, I
> suppose; for the moment, they are quite badly conditioned.)
>
General purpose iterative solvers, like those in PETSc, used
willy-nilly for very ill-conditioned linear systems, are basically
worthless. You need to either stick to direct solvers or understand
the types of iterative solvers that are used in the field of expertise
for your class of problems. For example, if your problems come from
semiconductor simulations, then you need to understand the literature
on iterative solvers for semiconductors before proceeding. For
reasonably well-conditioned systems, where a variety of iterative
methods "just work", that is, converge ok, you can try them all to see
what is fastest on your machine; but for nasty matrices this "trial
and error" is a waste of time, because almost none of the iterative
solvers will even converge, those that do converge will not converge
reliably across problems, and they will be slower than direct solvers.
We've had good success with the MUMPS parallel direct solver and
recommend trying the other ones as well. If your goal is to run some
simulation (and not do "research" on iterative solvers for nasty
matrices) I would just determine the best direct solver and use it
(and live with the memory usage and time it requires).
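
   For instance (a sketch, assuming a PETSc build configured with
MUMPS; check the exact option names for your PETSc version), you can
switch to a MUMPS LU factorization entirely from the command line,
with no code changes:

    ./ex1 -ksp_type preonly -pc_type lu \
          -pc_factor_mat_solver_package mumps -ksp_converged_reason

The same mechanism lets you try other direct solver packages (e.g.
SuperLU_DIST) just by changing the package name.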
Barry
> Any pointers are appreciated.
>
> Best regards,
> Umut