[petsc-users] Problems with preconditioners, which one?

Jed Brown jed at 59A2.org
Thu Aug 5 11:11:22 CDT 2010


On Thu, 05 Aug 2010 11:55:13 -0400, Filippo Spiga <filippo.spiga at disco.unimib.it> wrote:
> - umfpack: sometimes values like '-1.3363470411775628e-16' or 
> '1.1111231822598224e-15' appear instead of '0.0000000000000000e+00'

This is the nature of floating point arithmetic.
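A minimal Python illustration (not from the original exchange) of why an exact zero rarely survives floating-point computation:

```python
# 0.1, 0.2, and 0.3 are not exactly representable in binary floating
# point, so a sum that is zero in exact arithmetic comes out as a
# tiny nonzero value on the order of machine epsilon.
x = 0.1 + 0.2 - 0.3
print(x)  # prints a value near 5.55e-17, not 0.0
```

Values like 1e-16 in a computed solution are rounding noise of exactly this kind, not a solver failure.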

> - mumps: for large system (~30k x 30k matrix) it seems to be stuck after 
> a while. I used 8 processors. I cannot understand why this happens but 
> I'm investigating...

Interesting, other people solving Stokes-like problems with MUMPS
occasionally experience this.  Let us (or the MUMPS developers) know if
you make progress.  It would also be good to have a modest size test
case that produces the hang on different machines.

> SuperLU_DIST seems to work perfectly even if it is slower than the other 
> two. But I didn't do an exhaustive performance comparison. Now my idea is 
> to do a lot of tests to measure the accuracy and the timing while varying the 
> dimension of the problem (I have different cubes to test), ksp_atol and 
> ksp_rtol.  For all the simple cube cases I am able to construct the 
> analytical solution by myself, so I will compare the computed one with the 
> "analytical" one. That's perfect for tuning the solver, isn't it?

That measure of accuracy includes discretization error which is likely
to be much larger than solve error (unless the solver fails, producing
an erroneous result).  You can check that the residual (e.g. computed by
SNES) actually gets very small relative to the applied forcing.  Your
goal should be an accuracy approaching epsilon * condition_number (with
the direct solver, or an iterative solver with very tight tolerances),
which I think is somewhere around 1e-12 for your problem and double
precision.
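A sketch of that check, using NumPy as a stand-in for the actual PETSc solve (the matrix, sizes, and tolerances here are illustrative assumptions, not the poster's problem):

```python
import numpy as np

# Manufacture a problem with a known solution, solve it directly,
# and compare the achieved error against the heuristic bound
# epsilon * condition_number discussed above.
n = 200
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
x_exact = np.ones(n)
b = A @ x_exact

x = np.linalg.solve(A, b)

# Relative residual: how well the solver satisfied the equations.
rel_residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
# Relative error: distance from the known solution.
rel_error = np.linalg.norm(x - x_exact) / np.linalg.norm(x_exact)
# Heuristic accuracy limit for a backward-stable direct solve.
bound = np.finfo(float).eps * np.linalg.cond(A)

print(f"residual {rel_residual:.2e}  error {rel_error:.2e}  bound {bound:.2e}")
```

The point of the comparison: the residual lands near machine epsilon, while the error is limited by the conditioning of the matrix, which is why comparing against an analytical solution measures discretization error plus conditioning rather than solver quality alone.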

Jed
