[petsc-users] Problems with preconditioners, which one?
Filippo Spiga
filippo.spiga at disco.unimib.it
Thu Aug 5 10:55:13 CDT 2010
Dear Jed, Hong, Barry and Matthew,
first of all, thank you very much for your help. I really appreciate it!
I tried all your suggestions and now I'm able to solve my problem. Good
news! The system reaches convergence using 'superlu_dist', 'umfpack'
and 'mumps'. However, I experienced some problems with umfpack and
mumps. In detail:
- umfpack: sometimes values like '-1.3363470411775628e-16' or
'1.1111231822598224e-15' appear instead of '0.0000000000000000e+00'
- mumps: for a large system (~30k x 30k matrix) it seems to get stuck
after a while. I used 8 processes. I cannot understand why this happens,
but I'm investigating...
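For reference, I select the external factorizations through the usual
PETSc options; a minimal sketch of the invocations (./solver is just a
placeholder for my executable, and the option names may differ between
PETSc versions; the packages must be enabled at configure time):

  ./solver -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package umfpack
  mpiexec -n 8 ./solver -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
  mpiexec -n 8 ./solver -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist

(umfpack is sequential, so it only runs on one process.)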
SuperLU_DIST seems to work perfectly, even if it is slower than the
other two. But I didn't do an exhaustive performance comparison. Now my
idea is to run a series of tests measuring accuracy and timing while
varying the dimension of the problem (I have different cubes to test),
ksp_atol and ksp_rtol. For all the simple cube cases I can construct
the analytical solution by myself, so I will compare the computed
solution with the "analytical" one. That's perfect for tuning the
solver, isn't it?
Once again, thank you very much for your support.
Regards
Jed Brown wrote:
> You can use mumps and superlu_dist in parallel. Or a domain
> decomposition method with direct subdomain solves, but then my earlier
> comment regarding being careful about the partition applies.
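If I try the domain decomposition route, I guess that corresponds to
options roughly like the following (ASM with a direct LU solve on each
subdomain; please correct me if that's not what you meant):

  mpiexec -n 8 ./solver -ksp_type gmres -pc_type asm -sub_ksp_type preonly -sub_pc_type lu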
--
Filippo SPIGA
«Nobody will drive us out of Cantor's paradise.»
-- David Hilbert