[petsc-dev] ML Reuse redux
Jed Brown
jedbrown at mcs.anl.gov
Mon Aug 29 20:27:02 CDT 2011
On Mon, Aug 29, 2011 at 14:30, John Fettig <john.fettig at gmail.com> wrote:
> Here's the patch. I left in the serial code that creates the matrices
> in PETSc rather than letting ML take care of it, just in case you
> wanted that for any reason.
>
Thanks. I pushed a modified version of this patch. Run with
-pc_ml_reuse_interpolation and SAME_NONZERO_PATTERN to activate. The savings
in setup time are sometimes significant, so perhaps we should make it the
default. We usually choose safe defaults, but wanting to reuse interpolants
might be sufficiently common to warrant it. Thoughts?
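For concreteness, the reuse path described above can be requested from the command line like so (a sketch; the executable name ./app is a placeholder, not from this thread):

```
# Hypothetical invocation: ask the ML preconditioner to keep its
# interpolation operators across successive PCSetUp calls.
./app -pc_type ml -pc_ml_reuse_interpolation
```

The SAME_NONZERO_PATTERN part is not a command-line option; it is the MatStructure flag the application passes to KSPSetOperators(), telling PETSc the matrix nonzero pattern has not changed since the last setup.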
> One thing that I notice is that when I
> use UMFPACK for the coarse grid solve, the preconditioner is different
> (e.g. minres takes more iterations) when I do:
>
>
> KSPSetOperators(gridctx[fine_level].ksp,gridctx[level].A,gridctx[fine_level].A,SAME_NONZERO_PATTERN);
>
> versus
>
>
> KSPSetOperators(gridctx[fine_level].ksp,gridctx[level].A,gridctx[fine_level].A,DIFFERENT_NONZERO_PATTERN);
>
> This shouldn't be the case, should it? It doesn't do this with the
> PETSc builtin lu, so I suspect a bug in the UMFPACK interface.
>
Not if the symbolic factorization is still valid (e.g. nothing caused the
nonzero pattern to change). Do you have a test case for this?
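In the 2011-era API the nonzero-pattern hint is the fourth argument to KSPSetOperators(), so the two quoted calls differ only in whether the symbolic factorization may be kept. A minimal sketch of the distinction (the names ksp and A are generic placeholders, not from the thread):

```c
/* Sketch, assuming the PETSc 3.x KSPSetOperators() signature that
 * takes a MatStructure argument.
 *
 * With SAME_NONZERO_PATTERN, a direct-solver PC (e.g. UMFPACK,
 * selected with -pc_factor_mat_solver_package umfpack) may keep its
 * symbolic factorization and redo only the numeric phase.  With
 * DIFFERENT_NONZERO_PATTERN, both phases are repeated from scratch.
 * Either way the resulting preconditioner should be identical, which
 * is why the iteration-count difference reported above suggests a bug
 * in the UMFPACK interface rather than expected behavior. */
KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);      /* reuse symbolic phase */
KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN); /* full refactorization */
```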