[petsc-dev] Multigrid is confusing

Jed Brown jedbrown at mcs.anl.gov
Thu May 24 14:30:11 CDT 2012


On Thu, May 24, 2012 at 2:16 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:

> Is Eisenstat really worth it?
>
> Given a memory-centric future, and that Jed, even now, is not seeing a win
> ... maybe with a regular grid and R/B ordering you can skip a whole pass
> through the data, but for AIJ (I assume that is what ex5 uses) lexicographic
> ordering is probably a better model, and it does not look like a big win to
> me.
>

Maybe Eisenstat should split the matrix storage similarly to the
factorization kernels?
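
(For reference, the saving being weighed here is Eisenstat's trick for SSOR,
stated below in one common textbook form; this is only a sketch, not
necessarily the exact formulation PCEISENSTAT implements. With the diagonally
scaled matrix Ahat = D^{-1/2} A D^{-1/2} = Lhat + I + Uhat and relaxation
parameter w,

  (I + w Lhat) + (I + w Uhat) = w Ahat + (2 - w) I,

so one application of the SSOR-preconditioned operator needs only the two
triangular solves

  t = (I + w Uhat)^{-1} x,
  (I + w Lhat)^{-1} Ahat (I + w Uhat)^{-1} x
      = (1/w) [ t + (I + w Lhat)^{-1} (x - (2 - w) t) ],

i.e. the multiply by A is absorbed into the solves, which is where the
roughly factor-of-two saving in matrix work over plain SSOR preconditioning
comes from.)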


>
> Mark
>
> On May 24, 2012, at 2:48 PM, Barry Smith wrote:
>
> >
> > On May 24, 2012, at 1:20 PM, Jed Brown wrote:
> >
> >> On Wed, May 23, 2012 at 2:52 PM, Jed Brown <jedbrown at mcs.anl.gov>
> wrote:
> >> On Wed, May 23, 2012 at 2:26 PM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> >>
> >> Note that you could use -pc_type eisenstat perhaps in this case
> instead. Might save lots of flops?  I've often wondered about doing Mark's
> favorite Chebyshev smoother with Eisenstat; it seems like it should be a good
> match.
> >>
> >> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> >> [0]PETSC ERROR: No support for this operation for this object type!
> >> [0]PETSC ERROR: Cannot have different mat and pmat!
> >>
> >> Also, I'm having trouble getting Eisenstat to be more than very
> marginally faster than SOR.
> >
> >    There is still a bug related to Eisenstat. If you are trying to use
> it with kspest, you are likely hitting the bug.
> >
> >    I will fix the bug when I have time.
> >
> >   Barry
> >
> >>
> >>
> >> I think we should later be getting the eigenvalue estimate by applying
> the preconditioned operator to a few random vectors, then orthogonalizing.
> The basic algorithm is to generate a random matrix X (say 5 or 10 columns),
> compute
> >>
> >> Y = (P^{-1} A)^q X
> >>
> >> where q is 1 or 2 or 3, then compute
> >>
> >> Q R = Y
> >>
> >> and compute the largest singular value of the small matrix R. The
> orthogonalization can be done in one reduction and all the MatMults can be
> done together. Whenever we manage to implement a MatMMult and PCMApply or
> whatever (names inspired by VecMDot), this will provide a very
> low-communication way to get the eigenvalue estimates.
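
A rough NumPy sketch of the linear algebra above, just to pin the steps down
(not PETSc code: the name estimate_sigma_max, the defaults k = 5 and q = 2,
and the block operator apply_PinvA are illustrative assumptions, as is the
closing comment about taking the q-th root; a real implementation would lean
on the batched MatMult/PCApply kernels mentioned above):

    import numpy as np

    def estimate_sigma_max(apply_PinvA, n, k=5, q=2, seed=0):
        # Randomized estimate following the outline above: probe with k random
        # vectors, apply the preconditioned operator q times, orthogonalize
        # once, and look at the small triangular factor.
        rng = np.random.default_rng(seed)
        X = rng.standard_normal((n, k))   # random matrix X with k columns
        Y = X
        for _ in range(q):                # Y = (P^{-1} A)^q X
            Y = apply_PinvA(Y)            # all k columns can share one MatMult/PCApply
        Q, R = np.linalg.qr(Y)            # one orthogonalization (one reduction in parallel)
        sigma = np.linalg.svd(R, compute_uv=False)[0]  # largest singular value of R
        # (presumably sigma**(1.0/q) would serve as the eigenvalue-magnitude
        # estimate when q > 1)
        return sigma

With dense test operators this could be exercised with, e.g.,
apply_PinvA = lambda X: Pinv @ (A @ X).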
> >>
> >>
> >> I want to turn off norms in Chebyshev by default (they are very
> wasteful), but how should I make -mg_levels_ksp_monitor turn them back on?
> I'm already tired of typing -mg_levels_ksp_norm_type unpreconditioned.
> >
> >
>
>

