[petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues
Mark Adams
mfadams at lbl.gov
Wed Oct 5 10:05:18 CDT 2022
On Tue, Oct 4, 2022 at 5:20 PM feng wang <snailsoar at hotmail.com> wrote:
> Hi Mark,
>
> Thanks for your reply. Below is the output if I call KSPComputeEigenvalues
>
> 0.330475 -0.0485014
> 0.521211 0.417409
> 0.684726 -0.377126
> 0.885941 0.354342
> 0.957845 -0.0508471
> 0.964676 -0.241642
> 1.05921 0.0742963
> 1.82065 -0.0209096
>
> I have the following questions:
>
> - These eigenvalues are sorted according to their magnitudes, so
> "lowest" means smallest magnitude and "highest" means largest magnitude in
> your previous email?
>
Oh, I was talking about "extreme" eigenvalues (an option). This is all of
them.
>
> - I understand that if the preconditioner is perfect, all the
> eigenvalues should be (1,0). Since my preconditioner is not perfect, to
> understand its performance, is it correct to say that I need to keep an
> eye on the eigenvalues whose distance to (1,0) is the largest?
>
I'm not sure what you mean by "distance to (1,0)".
First, these are the eigenvalues of the system that the Krylov method projects
onto. They lie within the bounds of the true extreme eigenvalues, but they are
not eigenvalues of the actual preconditioned system.
I just look at the ratio of the largest to the smallest magnitude, i.e., the
condition number. This estimate converges to the true value from below.
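
As a rough sketch of what I mean (untested, and the variable names are only
illustrative; it assumes KSPSetComputeEigenvalues(ksp, PETSC_TRUE) was called
before KSPSolve):

  PetscInt   nits, neig, i;
  PetscReal *re, *im, lo = PETSC_MAX_REAL, hi = 0.0, mag;

  /* The projected system has at most one eigenvalue per iteration */
  PetscCall(KSPGetIterationNumber(ksp, &nits));
  PetscCall(PetscMalloc2(nits, &re, nits, &im));
  PetscCall(KSPComputeEigenvalues(ksp, nits, re, im, &neig));

  /* Condition number estimate: ratio of largest to smallest magnitude */
  for (i = 0; i < neig; i++) {
    mag = PetscSqrtReal(re[i] * re[i] + im[i] * im[i]);
    if (mag < lo) lo = mag;
    if (mag > hi) hi = mag;
  }
  PetscCall(PetscPrintf(PETSC_COMM_WORLD,
                        "condition number estimate (from below): %g\n",
                        (double)(hi / lo)));
  PetscCall(PetscFree2(re, im));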
>
> -
> - How does PETSc decide how many eigenvalues to output in
> KSPComputeEigenvalues?
>
It is all of them for the projected system, whose size is the number of
iterations.
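
Since the projected system is only built up during the solve, the eigenvalue
computation has to be switched on before KSPSolve. A minimal sketch of the
kind of setup you described (right-preconditioned GMRES with ASM); A, b, x
and the other names are just illustrative and assumed to exist:

  KSP ksp;
  PC  pc;

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPGMRES));
  PetscCall(KSPSetPCSide(ksp, PC_RIGHT));               /* right preconditioning */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));                      /* additive Schwarz */
  PetscCall(KSPSetComputeEigenvalues(ksp, PETSC_TRUE)); /* must precede the solve */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));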
>
> - I am solving a set of linear systems; sometimes
> KSPComputeEigenvalues outputs 8 eigenvalues and sometimes it outputs just
> 2.
> - In the output I showed above, are these only the eigenvalues with the
> smallest and the largest magnitudes, with everything in between ignored?
> If this is the case, which ones are the "lowest" and which ones are the
> "highest"?
>
>
These seem to be sorted. You can also ask for just the "extreme" eigenvalues
and get only the two values you need for the condition number estimate. That
is the most common use.
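
For the condition number estimate specifically, if I remember right there is
also KSPComputeExtremeSingularValues (with KSPSetComputeSingularValues turned
on before the solve), which returns only the largest and smallest singular
values of the preconditioned operator. Roughly:

  PetscReal emax, emin;

  /* Requires KSPSetComputeSingularValues(ksp, PETSC_TRUE) before KSPSolve */
  PetscCall(KSPComputeExtremeSingularValues(ksp, &emax, &emin));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "condition number estimate: %g\n",
                        (double)(emax / emin)));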
Mark
>
> -
>
> Thanks for your help and sorry for so many questions,
> Feng
>
>
>
>
> ------------------------------
> *From:* Mark Adams <mfadams at lbl.gov>
> *Sent:* 04 October 2022 17:18
> *To:* feng wang <snailsoar at hotmail.com>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] clarification on extreme eigenvalues from
> KSPComputeEigenvalues
>
> The extreme eigenvalues are the lowest and highest.
> A perfect preconditioner would give all eigenvalues = 1.0
>
> Mark
>
> On Tue, Oct 4, 2022 at 1:03 PM feng wang <snailsoar at hotmail.com> wrote:
>
> Dear All,
>
> I am using KSPComputeEigenvalues to understand the performance of my
> preconditioner, and I am using right-preconditioned GMRES with ASM. In
> the user guide, it says this routine computes the extreme eigenvalues of
> the preconditioned operator. If I understand it correctly, these
> eigenvalues are the ones furthest away from (1,0)? If the preconditioning
> is perfect, all the eigenvalues should be (1,0).
>
> Thanks,
> Feng
>
>