[petsc-users] default orthogonalization in gmres
Jed Brown
jedbrown at mcs.anl.gov
Mon Jul 15 18:07:41 CDT 2013
Umut Tabak <u.tabak at tudelft.nl> writes:
> On 07/15/2013 11:57 PM, Jed Brown wrote:
>>
>> It's not a factor of 2, it's a factor of k where k is the size of the
>> subspace. Classical Gram-Schmidt needs one reduction per iteration
>> (normalization can be hidden), but modified needs k reductions.
> Dear Jed,
>
> Could you please explain a bit more on what you mean by
>
> + reduction
MPI_Allreduce, which is needed in parallel as part of computing a norm
or dot product.
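Concretely, a parallel dot product is local work plus a single
MPI_Allreduce. A minimal mpi4py sketch (illustrative only, not PETSc's
implementation):

import numpy as np
from mpi4py import MPI

def parallel_dot(x_local, y_local, comm=MPI.COMM_WORLD):
    # Each rank holds a slice of x and y; combine the local
    # contributions with one reduction (MPI_Allreduce).
    local = np.dot(x_local, y_local)
    return comm.allreduce(local, op=MPI.SUM)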
> + normalization can be hidden
Gram-Schmidt has a bunch of projections to make the vector orthogonal,
then normalization. The reduction needed for normalization is easy to
thread into the next iteration, so I'm ignoring it in this performance
model.
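To make the reduction counts concrete, here is a serial numpy sketch of
one orthogonalization step in both variants; the comments mark where a
parallel implementation would need an MPI_Allreduce. This illustrates
the performance model only, it is not PETSc's GMRES code.

import numpy as np

def cgs_step(Q, v):
    # Classical Gram-Schmidt: all k projection coefficients come from
    # one batched product, so in parallel they share ONE reduction.
    h = Q.T @ v          # k dot products at once -> one MPI_Allreduce
    return v - Q @ h, h

def mgs_step(Q, v):
    # Modified Gram-Schmidt: each dot product uses the already-updated
    # v, so the k reductions cannot be combined.
    k = Q.shape[1]
    h = np.empty(k)
    for j in range(k):
        h[j] = Q[:, j] @ v   # one dot product -> one MPI_Allreduce
        v = v - h[j] * Q[:, j]
    return v, h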
> On a problem that I am working on, cgs and mgs give subtly different results.
That's common. If it's a big difference, it usually means the system is
ill-conditioned and you should probably work on your preconditioner.
> I would like to learn more about these details.
>
> More specifically, I would like to A-orthonormalize a block of vectors,
> say for a block size of 4. However, I cannot form A explicitly because
> it becomes large and dense; its action is only available through a
> matrix-vector operation. For this reason, cgs and mgs behave slightly
> differently for me, which is the source of this discussion.
>
> Best,
> Umut
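For the block A-orthonormalization described above, note that the
A-inner products only need the action of A, never its entries. A
minimal sketch of classical Gram-Schmidt in the A-inner product
(apply_A is a hypothetical callback returning A @ x; this assumes A is
symmetric positive definite so the A-norm is well defined, and skips
the reorthogonalization a robust code would add):

import numpy as np

def a_orthonormalize(V, apply_A):
    # A-orthonormalize the columns of V so that Q.T @ (A @ Q) = I,
    # using exactly one matvec with A per block vector.
    Q, AQ = [], []
    for i in range(V.shape[1]):
        v = V[:, i].copy()
        Av = apply_A(v)            # the only place A appears
        if Q:
            Qm = np.column_stack(Q)
            AQm = np.column_stack(AQ)
            h = AQm.T @ v          # <q_j, v>_A = (A q_j)^T v, batched (cgs)
            v -= Qm @ h
            Av -= AQm @ h          # keep A*v consistent by linearity
        nrm = np.sqrt(v @ Av)      # A-norm of the deflated vector
        Q.append(v / nrm)
        AQ.append(Av / nrm)
    return np.column_stack(Q)

With a block size of 4 this costs four applications of A, consistent
with the matrix-free setting described above.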