[petsc-users] MatOrdering for rectangular matrix
Marcel Huysegoms
m.huysegoms at fz-juelich.de
Fri Oct 23 05:05:13 CDT 2020
Hi Barry,
many thanks for your explanation and suggestion!! I have a much better
understanding of the problem now.
For some reason, I wasn't aware that permuting A by P leads to a
/symmetric/ reordering of A'A.
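(Writing it out for myself, with ' denoting the transpose: since P is a
permutation matrix, P'P = I, and therefore

    (AP)'(AP) = P'A'AP = P'(A'A)P,

i.e. a column permutation of A is exactly a symmetric permutation of A'A.)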
I searched for the paper by Tim Davis that describes their reordering
approach ("SuiteSparseQR: multifrontal multithreaded rank-revealing
sparse QR factorization"), and as you expected, they perform the column
ordering of A using a permutation matrix P obtained from an ordering of
A'A. However, they use the reordered matrix AP to perform a QR
decomposition, rather than as the basis of a preconditioner as I intend
to do.
All in all, I will definitely try the approach you suggest, which
SuiteSparseQR essentially also uses; a rough sketch of what I have in
mind follows below.
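Concretely, I was planning something along these lines. This is only a
rough, untested sketch of my understanding: I assume A is the assembled
rectangular Mat, "rcm" is just a placeholder for whatever ordering turns
out to work, and I have not yet checked the exact permutation conventions
of getOrdering()/permute() or their behaviour in parallel:

from petsc4py import PETSc

# normal-equations matrix (square, n x n); used only to compute the ordering
AtA = A.transposeMatMult(A)

# symmetric ordering of A'A ("rcm" chosen here just as a placeholder)
rp, cp = AtA.getOrdering("rcm")

# identity permutation for the rows of A (the rows stay in place)
rstart, rend = A.getOwnershipRange()
rows = PETSc.IS().createStride(rend - rstart, first=rstart, step=1,
                               comm=A.getComm())

AP = A.permute(rows, cp)        # column-permuted rectangular operator A*P
PtAtAP = AtA.permute(cp, cp)    # symmetrically permuted normal matrix P'(A'A)P

ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
ksp.setType("lsqr")
ksp.getPC().setType("bjacobi")
ksp.setOperators(AP, PtAtAP)
# (the solution of this system comes back in the permuted ordering and
#  would have to be mapped back through P afterwards)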
However, I still have _one remaining question_:
When I compute a column reordering matrix P based on A'A and apply it to
A (giving AP), the corresponding normal equations become P'(A'A)P, as you
pointed out. But P was originally computed so that (A'A)P is diagonally
dominant, not P'(A'A)P. So won't the additional effect of P' (i.e. the
row reordering) compromise the diagonal structure again?
I am using the KSP in the following way:

from petsc4py import PETSc

AtA = A.transposeMatMult(A)     # A'A, formed beforehand (MatTransposeMatMult)
ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
ksp.setType("lsqr")
pc = ksp.getPC()
pc.setType("bjacobi")
ksp.setOperators(A, AtA)        # second operator is used to build the PC
ksp.solve(b, x)
The paper you referenced seems very interesting to me. So I wonder: if I
had a good /non-symmetric/ ordering of A'A, i.e. Q(A'A)P, and passed this
matrix to setOperators() as the second argument for the preconditioner
(while using AP as the first argument), what would happen internally?
Does BJACOBI compute a preconditioner matrix M^(-1) for Q(A'A)P and pass
this M^(-1) to LSQR, which applies it to AP [yielding M^(-1)AP] before
running its iterative CG-like method on this preconditioned system? In
that case, could I compute M^(-1) outside of ksp.solve(), so that I could
apply it myself to AP _and_ b (!!), i.e. pass M^(-1)AP and M^(-1)b to
ksp.setOperators() and ksp.solve()?
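To make explicit what I have in mind (this is purely my own mental model;
I have not verified how KSPLSQR actually uses the preconditioner
internally): with M ~ Q(A'A)P from BJACOBI, I imagine the CG part
effectively working on

    M^(-1) (AP)'(AP) x = M^(-1) (AP)' b,

and my question is whether I could instead form M^(-1)(AP) and M^(-1)b
myself and hand those to a plain, unpreconditioned LSQR.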
Maybe my question comes down to one missing piece of mathematical
understanding: does the matrix used to build the preconditioner (the
second argument to setOperators()) have to be exactly the normal-equations
matrix A'A of the first argument in order to make sense mathematically? I
could not find a reference explaining why this is done and why it works.
Thank you very much in advance for taking the time for this topic! I
really appreciate it.
Marcel
On 22.10.20 at 16:34, Barry Smith wrote:
> Marcel,
>
> Would you like to do the following? Compute
>
> Q A P where Q is a row permutation, P a column permutation and
> then apply LSQR on QAP?
>
> From the manual page:
>
> In exact arithmetic the LSQR method (with no preconditioning) is
> identical to the KSPCG algorithm applied to the normal equations.
>
> [Q A P]' [Q A P] = P' A' A P = P'(A'A) P; the Q drops out because a
> permutation matrix's transpose is its inverse.
>
> Note that P is a small square matrix.
>
> So my conclusion is that any column permutation of A is also a
> symmetric permutation of A'A, so you can just try regular reorderings
> of A'A if you want to "concentrate" the "important" parts of A'A into
> your "block diagonal" preconditioner (and throw away the other parts).
>
> I don't know what it will do to the convergence. I've never had much
> luck generically trying to symmetrically reorder matrices to improve
> preconditioners, but for certain situations it might help. For example,
> if the matrix is [0 1; 1 0] and you permute it, you get [1 0; 0 1],
> which looks better.
>
> There is this: https://epubs.siam.org/doi/10.1137/S1064827599361308
> but it is for non-symmetric permutations, and in your case, if you use
> a non-symmetric permutation, you can no longer use LSQR.
>
> Barry
>
>
>
>
>> On Oct 22, 2020, at 4:55 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>> On Thu, Oct 22, 2020 at 4:24 AM Marcel Huysegoms
>> <m.huysegoms at fz-juelich.de> wrote:
>>
>> Hi all,
>>
>> I'm currently implementing a Gauss-Newton approach for minimizing a
>> non-linear cost function using PETSc4py.
>> The (rectangular) linear systems I am trying to solve have dimensions
>> of about (5N, N), where N is in the range of several hundred million.
>>
>> Due to its size, and because it's an over-determined system, I use LSQR
>> in conjunction with a preconditioner (which operates on A^T A, e.g.
>> BJacobi).
>> Depending on the ordering of the unknowns, the algorithm only converges
>> in special cases. When I use a direct LU solver (as preconditioner), it
>> consistently converges, but consumes too much memory. I have read in the
>> manual that the LU solver internally also applies a matrix reordering
>> beforehand.
>>
>> My question would be:
>> How can I improve the ordering of the unknowns for a rectangular matrix
>> (so that the solver also converges with iterative preconditioners)? If I
>> use MatGetOrdering(), it only works for square matrices. Is there a way
>> to achieve this from within PETSc4py?
>> ParMETIS seems to be a promising framework for that task. Is it possible
>> to apply its reordering algorithm to a rectangular PETSc matrix?
>>
>> I would be thankful for every bit of advice that might help.
>>
>>
>> We do not have any rectangular reordering algorithms. I think your first
>> step is to find something in the literature that you think will work.
>>
>> Thanks,
>>
>> Matt
>>
>> Best regards,
>> Marcel
>>
>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which
>> their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>