<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Oct 23, 2020, at 5:05 AM, Marcel Huysegoms <<a href="mailto:m.huysegoms@fz-juelich.de" class="">m.huysegoms@fz-juelich.de</a>> wrote:</div><br class="Apple-interchange-newline"><div class="">
<meta http-equiv="Content-Type" content="text/html;
charset=windows-1252" class="">
<div class="">
Hi Barry,<br class="">
<br class="">
Many thanks for your explanation and suggestion! I have a much
better understanding of the problem now.<br class="">
<br class="">
For some reason, I wasn't aware that permuting A by P leads to a <i class="">symmetric</i>
reordering of A'A.<br class="">
I looked up the paper by Tim Davis that describes their
reordering approach ("SuiteSparseQR: multifrontal multithreaded
rank-revealing sparse QR factorization"), and as you expected, they
perform the column ordering of A using a permutation matrix P
obtained from an ordering of A'A. However, they use the reordered
matrix AP to perform a QR decomposition rather than to build a
preconditioner, as I intend to do.<br class="">
<br class="">
All in all, I will definitely try your suggested approach, which
SuiteSparseQR more or less also uses.<br class="">
<br class="">
However, I have (more or less) <u class="">one remaining question</u>:<br class="">
<br class="">
When a column reordering matrix P is computed from A'A and
applied to A (giving AP), the resulting normal equation
is P'(A'A)P, as you pointed out. But P was originally
computed so that (A'A)P would be diagonally dominant, not
P'(A'A)P. So won't the additional effect of P' (i.e. the row
reordering) compromise the diagonal structure again?<br class=""></div></div></blockquote><div><br class=""></div> I don't know anything about this. My feeling was that since A'A is always symmetric one would use a symmetric reordering on it, not a one-sided non-symmetric reordering.</div><div><br class=""></div><div> The RCM ordering has a reputation for bringing off-diagonal entries closer to the diagonal. Hence if you reorder with RCM and then use block Jacobi, in theory, there will be "better" blocks on the diagonal than in the original ordering. I would try that first.</div><div><br class=""></div><div><br class=""><blockquote type="cite" class=""><div class=""><div class="">
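A small sketch of the effect described here: SciPy's reverse_cuthill_mckee applied to a random symmetric sparsity pattern (a stand-in for A'A; the matrix contents and sizes below are made up purely for illustration) typically pulls the nonzeros toward the diagonal, i.e. it shrinks the bandwidth:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(M):
    # Largest |i - j| over the nonzeros of M.
    coo = sp.coo_matrix(M)
    return int(np.abs(coo.row - coo.col).max())

n = 200
# Random symmetric sparsity pattern with a full diagonal,
# standing in for a normal-equation matrix A'A.
R = sp.random(n, n, density=0.02, random_state=0, format="csr")
S = ((R + R.T) + sp.eye(n)).tocsr()

perm = reverse_cuthill_mckee(S, symmetric_mode=True)
S_rcm = S[perm, :][:, perm]   # symmetric reordering P' S P

print(bandwidth(S), "->", bandwidth(S_rcm))
```

In PETSc itself the same ordering is available through MatGetOrdering with the MATORDERINGRCM type, applied to the (square) matrix A'A.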
<br class="">
I am using the KSP in the following way:<br class="">
<pre style="background-color: rgb(255, 255, 255); font-family: Menlo;" class="">ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
ksp.setType(<span style="color:#008080;font-weight:bold;" class="">"lsqr"</span>)
pc = ksp.getPC()
pc.setType(<span style="color:#008080;font-weight:bold;" class="">"bjacobi"</span>)
ksp.setOperators(A, AtA)  # AtA holds the explicitly formed A' A
ksp.solve(b, x)</pre>
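The identity at the heart of this exchange, that a column permutation of A acts as a symmetric permutation of A'A, can be checked in a few lines of NumPy (sizes made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 25, 5                      # tall system, like the (5N, N) case
A = rng.standard_normal((m, n))

perm = rng.permutation(n)          # some column ordering
P = np.eye(n)[:, perm]             # permutation matrix: A @ P reorders columns of A

AtA = A.T @ A
AP = A @ P

# (AP)'(AP) == P'(A'A)P: the one-sided column permutation of A
# becomes a two-sided (symmetric) permutation of the normal equations.
assert np.allclose(AP.T @ AP, P.T @ AtA @ P)
```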
The paper you referenced seems very interesting to me. So I wonder:
if I had a good <i class="">non-symmetric</i> ordering of A'A, i.e. Q(A'A)P,
and passed this matrix to setOperators() as the second argument
for the preconditioner (while using AP as the first argument), what
would happen internally? Does BJACOBI compute a preconditioner matrix
M^(-1) for Q(A'A)P and pass this M^(-1) to LSQR, which applies it to
AP [yielding M^(-1)AP] before performing its iterative CG method on
this preconditioned system? In that case, could I perform the
computation of M^(-1) outside of ksp.solve(), so that I could apply
it myself to AP <u class="">and</u> b (!!), passing M^(-1)AP and M^(-1)b
to ksp.setOperators() and ksp.solve()?<br class="">
<br class="">
Maybe my question stems from a missing piece of mathematical
understanding. Does the matrix used to build the preconditioner
(the second argument to setOperators()) have to be exactly the
normal-equation matrix (A'A) of the first argument in order to make
mathematical sense? I could not find any reference for why this is done/works.<br class=""></div></div></blockquote><div><br class=""></div> No, you can pass any matrix you want as the "normal equation" matrix to LSQR, because it only builds the preconditioner from it. The matrix-vector products that define the problem are supplied by the other argument. Heuristically you want some matrix B that is "close" to A'A in some measure. The simplest thing would be to just remove some terms away from the diagonal of A'A. Which terms to remove, etc., is unknown to me. There are many games one can play, but I don't know which ones would be good for your problem.</div><div><br class=""></div><div> Barry</div><div><br class=""><blockquote type="cite" class=""><div class=""><div class="">
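One concrete instance of the "remove terms away from the diagonal" idea is a banded approximation of A'A. A NumPy sketch (the bandwidth k and the sizes are arbitrary illustration choices, not recommendations):

```python
import numpy as np

def banded_approx(M, k):
    # Keep only entries with |i - j| <= k; zero out everything else.
    i, j = np.indices(M.shape)
    return np.where(np.abs(i - j) <= k, M, 0.0)

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 8))
AtA = A.T @ A                 # true normal-equation matrix

# A cheaper, sparser stand-in for A'A that could be passed as the
# second argument to setOperators() to build the preconditioner.
B = banded_approx(AtA, k=2)

# B keeps the diagonal (and near-diagonal) of A'A exactly and stays symmetric.
assert np.allclose(np.diag(B), np.diag(AtA))
assert np.allclose(B, B.T)
```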
<br class="">
Thank you very much in advance for taking the time for this topic! I
really appreciate it.<br class="">
<br class="">
Marcel<br class="">
<br class="">
<br class="">
<div class="moz-cite-prefix">On 22.10.20 at 16:34, Barry Smith wrote:<br class="">
</div>
<blockquote type="cite" cite="mid:9B822030-7E72-4C6A-9669-6AA82AFB0B95@petsc.dev" class="">
<meta http-equiv="Content-Type" content="text/html;
charset=windows-1252" class="">
<div class=""> Marcel,</div>
<div class=""><br class="">
</div>
Would you like to do the following? Compute
<div class=""><br class="">
</div>
<div class=""> Q A P, where Q is a row permutation and P a column
permutation, and then apply LSQR to QAP? </div>
<div class=""> </div>
<div class=""><br class="">
</div>
<div class=""> From the manual page: </div>
<div class=""><br class="">
</div>
<div class="">In exact arithmetic the LSQR method (with no
preconditioning) is identical to the KSPCG algorithm applied to
the normal equations.</div>
<div class=""><br class="">
</div>
<div class=""> [Q A P]' [Q A P] = P' A' Q' Q A P = P'(A'A)P; the Q
drops out because a permutation matrix's transpose is its
inverse (Q'Q = I)</div>
<div class=""><br class="">
</div>
<div class=""> Note that P is a square matrix of the smaller dimension (N by N, while Q is 5N by 5N). </div>
<div class=""><br class="">
</div>
<div class=""> So my conclusion is that any column permutation of
A is also a symmetric permutation of A'A, so you can just try
regular reorderings of A'A if you want to "concentrate" the
"important" parts of A'A into your "block diagonal" preconditioner
(and throw away the other parts).</div>
<div class=""><br class="">
</div>
<div class=""> I don't know what it will do to the convergence.
I've never had much luck generically trying to symmetrically
reorder matrices to improve preconditioners, but for certain
situations it might help. For example, if the matrix is
[0 1; 1 0] and you permute its rows you get [1 0; 0 1],
which looks better.</div>
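The 2x2 example spelled out in NumPy; note that it is the one-sided (row) permutation that produces the identity, while the symmetric permutation of this particular matrix leaves it unchanged:

```python
import numpy as np

M = np.array([[0., 1.],
              [1., 0.]])
Q = np.array([[0., 1.],
              [1., 0.]])   # permutation that swaps the two rows

# One-sided (row) permutation moves the entries onto the diagonal...
assert np.allclose(Q @ M, np.eye(2))
# ...while the symmetric permutation Q' M Q leaves this matrix unchanged.
assert np.allclose(Q.T @ M @ Q, M)
```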
<div class=""><br class="">
</div>
<div class=""> There is this <a href="https://epubs.siam.org/doi/10.1137/S1064827599361308" class="" moz-do-not-send="true">https://epubs.siam.org/doi/10.1137/S1064827599361308</a>,
but it is for non-symmetric permutations, and in your case if you
use a non-symmetric permutation you can no longer use LSQR. </div>
<div class=""><br class="">
</div>
<div class=""> Barry</div>
<div class=""><br class="">
</div>
<div class=""><br class="">
</div>
<div class="">
<div class=""><br class="">
</div>
<div class=""><br class="">
<blockquote type="cite" class="">
<div class="">On Oct 22, 2020, at 4:55 AM, Matthew Knepley
<<a href="mailto:knepley@gmail.com" class="" moz-do-not-send="true">knepley@gmail.com</a>> wrote:</div>
<br class="Apple-interchange-newline">
<div class="">
<div dir="ltr" class="">
<div dir="ltr" class="">On Thu, Oct 22, 2020 at 4:24 AM
Marcel Huysegoms <<a href="mailto:m.huysegoms@fz-juelich.de" class="" moz-do-not-send="true">m.huysegoms@fz-juelich.de</a>>
wrote:<br class="">
</div>
<div class="gmail_quote">
<blockquote class="gmail_quote" style="margin:0px 0px
0px 0.8ex;border-left:1px solid
rgb(204,204,204);padding-left:1ex">Hi all,<br class="">
<br class="">
I'm currently implementing a Gauss-Newton approach
for minimizing a<br class="">
non-linear cost function using PETSc4py.<br class="">
The (rectangular) linear systems I am trying to
solve have dimensions of<br class="">
about (5N, N), where N is in the range of several
hundred millions.<br class="">
<br class="">
Due to its size and because it's an over-determined
system, I use LSQR<br class="">
in conjunction with a preconditioner (which operates
on A^T x A, e.g.<br class="">
BJacobi).<br class="">
Depending on the ordering of the unknowns the
algorithm only converges<br class="">
for special cases. When I use a direct LU solver (as
preconditioner) it consistently converges, but consumes too much
memory. I have read in the manual that the LU solver internally
also applies a matrix reordering beforehand.<br class="">
<br class="">
My question would be:<br class="">
How can I improve the ordering of the unknowns for a
rectangular matrix<br class="">
(so that it also converges with iterative
preconditioners)? If I use
MatGetOrdering(), it only works for square matrices.
Is there a way to<br class="">
achieve this from within PETSc4py?<br class="">
ParMETIS seems to be a promising framework for that
task. Is it possible<br class="">
to apply its reordering algorithm to a rectangular
PETSc-matrix?<br class="">
<br class="">
I would be thankful for every bit of advice that
might help.<br class="">
</blockquote>
<div class=""><br class="">
</div>
<div class="">We do not have any rectangular
reordering algorithms. I think your first step is to</div>
<div class="">find something in the literature that
you think will work.</div>
<div class=""><br class="">
</div>
<div class=""> Thanks,</div>
<div class=""><br class="">
</div>
<div class=""> Matt</div>
<div class=""> </div>
<blockquote class="gmail_quote" style="margin:0px 0px
0px 0.8ex;border-left:1px solid
rgb(204,204,204);padding-left:1ex">
Best regards,<br class="">
Marcel<br class="">
<br class="">
<br class="">
------------------------------------------------------------------------------------------------<br class="">
Forschungszentrum Juelich GmbH<br class="">
52425 Juelich<br class="">
Sitz der Gesellschaft: Juelich<br class="">
Eingetragen im Handelsregister des Amtsgerichts
Dueren Nr. HR B 3498<br class="">
Vorsitzender des Aufsichtsrats: MinDir Volker Rieke<br class="">
Geschaeftsfuehrung: Prof. Dr.-Ing. Wolfgang
Marquardt (Vorsitzender),<br class="">
Karsten Beneke (stellv. Vorsitzender), Prof.
Dr.-Ing. Harald Bolt<br class="">
------------------------------------------------------------------------------------------------<br class="">
<br class="">
</blockquote>
</div>
<br class="" clear="all">
<div class=""><br class="">
</div>
-- <br class="">
<div dir="ltr" class="gmail_signature">
<div dir="ltr" class="">
<div class="">
<div dir="ltr" class="">
<div class="">
<div dir="ltr" class="">
<div class="">What most experimenters take
for granted before they begin their
experiments is infinitely more interesting
than any results to which their
experiments lead.<br class="">
-- Norbert Wiener</div>
<div class=""><br class="">
</div>
<div class=""><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank" class="" moz-do-not-send="true">https://www.cse.buffalo.edu/~knepley/</a><br class="">
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</blockquote>
</div>
<br class="">
</div>
</blockquote>
<br class="">
</div>
</div></blockquote></div><br class=""></body></html>