[petsc-users] Assembling primal Schur matrix in FETI-DP method
Jed Brown
jedbrown at mcs.anl.gov
Mon Nov 21 09:48:47 CST 2011
On Mon, Nov 21, 2011 at 09:41, Thomas Witkowski <Thomas.Witkowski at tu-dresden.de> wrote:
> In my case the Schur complement should be quite sparse,
>
So semantically, your Kbb is a parallel block-diagonal matrix. In my
opinion, you don't actually want to store it that way because then you are
only allowed to solve with the whole thing, which would make the algorithm
more synchronous than necessary. So I would store each block in its own
matrix with its own local communicator (MPI_COMM_SELF, or, if you are being
more general, some suitable subcommunicator).
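A rough (untested) sketch of that setup, with illustrative names, using the
current KSPSetOperators signature: each process assembles its diagonal block
of Kbb on PETSC_COMM_SELF and attaches a direct solver to it.

  #include <petscksp.h>

  PetscErrorCode SetupLocalKbbSolver(PetscInt n, Mat *KbbBlock, KSP *kspBlock)
  {
    PC             pc;
    PetscErrorCode ierr;

    /* local diagonal block of Kbb; preallocate properly for your problem */
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 0, NULL, KbbBlock);CHKERRQ(ierr);
    /* ... assemble the local block of Kbb ... */

    ierr = KSPCreate(PETSC_COMM_SELF, kspBlock);CHKERRQ(ierr);
    ierr = KSPSetOperators(*kspBlock, *KbbBlock, *KbbBlock);CHKERRQ(ierr);
    ierr = KSPSetType(*kspBlock, KSPPREONLY);CHKERRQ(ierr); /* direct solve only */
    ierr = KSPGetPC(*kspBlock, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);               /* factored once, reused */
    ierr = KSPSetFromOptions(*kspBlock);CHKERRQ(ierr);
    return 0;
  }

Each process can then solve with its own block independently of the others.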
> so I want to build it explicitly. My main problem is still how to compute
>
> inverse(Kbb) * Kba
>
> Sorry for asking again, but none of the solutions seems satisfying.
> If I understood you (and Jed) right, there are two general ways: either I
> define inverse(Kbb) as a Mat object and use MatMatSolve, or via a KSP
> using KSPSolve. The first option seems fine, but one of you noted that
> it is not possible to reuse the LU factorization.
>
No, both ways reuse the LU factorization.
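In the Mat variant, the factorization is computed once and then applied to as
many right-hand sides as you like. An untested sketch, error checking
omitted and names illustrative:

  Mat           F;           /* holds the LU factors of Kbb (or one block of it) */
  IS            rperm, cperm;
  MatFactorInfo info;

  MatGetOrdering(Kbb, MATORDERINGND, &rperm, &cperm);
  MatFactorInfoInitialize(&info);
  MatGetFactor(Kbb, MATSOLVERPETSC, MAT_FACTOR_LU, &F);
  MatLUFactorSymbolic(F, Kbb, rperm, cperm, &info);
  MatLUFactorNumeric(F, Kbb, &info);

  /* one factorization, many solves; B and X must be MATDENSE */
  MatMatSolve(F, Kba_dense, X);   /* X = inverse(Kbb) * Kba               */
  MatMatSolve(F, B2, X2);         /* ... reused in a different context    */

The KSP variant behaves the same way: the first KSPSolve triggers the
factorization and every subsequent solve reuses it.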
> That would be a huge drawback, as I have to use inverse(Kbb) in different
> contexts. When defining inverse(Kbb) via a KSP, as I do at the moment (and
> yes, I want to use direct solvers only here), I must store Kba either
> column-wise or in a dense way. Neither is really feasible.
>
You extract the piece of Kba that is relevant to each piece of Kbb. This
will have only a few columns and is naturally stored columnwise (either as
an array of column vectors or as MATDENSE).
After solving with these blocks, you will have another tall skinny matrix
(either as Vecs or MATDENSE) corresponding to each block of Kbb. Now you
multiply by the appropriate blocks of Kab and put the (sparse, low-dimensional
per block) result back into a global sparse matrix (for the coarse problem).
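Put together, the per-block pipeline looks roughly like this (untested
sketch; KbaBlock, KabBlock, F, and Scoarse are illustrative names, with F
the factorization of this block of Kbb from above):

  Mat X, KabX;

  /* tall skinny dense solve: X = inverse(Kbb_block) * KbaBlock */
  MatDuplicate(KbaBlock, MAT_DO_NOT_COPY_VALUES, &X);
  MatMatSolve(F, KbaBlock, X);

  /* small product with the matching block of Kab; its size is the number
     of primal (coarse) unknowns touching this block, so it stays small */
  MatMatMult(KabBlock, X, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &KabX);

  /* scatter KabX into the global sparse coarse matrix Scoarse, e.g. with
     MatSetValues() using the block's global primal indices, then assemble */

Only the final assembly into Scoarse involves communication; everything
before it is local to each block.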