[petsc-users] Fast LU solver for block diagonal matrix

Barry Smith bsmith at mcs.anl.gov
Wed Mar 18 15:04:10 CDT 2015


> On Mar 18, 2015, at 2:54 PM, Luc Berger-Vergiat <lb2653 at columbia.edu> wrote:
> 
> Hi Barry,
> I would like to compute S explicitly. I have a few good options for preconditioning S, but they are based on S itself rather than an approximation of it (i.e. I don't want to compute my preconditioner from an Sp different from S).
> 
> Also, Jee is obtained using MatGetSubMatrix(J, isrow_e, iscol_e), and J is an AIJ matrix, so I assume Jee is AIJ too.
> I could convert Jee to BAIJ to take advantage of the block structure, but from your conversation with Chung-Kan it sounds like that might require reassembling?
> In that case what about the following:
> MatCreateBAIJ(comm, 5, ..., &Jee_inv)
> for (i = 0; i < nblocks; i++)
> {
>     MatGetValues(Jee, 5, [5*i+0,5*i+1,5*i+2,5*i+3,5*i+4], 5, [5*i+0,5*i+1,5*i+2,5*i+3,5*i+4], block_values)
>     some_package_inverts( block_values )
>     MatSetValuesBlocked(Jee_inv, 1, [i], 1, [i], block_values, INSERT_VALUES)
> }
> MatAssemblyBegin(Jee_inv, MAT_FINAL_ASSEMBLY)
> MatAssemblyEnd(Jee_inv, MAT_FINAL_ASSEMBLY)
> With this I could then just go on and do matrix multiplications to finish my Schur complement calculations.
> Would this be decently efficient?

   This would be fine. Then you can use MatPtAP or MatRARt to compute the triple matrix product.

  Barry
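As a rough illustration, the loop Luc sketches above could be fleshed out as follows. This is only a sketch under assumptions not stated in the thread: every 5-by-5 block of Jee is owned entirely by one process, the local row range starts on a block boundary, and invert_5x5() is a stand-in for whatever routine ("some_package_inverts") inverts a dense row-major 5-by-5 block in place (one possible implementation is sketched further down in the thread).

    #include <petscmat.h>

    int invert_5x5(PetscScalar block[25]);   /* hypothetical dense 5x5 inverter */

    /* Build Jee_inv, a BAIJ(bs=5) matrix holding the inverse of the block
       diagonal matrix Jee, by inverting one 5x5 block at a time. */
    PetscErrorCode BuildJeeInverse(Mat Jee, MPI_Comm comm, Mat *Jee_inv)
    {
      PetscInt       rstart, rend, nlocal, i, j, rows[5];
      PetscScalar    block[25];
      PetscErrorCode ierr;

      ierr = MatGetOwnershipRange(Jee, &rstart, &rend);CHKERRQ(ierr);
      nlocal = rend - rstart;
      ierr = MatCreateBAIJ(comm, 5, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE,
                           1, NULL, 0, NULL, Jee_inv);CHKERRQ(ierr);
      for (i = rstart/5; i < rend/5; i++) {            /* loop over locally owned blocks */
        for (j = 0; j < 5; j++) rows[j] = 5*i + j;     /* point indices of block i */
        ierr = MatGetValues(Jee, 5, rows, 5, rows, block);CHKERRQ(ierr);
        if (invert_5x5(block)) SETERRQ(comm, PETSC_ERR_MAT_LU_ZRPVT, "Singular 5x5 block");
        /* MatSetValuesBlocked() takes the *block* index i, not the point indices */
        ierr = MatSetValuesBlocked(*Jee_inv, 1, &i, 1, &i, block, INSERT_VALUES);CHKERRQ(ierr);
      }
      ierr = MatAssemblyBegin(*Jee_inv, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(*Jee_inv, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      return 0;
    }

With Jee_inv assembled, the triple product Barry mentions could then be something like MatPtAP(Jee_inv, Jeo, MAT_INITIAL_MATRIX, 1.0, &T) when Joe is the transpose of Jeo, or MatMatMatMult(Joe, Jee_inv, Jeo, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &T) otherwise, followed by MatAXPY() to subtract T from a copy of Joo. These products are implemented most completely for AIJ matrices, so Jee_inv may first need a MatConvert() back to AIJ.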

> Best,
> Luc
> 
> On 03/18/2015 03:22 PM, Barry Smith wrote:
>>   Do you want to explicitly compute (as a matrix) S = Joo - Joe * inv(Jee) * Jeo, or do you want to just have an efficient computation of S y for any vector y?
>> 
>>   Here is some possibly useful information. If you create Jee as a BAIJ matrix of block size 5 and use MatILUFactor(), it will efficiently factor this matrix (each 5 by 5 block factorization is done with custom code), and you can then use MatSolve() efficiently with the result. (Note that internally, when factoring a BAIJ matrix, PETSc actually stores the inverses of the diagonal blocks, so in your case MatSolve() ends up doing little matrix-vector products and there are no triangular solves.)
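To make that factor-and-solve path concrete, here is a minimal sketch. Assumptions not in the message: Jee is a SeqBAIJ matrix with block size 5 (PETSc's built-in ILU is a sequential factorization; in parallel it is normally reached through block Jacobi), and a natural ordering is used since the matrix is block diagonal, which also makes ILU(0) an exact LU.

    #include <petscmat.h>

    /* Factor the block diagonal Jee in place and apply its inverse to a vector. */
    PetscErrorCode FactorAndSolveJee(Mat Jee, Vec b, Vec x)
    {
      IS             rowperm, colperm;
      MatFactorInfo  info;
      PetscErrorCode ierr;

      ierr = MatGetOrdering(Jee, MATORDERINGNATURAL, &rowperm, &colperm);CHKERRQ(ierr);
      ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
      info.levels = 0;                                 /* ILU(0): exact here, Jee is block diagonal */
      info.fill   = 1.0;
      ierr = MatILUFactor(Jee, rowperm, colperm, &info);CHKERRQ(ierr); /* in-place block factorization */
      ierr = MatSolve(Jee, b, x);CHKERRQ(ierr);        /* 5x5 block solves, no global triangular sweep */
      ierr = ISDestroy(&rowperm);CHKERRQ(ierr);
      ierr = ISDestroy(&colperm);CHKERRQ(ierr);
      return 0;
    }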
>> 
>>   To use this with the MatCreateSchurComplement() object you can do
>> 
>>    MatCreateSchurComplement(...,&S)
>>    MatSchurComplementGetKSP(S,&ksp)
>>    KSPSetType(ksp,KSPPREONLY);
>> 
>>    now MatMult(S,y,z) will be efficient. 
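Spelled out a little further, that setup could look like the sketch below, where Jee, Jeo, Joe, Joo are the assembled sub-blocks and y, z are already-created Vecs of compatible sizes; the PCILU line only makes the sequential default explicit, and with a BAIJ(bs=5) Jee it is the exact block LU described above.

    Mat            S;
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    /* S applies Joo - Joe * inv(Jee) * Jeo matrix-free; the inner KSP solves with Jee */
    ierr = MatCreateSchurComplement(Jee, Jee, Jeo, Joe, Joo, &S);CHKERRQ(ierr);
    ierr = MatSchurComplementGetKSP(S, &ksp);CHKERRQ(ierr);
    ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);  /* one preconditioner application, no iteration */
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCILU);CHKERRQ(ierr);         /* exact block LU for a block diagonal BAIJ Jee */
    /* each MatMult(S, y, z) now costs a MatMult with Jeo, a block solve with Jee,
       a MatMult with Joe, and a MatMult with Joo */
    ierr = MatMult(S, y, z);CHKERRQ(ierr);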
>> 
>>    Of course you still have the question of how you plan to solve systems with S. That depends on its structure and on whether you have a good way of preconditioning it.
>> 
>>    If you want to explicitly form S you can use MatMatSolve(fact, Jeo), but this requires making Jeo dense, which seems to defeat the purpose.
>> 
>>   Barry
>> 
>> 
>> 
>> 
>>> On Mar 18, 2015, at 1:41 PM, Luc Berger-Vergiat <lb2653 at columbia.edu> wrote:
>>> 
>>> Hi all,
>>> I am solving a multi-physics problem that leads to a Jacobian of the form:
>>> 
>>> [ Jee  Jeo ]
>>> [ Joe  Joo ]
>>> 
>>> where Jee is 5-by-5 block diagonal. This feature makes it a very good candidate for a Schur complement.
>>> Indeed, Jee could be inverted in parallel with no inter-node communication.
>>> My only issue is that the Schur complement is not accessible directly in PETSc; only an approximation is available for direct solvers (usually S ~ Joo or S ~ Joo - Joe * diag(Jee)^-1 * Jeo).
>>> 
>>> Any advice on how I could efficiently compute Jee^-1 given this structure?
>>> I am currently thinking about hard-coding the formula for the inverse of a 5-by-5 block and sweeping through Jee (with some threading), storing the inverse in place. Instead of hard-coding the 5-by-5 formula I could also use MatLUFactorSymbolic on a 5-by-5 matrix, but that would not give me an inverse, only a factorization...
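As an aside on the hard-coded route: rather than expanding the full cofactor formula, a small Gauss-Jordan sweep with partial pivoting is enough for a 5-by-5 block. The routine below is a self-contained sketch of that idea (plain C apart from the PETSc scalar type) and could play the role of the invert_5x5() placeholder in the earlier sketch; whether it beats PETSc's own BAIJ factorization kernels is worth timing before committing to it.

    #include <petscsys.h>

    /* In-place Gauss-Jordan inversion of a row-major 5x5 block with partial
       pivoting.  Returns 1 if a pivot is exactly zero, 0 otherwise. */
    int invert_5x5(PetscScalar a[25])
    {
      PetscScalar inv[25] = {0.0};
      PetscScalar t, d, m;
      int         i, j, k, piv;

      for (i = 0; i < 5; i++) inv[5*i+i] = 1.0;            /* right-hand side starts as I */
      for (k = 0; k < 5; k++) {
        piv = k;                                           /* largest pivot in column k */
        for (i = k+1; i < 5; i++)
          if (PetscAbsScalar(a[5*i+k]) > PetscAbsScalar(a[5*piv+k])) piv = i;
        if (PetscAbsScalar(a[5*piv+k]) == 0.0) return 1;   /* singular block */
        if (piv != k)
          for (j = 0; j < 5; j++) {                        /* swap rows k and piv */
            t = a[5*k+j];   a[5*k+j]   = a[5*piv+j];   a[5*piv+j]   = t;
            t = inv[5*k+j]; inv[5*k+j] = inv[5*piv+j]; inv[5*piv+j] = t;
          }
        d = a[5*k+k];
        for (j = 0; j < 5; j++) { a[5*k+j] /= d; inv[5*k+j] /= d; }   /* scale pivot row */
        for (i = 0; i < 5; i++) {                          /* eliminate column k elsewhere */
          if (i == k) continue;
          m = a[5*i+k];
          for (j = 0; j < 5; j++) { a[5*i+j] -= m*a[5*k+j]; inv[5*i+j] -= m*inv[5*k+j]; }
        }
      }
      for (i = 0; i < 25; i++) a[i] = inv[i];              /* overwrite block with its inverse */
      return 0;
    }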
>>> 
>>> Thanks in advance for your suggestions!
>>> 
>>> -- 
>>> Best,
>>> Luc
>>> 
>>> 
>>> 
> 


