[petsc-users] Computing part of the inverse of a large matrix

Matthew Knepley knepley at gmail.com
Tue Oct 1 03:09:25 CDT 2019


On Tue, Oct 1, 2019 at 4:07 AM Jan Grießer <griesser.jan at googlemail.com>
wrote:

> Hey Matt,
> Can you elaborate a little bit on your idea for calculating the inverse
> matrix?
>

It is exactly what you were doing before, except that you use a KSP with

  -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

and then call MatMatSolve on the identity matrix.
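
In petsc4py, a rough (untested) sketch of this could look as follows; it
assumes your petsc4py build wraps PC.setFactorSolverType(),
PC.getFactorMatrix() and Mat.matSolve(), and A is just a placeholder for
your assembled matrix:

  from petsc4py import PETSc

  ksp = PETSc.KSP().create(A.getComm())
  ksp.setOperators(A)
  ksp.setType(PETSc.KSP.Type.PREONLY)
  pc = ksp.getPC()
  pc.setType(PETSc.PC.Type.LU)
  pc.setFactorSolverType('mumps')  # needs --download-mumps --download-scalapack
  ksp.setFromOptions()             # the command-line options above also work
  ksp.setUp()                      # performs the parallel LU factorization

  # dense identity matrix with the same parallel layout as A
  Ident = PETSc.Mat().createDense(A.getSizes(), comm=A.getComm())
  Ident.setUp()
  rstart, rend = Ident.getOwnershipRange()
  for i in range(rstart, rend):
      Ident.setValue(i, i, 1.0)
  Ident.assemble()

  X = Ident.duplicate()            # X will hold inv(A), which is dense
  F = pc.getFactorMatrix()         # factored matrix held by the LU PC
  F.matSolve(Ident, X)             # solves A * X = Ident, i.e. X = inv(A)

For a matrix as large as yours the full inverse is enormous, so in practice
you would use only the columns of the identity you actually need, as Barry
suggests further down in the thread.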

  Thanks,

     Matt


> Greetings Jan
>
> On Mon, Sep 30, 2019 at 5:50 PM Matthew Knepley <
> knepley at gmail.com> wrote:
>
>> I think the easiest way to do it is to use a KSP configured for preonly
>> and LU. That will do the right thing in parallel.
>>
>>    Matt
>>
>> On Mon, Sep 30, 2019 at 11:47 AM Smith, Barry F. via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>>
>>>    The Python wrappers for PETSc may be missing some functionality; there
>>> is a manual process involved in creating new ones. You could poke around
>>> the petsc4py source and see how easy it would be to add the functionality
>>> that you need.
>>>
>>>
>>>
>>> > On Sep 30, 2019, at 10:13 AM, Jan Grießer <griesser.jan at googlemail.com>
>>> wrote:
>>> >
>>> > I configured PETSc with MUMPS and already tested it for the spectrum
>>> slicing method in slepc4py, but I have problems setting up the LU
>>> factorization in petsc4py, since I do not find the corresponding
>>> methods and commands in the source code. That is why I was wondering
>>> whether this is even possible in the Python version.
>>> >
>>> > On Mon, Sep 30, 2019 at 4:57 PM Smith, Barry F. <
>>> bsmith at mcs.anl.gov> wrote:
>>> >
>>> >   If you want a parallel LU (and hence the ability to build the
>>> inverse in parallel), you need to configure PETSc with --download-mumps
>>> --download-scalapack
>>> >
>>> >   Barry
>>> >
>>> >
>>> > > On Sep 30, 2019, at 9:44 AM, Jan Grießer <
>>> griesser.jan at googlemail.com> wrote:
>>> > >
>>> > > Is MatMumpsGetInverse also wrapped in the Python version,
>>> petsc4py? If yes, is there any example of using it?
>>> > > My other question is related to the LU factorization (
>>> https://www.mcs.anl.gov/petsc/documentation/faq.html#invertmatrix).
>>> > > Is the LU factorization only possible for sequential AIJ matrices?
>>> I read in the docs that this is the case for the ordering.
>>> > > After setting up my matrices A, B and x, I tried:
>>> > >  r, c = dynamical_matrix_nn.getOrdering("nd")
>>> > >  fac_dyn_matrix = dynamical_matrix_nn.factorLU(r,c)
>>> > >
>>> > > resulting in an error:
>>> > > [0] No support for this operation for this object type
>>> > > [0] Mat type mpiaij
>>> > >
>>> > > On Fri, Sep 27, 2019 at 4:26 PM Zhang, Hong <
>>> hzhang at mcs.anl.gov> wrote:
>>> > > See ~petsc/src/mat/examples/tests/ex214.c for how to compute selected
>>> entries of inv(A) using MUMPS.
>>> > > Hong
>>> > >
>>> > > On Fri, Sep 27, 2019 at 8:04 AM Smith, Barry F. via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> > >
>>> > > MatMumpsGetInverse() may be useful. Also, simply using MatMatSolve()
>>> with the first 1000 columns of the identity and "throwing away" the part
>>> you don't need may be the most effective approach.
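>>> > >
>>> > > In petsc4py, a rough (untested) sketch of this variant, assuming
>>> > > Mat.matSolve() is wrapped and F is the factored matrix from an LU
>>> > > preconditioner (e.g. F = pc.getFactorMatrix()), with A a placeholder
>>> > > for the assembled matrix:
>>> > >
>>> > >  from petsc4py import PETSc
>>> > >
>>> > >  ncols = 1000                 # how many columns of inv(A) are wanted
>>> > >  rows = A.getSizes()[0]       # (local, global) row layout of A
>>> > >
>>> > >  # dense matrix whose columns are the first ncols identity columns
>>> > >  B = PETSc.Mat().createDense((rows, (PETSc.DECIDE, ncols)),
>>> > >                              comm=A.getComm())
>>> > >  B.setUp()
>>> > >  rstart, rend = B.getOwnershipRange()
>>> > >  for i in range(rstart, min(rend, ncols)):
>>> > >      B.setValue(i, i, 1.0)
>>> > >  B.assemble()
>>> > >
>>> > >  X = B.duplicate()            # column j of X is column j of inv(A)
>>> > >  F.matSolve(B, X)             # solves A * X = B
>>> > >  # keep only the first ~1000 rows of X and discard the rest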
>>> > >
>>> > >    Barry
>>> > >
>>> > >
>>> > >
>>> > > > On Sep 27, 2019, at 3:34 AM, Jan Grießer via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> > > >
>>> > > > Hi all,
>>> > > > I am using petsc4py. I am dealing with rather large sparse
>>> matrices of up to 600k x 600k, and I am interested in calculating part of
>>> the inverse of the matrix (I know it will be a dense matrix). Due to the
>>> nature of my problem, I am only interested in approximately the first 1000
>>> rows and 1000 columns (i.e. a large block in the upper left of the matrix).
>>> Before I start to play around, I wanted to ask whether there is a clever
>>> way to tackle this kind of problem in PETSc in principle. For any input I
>>> would be very grateful!
>>> > > > Greetings Jan
>>> > >
>>> >
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/