Sparse Matrix Inversion using PETSc

Hong Zhang hzhang at mcs.anl.gov
Wed Aug 15 09:54:48 CDT 2007


Tim,

As suggested by Aron, you should do the following:

1. MatLUFactorSymbolic(A,...,&Fact);   /* symbolic phase: compute the fill pattern of the factors */
   MatLUFactorNumeric(A,&Fact);        /* numeric phase: compute the actual factor entries */

2. for (i=0; i<N; i++){                /* reuse the factors for every right-hand side */
     MatSolve(Fact,rhs_vecs[i],sol_vecs[i]);
   }
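Put together, the whole sequence looks roughly like the sketch below. This is
only a sketch assuming a 2007-era (2.3.x) calling sequence: the argument lists
of MatGetOrdering, MatLUFactorSymbolic and MatLUFactorNumeric change between
releases, the nested-dissection ordering is just one choice, and the function
name SolveManyRHS is made up for illustration, so check the manual pages for
your version.

  #include "petscmat.h"

  /* Sketch only: factor A once, then reuse the factors for every
     right-hand side (e.g. the N columns of the identity matrix). */
  PetscErrorCode SolveManyRHS(Mat A, Vec *rhs_vecs, Vec *sol_vecs, PetscInt N)
  {
    Mat            Fact;               /* holds the factors            */
    IS             rowperm, colperm;   /* fill-reducing ordering       */
    MatFactorInfo  info;
    PetscInt       i;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
    ierr = MatGetOrdering(A,MATORDERING_ND,&rowperm,&colperm);CHKERRQ(ierr);
    ierr = MatLUFactorSymbolic(A,rowperm,colperm,&info,&Fact);CHKERRQ(ierr);
    ierr = MatLUFactorNumeric(A,&info,&Fact);CHKERRQ(ierr);
    for (i=0; i<N; i++) {              /* back-solve for each column   */
      ierr = MatSolve(Fact,rhs_vecs[i],sol_vecs[i]);CHKERRQ(ierr);
    }
    ierr = ISDestroy(rowperm);CHKERRQ(ierr);
    ierr = ISDestroy(colperm);CHKERRQ(ierr);
    ierr = MatDestroy(Fact);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }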

For Cholesky factorization, in which the factored matrix Fact is stored
in PETSc's SBAIJ format, we support MatSolves(). Thus you can call
   MatSolves(Fact,rhs_vecs,sol_vecs);
where rhs_vecs and sol_vecs are multivectors. See
http://www-unix.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatSolves.html

PETSc multivector (Vecs) - a collection of vectors whose data is stored
in one contiguous block of memory. It is a temporary construct for
handling multiple-right-hand-side solves.
We would like to add support for other types of multivectors, though.
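A rough sketch of the MatSolves() path, loosely following ex76.c, is below.
The Vecs type and the VecsCreateSeq()/VecsDuplicate()/VecsDestroy() helpers
live in a PETSc-internal header (vecimpl.h in 2007-era source trees), so treat
the names and arguments here as assumptions and ex76.c as the authoritative
reference; Fact is assumed to be a Cholesky factor in SBAIJ format, with n the
vector length and nrhs the number of right-hand sides.

  Vecs bb, xx;   /* multivectors: nrhs vectors stored contiguously */
  ierr = VecsCreateSeq(PETSC_COMM_SELF,n,nrhs,&bb);CHKERRQ(ierr);
  ierr = VecsDuplicate(bb,&xx);CHKERRQ(ierr);
  /* ... fill bb with the nrhs right-hand sides (e.g. columns of the identity) ... */
  ierr = MatSolves(Fact,bb,xx);CHKERRQ(ierr);   /* one call performs all nrhs solves */
  ierr = VecsDestroy(bb);CHKERRQ(ierr);
  ierr = VecsDestroy(xx);CHKERRQ(ierr);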

See ~petsc/src/mat/examples/tests/ex76.c. Other examples are
available under ~petsc/src/mat/examples/tests/.
Note: PETSc itself only supports sequential Cholesky/LU factorization.
For parallel LU, you must use SuperLU_DIST or MUMPS.
Simply run the same PETSc code with the runtime option
'-mat_type superlu_dist' or '-mat_type aijmumps'.
I would recommend starting from a PETSc example.
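For those runtime options to take effect, the matrix has to be created so
that its type is taken from the options database. A minimal sketch, assuming
the usual MatCreate/MatSetFromOptions creation path:

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);  /* picks up -mat_type superlu_dist, aijmumps, ... */
  /* ... then MatSetValues()/MatAssemblyBegin()/MatAssemblyEnd() as usual ... */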

Hong

On Wed, 15 Aug 2007, Aron Ahmadia wrote:

> Dear Tim,
>
> It is possible to carry out the explicit inversion of a sparse matrix
> using the PETSc framework with the methodology you outlined below.  I
> would encourage you to consider Cholesky/LU factorizations of the
> matrix, which occasionally yield sparse triangular factors whose solves
> are faster than an explicit inverse-matrix-vector multiply would be.
>
> As for the correct way to do this, I would start with the fastest
> methods for multiple right hand sides and reasonably sized matrices, a
> direct method such as LU.  I'm unaware of any functionality in PETSc
> for handling multiple right hand sides, but PETSc will keep the
> factorization from a previous direct solve, so A\b2 will be much
> faster than A\b1.  I think the best bet is a simple for loop over each
> of the right-hand-side vectors, assembling the inverse column by column.
>
> The PETSc developers may have some more thoughts on this.
>
> Good luck,
> ~Aron
>
> On 8/15/07, Dr. Timothy Stitt <timothy.stitt at ichec.ie> wrote:
> > Hi all,
> >
> > I am currently investigating the best way to perform the inversion of a
> > large sparse matrix and came upon the idea of using PETSc as a framework
> > for testing various strategies from direct to iterative methods on my
> > sample matrices. In this setup, for an NxN sparse matrix A, I would have N
> > right-hand sides representing the identity matrix and then solve for X. I wanted to
> > experiment with both parallel and serial strategies ranging from LU
> > Decomposition using SuperLU, MUMPS etc. to iterative methods using GMRES
> > etc. Am I right in thinking that all this can be done in PETSc by
> > setting up a core framework and then varying the solver methods etc?
> >
> > I have looked over the sample KSP solver codes, although they only seem
> > to use single vectors for x and b. Can this be changed to accept
> > multiple vectors? Can anyone suggest a sample code that
> > demonstrates the sort of thing I want to achieve... if it is in fact
> > possible?
> >
> > Thanks in advance for any assistance given,
> >
> > Regards,
> >
> > Tim.
> >
> >
>
>



