[petsc-users] Pseudoinverse of a large matrix

Jack Poulson jack.poulson at gmail.com
Mon Dec 19 13:50:28 CST 2011


Chetan,

I completely agree that (rank-revealing) QR should be the first choice. As
a side note, from what he has said, his matrix is actually dense.

If his matrix were sparse, I am not sure how much sparsity would be lost by
the column pivoting inherent in a rank-revealing QR. I know that the MUMPS
group is either working on or has finished a sparse QR, but I don't know
any details about their approach to pivoting (though I would be very
interested!). Hopefully it could simply reuse the threshold approach used
for sparse LU and LDL.
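
For the dense case, the QR-based route is just a linear least-squares solve
rather than an explicit pseudoinverse. As a rough serial sketch (assuming
LAPACKE is available and the subproblem fits in memory; zgelsy is the
column-pivoted, rank-revealing QR driver, and zgelsd is the SVD-based
alternative):

    #include <stdio.h>
    #include <stdlib.h>
    #include <complex.h>
    #include <lapacke.h>

    /* Minimal sketch: solve min ||A x - b||_2 for dense complex A (m x n,
     * m >= n) via column-pivoted, rank-revealing QR.  A and b are
     * overwritten; the solution ends up in b[0..n-1]. */
    int solve_ls_rrqr(lapack_int m, lapack_int n,
                      lapack_complex_double *A, lapack_complex_double *b)
    {
        lapack_int *jpvt = calloc(n, sizeof(lapack_int)); /* 0 = free column */
        lapack_int rank;
        double rcond = 1e-12;   /* relative threshold for the numerical rank */

        lapack_int info = LAPACKE_zgelsy(LAPACK_ROW_MAJOR, m, n, 1,
                                         A, n, b, 1, jpvt, rcond, &rank);
        if (info == 0)
            printf("estimated numerical rank: %d\n", (int) rank);
        free(jpvt);
        return (int) info;
    }

This is only meant to illustrate the QR route on an in-core subsample; the
full 324360-point problem would need a parallel equivalent (Elemental, or a
SLEPc SVD along the lines of the ex14 tutorial mentioned below).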

Jack

On Mon, Dec 19, 2011 at 1:38 PM, Chetan Jhurani <chetan.jhurani at gmail.com> wrote:

> > It can be further optimized using the Woodbury identity for two cases,
> > rank <= size or rank >= size, to reduce the size of the auxiliary
> > internal Cholesky solve.
>
> Sorry, that's wrong.  I meant rank <= size/2 or rank >= size/2.  Size here
> refers to the square matrix size, but this analysis can be done for the
> rectangular case too.
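>
> In symbols (just a sketch, assuming a Tikhonov-regularized pseudoinverse
> with a small parameter eps; the actual code may organize this
> differently), the push-through identity
>
>     (A^H A + eps*I_n)^{-1} A^H  =  A^H (A A^H + eps*I_m)^{-1}
>
> lets one factor whichever of the two Gram matrices is smaller, and with a
> rank-r factorization of A the Woodbury identity reduces the inner
> Cholesky solve to an r x r system.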
>
> Chetan
>
>
> From: Chetan Jhurani [mailto:chetan.jhurani at gmail.com]
> Sent: Monday, December 19, 2011 11:35 AM
> To: 'PETSc users list'
> Subject: RE: [petsc-users] Pseudoinverse of a large matrix
>
> A couple of optimizations are not there currently:
>
> 1. It can be changed to use a sparse RRQR factorization for a sparse
> input matrix.
>
> 2. It can be further optimized using the Woodbury identity for two cases,
> rank <= size or rank >= size, to reduce the size of the auxiliary
> internal Cholesky solve.
>
> Chetan
>
>
> From: petsc-users-bounces at mcs.anl.gov
> [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Modhurita Mitra
> Sent: Monday, December 19, 2011 11:01 AM
> To: PETSc users list
> Subject: Re: [petsc-users] Pseudoinverse of a large matrix
>
>
> I am trying to express the radiation pattern of an antenna in terms of
> spherical harmonic basis functions. I have radiation pattern data at
> N=324360 points. My matrix is therefore the spherical harmonic basis
> functions up to order sqrt(N) (which makes a total of N basis functions),
> evaluated at the N data points. So this is a linear least squares
> problem, and I have been trying to solve it by finding the pseudoinverse,
> which uses the SVD. The elements of the matrix are complex, and the
> matrix is dense. I have solved it in MATLAB using a subsample of the
> data, but MATLAB runs out of memory while calculating the SVD at input
> matrix size 2500 x 2500. I need to solve this problem using the entire
> data.
>
>
> I was thinking of using SLEPc because I found a ready-to-use code which
> computes the SVD of a complex-valued matrix
> (http://www.grycap.upv.es/slepc/documentation/current/src/svd/examples/tutorials/ex14.c.html).
> I don't know how to use either PETSc or SLEPc (or Elemental) yet, so I am
> trying to figure out where to start and what I should learn.
>
> Thanks,
> Modhurita
>
> On Mon, Dec 19, 2011 at 12:31 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
> On Mon, Dec 19, 2011 at 12:21 PM, Modhurita Mitra <modhurita at gmail.com>
> wrote:
>
> Hi,
>
> I have to compute the pseudoinverse of a 324360 X 324360 matrix. Can PETSc
> compute the SVD of this matrix without parallelization? If parallelization
> is needed, do I need to use SLEPc?
>
>
> With enough memory, yes. However, I am not sure you want to wait. I am
> not sure how SLEPc would help here.
>
> From the very, very little detail you have given, you would need parallel
> linear algebra, like Elemental. However, I would start out from a more
> fundamental viewpoint, such as replacing "compute the pseudoinverse" with
> "solve a least-squares problem" if that is indeed the case.
>
>
>    Matt
>
>
>
> Thanks,
> Modhurita
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>

