[petsc-users] Dense Matrix setting local rows / local columns Question

Matthew Knepley knepley at gmail.com
Thu Oct 17 12:22:36 CDT 2013


On Thu, Oct 17, 2013 at 10:02 AM, James A Charles <charlesj at purdue.edu> wrote:

> Hello,
>
> I'm trying to use a parallel dense matrix (no sparsity to exploit) that
> is rectangular of size N x p, where N >> p. Typically p is anywhere from
> 2 to 6, and N can be 1e6 or more.
>
> For this I would like to distribute only the rows across the MPI
> processes and not distribute the columns at all (local column size =
> global column size). What is the best way to do this in PETSc? If it
> matters, I am using PETSc 3.4.
>

We do not distribute columns. You could if you used the Elemental
implementation, but you do not need that.
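
A minimal sketch of the creation call (sizes and variable names are
placeholders; assumes PETSc 3.4): pass the global column count p and let
PETSc decide the local sizes. For dense matrices the "local column size"
only describes the layout of the vectors the matrix acts on, not a storage
split; every process stores all p columns of its local rows. Passing p as
the local column size on each rank is exactly what produces a global
column count of p * nprocs.

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    PetscInt       N = 1000000, p = 4;          /* placeholder sizes */
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
    /* m = n = PETSC_DECIDE: PETSc splits the N rows across processes;
       each process still holds the full p columns of its local rows. */
    ierr = MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                          N, p, NULL, &A);CHKERRQ(ierr);
    /* ... fill via MatSetValues() or MatDenseGetArray(), then assemble ... */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }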


> The operations performed after the Matrix allocation are:
>
> direct call to Lapack for QR factorization via pointer to array.
>

You really want Tall-Skinny QR here (TSQR). We have not implemented it, but
it is not hard, so if
you would like to contribute it, that would be great.
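
In case it is useful as a starting point, below is an untested sketch of
the R-factor half of TSQR: QR-factor each local block, gather the small
p x p R factors on one rank, and QR-factor the stack. The helper name
TallSkinnyQR_R, the rank-0 gather, and the fixed workspace size are my own
choices (PETSc has nothing like this built in); it assumes every process
owns at least p rows, that the local leading dimension equals the local
row count (true for matrices created as above), and it overwrites the
local array of A with the Householder factors from LAPACK's geqrf.

  #include <petscmat.h>
  #include <petscblaslapack.h>

  /* Hypothetical helper: on return, R (p*p, column-major) holds the
     global R factor on rank 0.  The contents of A are destroyed. */
  PetscErrorCode TallSkinnyQR_R(Mat A, PetscScalar *R)
  {
    MPI_Comm       comm;
    PetscMPIInt    rank, size;
    PetscInt       m, p, i, j, r;
    PetscScalar    *a, *Rloc, *gath = NULL, *S = NULL, *tau, *work;
    PetscBLASInt   bm, bp, ldS, lwork, info;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = PetscObjectGetComm((PetscObject)A, &comm);CHKERRQ(ierr);
    ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
    ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr);
    ierr = MatGetLocalSize(A, &m, NULL);CHKERRQ(ierr); /* local rows, assume m >= p */
    ierr = MatGetSize(A, NULL, &p);CHKERRQ(ierr);      /* global columns */

    ierr = PetscBLASIntCast(m, &bm);CHKERRQ(ierr);
    ierr = PetscBLASIntCast(p, &bp);CHKERRQ(ierr);
    lwork = 64 * bp;            /* crude workspace; query LAPACK in real code */
    ierr = PetscMalloc(lwork * sizeof(PetscScalar), &work);CHKERRQ(ierr);
    ierr = PetscMalloc(p * sizeof(PetscScalar), &tau);CHKERRQ(ierr);
    ierr = PetscMalloc(p * p * sizeof(PetscScalar), &Rloc);CHKERRQ(ierr);

    /* Step 1: in-place QR of the local m x p block (column-major, lda = m). */
    ierr = MatDenseGetArray(A, &a);CHKERRQ(ierr);
    LAPACKgeqrf_(&bm, &bp, a, &bm, tau, work, &lwork, &info);
    if (info) SETERRQ1(comm, PETSC_ERR_LIB, "geqrf failed, info %d", (int)info);
    for (j = 0; j < p; j++)     /* copy out the local R (upper triangle) */
      for (i = 0; i < p; i++) Rloc[i + j*p] = (i <= j) ? a[i + j*m] : 0.0;
    ierr = MatDenseRestoreArray(A, &a);CHKERRQ(ierr);

    /* Step 2: gather the p x p R factors and QR the (size*p) x p stack. */
    if (!rank) {
      ierr = PetscMalloc(size * p * p * sizeof(PetscScalar), &gath);CHKERRQ(ierr);
      ierr = PetscMalloc(size * p * p * sizeof(PetscScalar), &S);CHKERRQ(ierr);
    }
    ierr = MPI_Gather(Rloc, (PetscMPIInt)(p*p), MPIU_SCALAR,
                      gath, (PetscMPIInt)(p*p), MPIU_SCALAR, 0, comm);CHKERRQ(ierr);
    if (!rank) {
      ldS = (PetscBLASInt)(size * p);
      for (r = 0; r < size; r++)   /* restack the blocks column-major */
        for (j = 0; j < p; j++)
          for (i = 0; i < p; i++) S[r*p + i + j*ldS] = gath[r*p*p + j*p + i];
      LAPACKgeqrf_(&ldS, &bp, S, &ldS, tau, work, &lwork, &info);
      if (info) SETERRQ1(comm, PETSC_ERR_LIB, "geqrf failed, info %d", (int)info);
      for (j = 0; j < p; j++)      /* global R, unique up to column signs */
        for (i = 0; i < p; i++) R[i + j*p] = (i <= j) ? S[i + j*ldS] : 0.0;
      ierr = PetscFree(gath);CHKERRQ(ierr);
      ierr = PetscFree(S);CHKERRQ(ierr);
    }
    ierr = PetscFree(Rloc);CHKERRQ(ierr);
    ierr = PetscFree(tau);CHKERRQ(ierr);
    ierr = PetscFree(work);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }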


> Matrix multiply with an NxN matrix.
>

This works.
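
For reference, if K is your parallel N x N matrix (a placeholder name,
say a MATAIJ) and A is the dense N x p matrix, the product is a single
call:

  Mat C;
  ierr = MatMatMult(K, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);
  /* C is dense, N x p, with the same row distribution as K and A */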

   Matt


> I have this working in serial but when I try to distribute my matrices I
> get columns with size p*(number of processes).
>
> Thanks,
> James


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener