[petsc-users] Dense Matrix setting local rows / local columns Question

Barry Smith bsmith at mcs.anl.gov
Thu Oct 17 10:22:52 CDT 2013


  James,

   PETSc always distributes its MPIDense matrices this way. The "local column" size does not indicate which columns are stored locally; yes, the terminology we use is a bit confusing. Just let PETSc pick the "number of local columns" while you provide the global number of columns. In the actual storage of the dense matrix the "number of local columns" is ignored: each process stores all of the columns for the rows it owns.
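
   For example, a minimal sketch of that setup (not from the original thread and untested; error checking omitted, and the sizes here are placeholders), using MatCreateDense with PETSC_DECIDE for the local sizes:

      #include <petscmat.h>

      int main(int argc, char **argv)
      {
        Mat          A;
        PetscInt     N = 1000000, p = 4;   /* placeholder sizes, N >> p */
        PetscInt     mlocal;
        PetscScalar *a;

        PetscInitialize(&argc, &argv, NULL, NULL);

        /* Give the global sizes N and p and let PETSc choose the local
           row and column sizes; the rows get split across the processes. */
        MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, p, NULL, &A);

        /* Each process holds its mlocal rows with all p columns, stored
           column-major; by default the leading dimension is mlocal, so this
           is the raw local block a LAPACK-style routine can operate on. */
        MatGetLocalSize(A, &mlocal, NULL);
        MatDenseGetArray(A, &a);
        /* ... fill the mlocal x p array a here ... */
        MatDenseRestoreArray(A, &a);

        MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

        MatDestroy(&A);
        PetscFinalize();
        return 0;
      }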

    Barry

On Oct 17, 2013, at 10:02 AM, James A Charles <charlesj at purdue.edu> wrote:

> Hello,
> 
> I'm trying to use a dense parallel matrix (no sparsity structure) that is rectangular, of size N x p where N >> p. p is typically anywhere from 2 to 6, and N can be 1e6 or more. 
> 
> For this I would like to distribute only the rows across the MPI processes and not have the columns distributed at all (local column size = global column size). What is the best way to do this in PETSc? If it matters, I am using PETSc 3.4. 
> 
> The operations performed after the matrix allocation are:
> 
> a direct call to LAPACK for QR factorization via a pointer to the array;
> 
> a matrix multiply with an N x N matrix. 
> 
> I have this working in serial, but when I try to distribute my matrices I get columns of size p*(number of processes). 
> 
> Thanks,
> James
> 
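
For the multiply with the N x N matrix mentioned above, one hedged possibility, assuming the N x N operator is a sparse MPIAIJ matrix B and A is the N x p MPIDense matrix from the sketch earlier in this message (MatMatMult handles this combination, provided it is supported for these types in your PETSc version, and returns another row-distributed dense matrix):

    #include <petscmat.h>

    /* C = B*A, where B is N x N (MPIAIJ, assumed) and A is N x p (MPIDense);
       the result C is an N x p MPIDense matrix with the row layout of B. */
    PetscErrorCode ApplyOperator(Mat B, Mat A, Mat *C)
    {
      PetscErrorCode ierr;
      ierr = MatMatMult(B, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
      return 0;
    }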


