[petsc-users] Doubts related MatCreateDense()

Barry Smith bsmith at mcs.anl.gov
Fri Feb 12 09:09:16 CST 2016


> On Feb 12, 2016, at 8:56 AM, Kaushik Kulkarni <kaushikggg at gmail.com> wrote:
> 
> So Barry does it mean that currently PETSc does not support a parallel implementation of Dense Matrices.

   No, I did not say that. I said that the PETSc dense matrix format does not divide columns across processes: each process stores all the entries of a contiguous block of rows of the matrix.

   The MATELEMENTAL format does a so-called cyclic decomposition of the dense matrix which is good for parallel algorithms such as dense LU factorization. This format is unfortunately more cumbersome to work with.

  So depending on what you want to do with your matrix, you would pick one of the two formats.

  If, in fact, you do not have a matrix but merely a two- (or more) dimensional array (representing, for example, unknowns on a finite difference mesh, sometimes called field variables), then you should not be using Mat at all (that is for linear operators, not arrays); you should be using Vecs for your field variables. The DMDA routines (for example DMDACreate2d()) are useful utilities for creating Vecs that you want to treat as multidimensional arrays.

  Barry



> If it does, could you please provide a link where I could find proper documentation for the same.
> Thanks,
> Kaushik
> 
> On Fri, Feb 12, 2016 at 2:46 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
> > On Feb 12, 2016, at 3:10 AM, Kaushik Kulkarni <kaushikggg at gmail.com> wrote:
> >
> > Thanks Barry,
> > Just one more doubt: does it mean that PETSc divides the global matrix among the various processes based on rows, and no "ACTUAL" division of columns occurs?
> 
>   For the PETSc matrices yes. But when it uses external packages such as Elemental that is not the case. Depending on what you are doing with the dense matrices it may be better for you to use the MATELEMENTAL http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATELEMENTAL.html format.
> 
> 
> Barry
> 
> >
> > Kaushik
> > On Fri, Feb 12, 2016 at 2:28 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > > On Feb 12, 2016, at 1:51 AM, Kaushik Kulkarni <kaushikggg at gmail.com> wrote:
> > >
> > > Hi all,
> > >
> > > Could you help me with my doubts:
> > >
> > > Doubt 1: Initially I tried to create a matrix with MatCreateMPIDense(); I received a compilation error stating that no such function existed.
> >
> >   The name was changed to MatCreateDense().
> > >
> > > Doubt 2: So I continued working with MatCreateDense(). And I set the global size to 10 cross 10. Now when I called the function MatGetLocalSize(A,&localrow,&localcolumn), and ran the code with 2 processes the values returned were:
> > > The local matrix size for the process 1 is 5 cross 5
> > > The local matrix size for the process 2 is 5 cross 5
> > > How can it be possible that process 1 is only dealing with 25 elements and process two is dealing with 25 elements, while the global matrix contains 100 elements.
> >
> >   The reported local size for columns is slightly misleading. For standard PETSc matrix formats such as dense, each process stores all the entries for its rows of the matrix. The "local columns" value instead gives the local size of the vector one can use in a matrix-vector product with the matrix. See the users manual for more details on the layout of vectors and matrices in PETSc.
> >
> >   Barry
> >
> > >
> > > Thanks,
> > > Kaushik
> >
> >
> 
> 
