<div dir="ltr"><div class="gmail_default" style="font-family:verdana,sans-serif">So Barry, does this mean that PETSc currently does not support a parallel implementation of dense matrices? If it does, could you please provide a link to the relevant documentation?<br></div><div class="gmail_default" style="font-family:verdana,sans-serif">Thanks,<br></div><div class="gmail_default" style="font-family:verdana,sans-serif">Kaushik<br></div><div class="gmail_default" style="font-family:verdana,sans-serif"><span class="im"></span></div><div class="gmail_extra"><br><div class="gmail_quote">On Fri, Feb 12, 2016 at 2:46 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class=""><br>
> On Feb 12, 2016, at 3:10 AM, Kaushik Kulkarni <<a href="mailto:kaushikggg@gmail.com">kaushikggg@gmail.com</a>> wrote:<br>
><br>
> Thanks Barry,<br>
> Just one more doubt: does this mean that PETSc divides the global matrix among the processes based on rows, and no "ACTUAL" division of columns occurs?<br>
<br>
</span> For the PETSc matrices, yes. But when PETSc uses external packages such as Elemental, that is not the case. Depending on what you are doing with the dense matrices, it may be better for you to use the MATELEMENTAL <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATELEMENTAL.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATELEMENTAL.html</a> format.<br>
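[Editor's note: if the application calls MatSetFromOptions() after MatCreateDense(), the matrix type can usually be switched to Elemental at runtime without code changes, assuming PETSc was configured with Elemental support (e.g. --download-elemental). A sketch of the invocation, with "./app" standing in for your executable:]

```sh
# Hypothetical application binary; -mat_type elemental selects the
# 2D block-cyclic Elemental distribution instead of PETSc's
# row-based dense layout.
mpiexec -n 2 ./app -mat_type elemental
```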
<span class="HOEnZb"><font color="#888888"><br>
<br>
Barry<br>
</font></span><div class="HOEnZb"><div class="h5"><br>
><br>
> Kaushik<br>
> On Fri, Feb 12, 2016 at 2:28 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
><br>
> > On Feb 12, 2016, at 1:51 AM, Kaushik Kulkarni <<a href="mailto:kaushikggg@gmail.com">kaushikggg@gmail.com</a>> wrote:<br>
> ><br>
> > Hi all,<br>
> ><br>
> > Could you help me with my doubts:<br>
> ><br>
> > Doubt 1: Initially I tried to create a matrix with MatCreateMPIDense(), but I received a compilation error stating that no such function exists.<br>
><br>
> The name was changed to MatCreateDense().<br>
> ><br>
> > Doubt 2: So I continued working with MatCreateDense(), and I set the global size to 10 x 10. When I called MatGetLocalSize(A,&localrow,&localcolumn) and ran the code with 2 processes, the values returned were:<br>
> > The local matrix size for process 1 is 5 x 5<br>
> > The local matrix size for process 2 is 5 x 5<br>
> > How can it be that process 1 is dealing with only 25 elements and process 2 is dealing with only 25 elements, while the global matrix contains 100 elements?<br>
><br>
> The local size for columns is slightly misleading. For standard PETSc matrix formats such as "dense", each process stores all the entries for its rows of the matrix. The term "local columns" refers to the rows of the vector that one can use in a matrix-vector product with the matrix. See the users manual for more details on the layout of vectors and matrices in PETSc.<br>
><br>
> Barry<br>
><br>
> ><br>
> > Thanks,<br>
> > Kaushik<br>
><br>
><br>
<br>
</div></div></blockquote></div><br></div></div>