<div dir="ltr"><div><div><div><div><div><div>Please check out the manual page for MatSetSizes()<br> <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html</a><br>
<br></div>Basically, you have two choices:<br></div><br>1/ Define the global size of the matrix and use PETSC_DECIDE for the local sizes.<br></div>In this case, PETSc will choose the local row sizes so that each process owns approximately the same number of rows.<br>
</div><br>2/ Define the local sizes yourself and use PETSC_DETERMINE for the global size.<br>Then you have full control over the parallel layout.<br><br></div>The functions described by the following pages<br><br> <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSize.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSize.html</a><br>
<a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetLocalSize.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetLocalSize.html</a><br> <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html</a><br>
</div> <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRangesColumn.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRangesColumn.html</a><br><br>
might also be useful for double-checking what the matrix decomposition looks like.<br><div><div><br><div><div><div><div><br></div><div>Cheers,<br></div><div> Dave<br></div><div><br></div></div></div></div></div></div>
</div><div class="gmail_extra"><br><br><div class="gmail_quote">On 8 January 2014 12:26, mary sweat <span dir="ltr"><<a href="mailto:mary.sweat78@yahoo.it" target="_blank">mary.sweat78@yahoo.it</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div style="font-size:12pt;font-family:HelveticaNeue,Helvetica Neue,Helvetica,Arial,Lucida Grande,sans-serif"><div>
My target is the following. I have a huge linear system with a huge sparse matrix, nothing to do with PDEs. How is the system split between processes? Is the answer in the suggested book?</div><div>Thanks again</div>
<div style="display:block"> <br> <br> <div style="font-family:HelveticaNeue,'Helvetica Neue',Helvetica,Arial,'Lucida Grande',sans-serif;font-size:12pt"> <div style="font-family:HelveticaNeue,'Helvetica Neue',Helvetica,Arial,'Lucida Grande',sans-serif;font-size:12pt">
<div dir="ltr"> <font face="Arial"> On Tuesday, 7 January 2014 at 17:34, Jed Brown <<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>> wrote:<br> </font> </div><div><div class="h5"> <div>
mary sweat <<a shape="rect" href="mailto:mary.sweat78@yahoo.it" target="_blank">mary.sweat78@yahoo.it</a>> writes:<div><br clear="none"><br clear="none">> Hi all, I need to know how does KSP separate and distribute domain<br clear="none">
> between processes and the way processes share and communicate halfway<br clear="none">> results. Is there any good documentation about it???</div><br clear="none"><br clear="none">The communication is in Mat and Vec functions. You can see it<br clear="none">
summarized in -log_summary. For the underlying theory, see Barry's<br clear="none">book.<br clear="none"><br clear="none"><a shape="rect" href="http://www.mcs.anl.gov/~bsmith/ddbook.html" target="_blank">http://www.mcs.anl.gov/~bsmith/ddbook.html</a><br>
<br></div> </div></div></div> </div> </div> </div></div></blockquote></div><br></div>