On Wed, Jan 25, 2012 at 12:10 PM, Dominik Szczerba <dominik@itis.ethz.ch> wrote:
> I recently realized that, independently of my explicit partitioning of
> the input mesh, PETSc also employs partitioning somewhere internally.

We don't unless you tell us to.
> Is my understanding correct, then:
>
> * ParMETIS is the default even when other partitioners were configured.

Yes.
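To make that concrete, here is a minimal sketch (not code from this thread; the helper name ShowDefaultPartitioner is just illustrative): if no MatPartitioningSetType() call is made and no -mat_partitioning_type option is given, MatPartitioningSetFromOptions() falls back to ParMETIS whenever PETSc was configured with it.

#include <petscmat.h>

/* Sketch only: report which partitioner a MatPartitioning object will use.
   With no -mat_partitioning_type option, ParMETIS is selected when PETSc was
   configured with it, even if Chaco, Party, or PT-Scotch were also built. */
static PetscErrorCode ShowDefaultPartitioner(MPI_Comm comm)
{
  MatPartitioning     part;
  MatPartitioningType type;
  PetscErrorCode      ierr;

  PetscFunctionBegin;
  ierr = MatPartitioningCreate(comm,&part);CHKERRQ(ierr);
  ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr); /* honors -mat_partitioning_type <parmetis|ptscotch|chaco|party> */
  ierr = MatPartitioningGetType(part,&type);CHKERRQ(ierr);
  ierr = PetscPrintf(comm,"Partitioner in use: %s\n",type);CHKERRQ(ierr);
  ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}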
> * For MPIAIJ matrices the employed partitioner must be parallel, e.g.
>   ParMETIS or PT-Scotch, and not sequential, e.g. Chaco or Party.

Yes.
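And a similar sketch for the MPIAIJ case, assuming an already assembled MPIAIJ matrix A (the helper name PartitionMatrixRows is illustrative): its sparsity pattern serves as the adjacency graph, and since A is distributed over its communicator, only a parallel partitioner such as ParMETIS or PT-Scotch can be applied to it.

#include <petscmat.h>

/* Sketch, assuming A is an assembled MPIAIJ matrix: compute a new row
   distribution from its nonzero pattern.  A parallel partitioner
   (ParMETIS, PT-Scotch) is required here; Chaco and Party are sequential. */
static PetscErrorCode PartitionMatrixRows(Mat A,IS *newOwner)
{
  MatPartitioning part;
  MPI_Comm        comm;
  PetscErrorCode  ierr;

  PetscFunctionBegin;
  ierr = PetscObjectGetComm((PetscObject)A,&comm);CHKERRQ(ierr);
  ierr = MatPartitioningCreate(comm,&part);CHKERRQ(ierr);
  ierr = MatPartitioningSetAdjacency(part,A);CHKERRQ(ierr);  /* graph = sparsity pattern of A */
  ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);  /* e.g. -mat_partitioning_type ptscotch */
  ierr = MatPartitioningApply(part,newOwner);CHKERRQ(ierr);  /* IS giving the target rank of each local row */
  ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The resulting IS can be turned into a new global numbering with ISPartitioningToNumbering() if you then want to actually redistribute the matrix.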
  Matt

> Thanks for any clarifications.
>
> PS. I cannot configure PETSc with --download-ptscotch on any of my
> systems; I will send the configure.log files soon.
>
> Dominik

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener