On Fri, Nov 11, 2011 at 2:26 PM, TAY wee-beng <span dir="ltr"><<a href="mailto:zonexo@gmail.com">zonexo@gmail.com</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Hi,<br>
<br>
I am currently using PETSc for my Fortran CFD code, and I am manually partitioning my Cartesian grids. In 2D this looks like u(1:size_x,jstart:jend), where the y direction is partitioned into 2, 4, 8, etc. parts, depending on the number of processors.<br>
<br>
However, it seems there are better ways to do this and get a more balanced load. Should I use DMDA in PETSc for this? I am currently using staggered Cartesian grids for my u, v, p. Is there an example of constructing a Laplace/Poisson equation using DMDA?<br>
</blockquote><div><br></div><div>Yes, use DMDA. Look at SNES ex5 and ex50.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
It is mentioned in the manual that the DMMG infrastructure will be replaced in the next release and that we should not use it. Is this related to DMDA?<br></blockquote><div><br></div><div>Only if you use multigrid.</div><div><br>
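For orientation, creating a 2D DMDA that replaces the manual y-direction split might look like the following. This is a minimal sketch, not code from the thread: it assumes a PETSc 3.2-era Fortran interface (constant names such as DMDA_BOUNDARY_NONE changed between releases), and the grid sizes M, N and variable names are illustrative only.

```fortran
! Sketch: let PETSc choose the 2D processor decomposition via DMDA,
! instead of hand-partitioning u(1:size_x,jstart:jend) in y only.
! Assumes a PETSc 3.2-era Fortran interface; adapt names to your release.
program dmda_sketch
  implicit none
#include <finclude/petsc.h>
  DM             :: da
  Vec            :: x, b
  PetscInt       :: M, N
  PetscErrorCode :: ierr

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
  M = 64   ! illustrative global grid sizes
  N = 64
  ! PETSC_DECIDE in both directions gives a balanced 2D decomposition;
  ! star stencil, 1 degree of freedom, stencil width 1 (as for a Poisson operator)
  call DMDACreate2d(PETSC_COMM_WORLD,                        &
       DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,               &
       DMDA_STENCIL_STAR, M, N,                              &
       PETSC_DECIDE, PETSC_DECIDE, 1, 1,                     &
       PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da, ierr)
  call DMCreateGlobalVector(da, x, ierr)
  call VecDuplicate(x, b, ierr)
  ! ... assemble and solve the Poisson problem as in SNES ex5/ex50 ...
  call VecDestroy(x, ierr)
  call VecDestroy(b, ierr)
  call DMDestroy(da, ierr)
  call PetscFinalize(ierr)
end program dmda_sketch
```

For a staggered u, v, p arrangement, one option is a separate DMDA per variable (or a multi-dof DMDA), but see the referenced SNES examples for the layout PETSc itself uses.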
</div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Yours sincerely,<br><font color="#888888">
<br>
TAY wee-beng<br>
<br>
<br>
</font></blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>