<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Fri, Apr 25, 2014 at 11:26 AM, 藤井昭宏 <span dir="ltr"><<a href="mailto:fujii@cc.kogakuin.ac.jp" target="_blank">fujii@cc.kogakuin.ac.jp</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hello everybody,<br>
<br>
I’d like to compare a solver with PETSc-gamg for Poisson problems.<br>
I used PETSc version 3.4.3., and the following command line option.<br>
<br>
-ksp_type “bcgs” -pc_type “gamg” -ksp_monitor -ksp_rtol 1.0E-7 -log_summary -pc_mg_log<br>
<br>
<br>
Would you give me some information on the following questions?<br>
<br>
- Does this option correspond to BICGSTAB solver with smoothed aggregation algebraic multigrid method?<br></blockquote><div><br></div><div>Yes</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
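In case it is useful, here is a minimal sketch of what those options amount to in a PETSc 3.4-style C driver. The 1D Laplacian assembled below is only a stand-in for your Poisson operator, and the calls just mirror -ksp_type bcgs, -pc_type gamg, and -ksp_rtol 1.0E-7; the rest of the options are still picked up from the command line.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 100, Istart, Iend;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Stand-in operator: a small 1D Laplacian instead of your Poisson matrix */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
    if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatGetVecs(A, &x, &b);     /* renamed MatCreateVecs() in later PETSc releases */
  VecSet(b, 1.0);

  /* Programmatic equivalent of -ksp_type bcgs -pc_type gamg -ksp_rtol 1.0E-7 */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);  /* 3.4 signature; the flag argument is gone in 3.5+ */
  KSPSetType(ksp, KSPBCGS);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCGAMG);
  KSPSetTolerances(ksp, 1.0e-7, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
  KSPSetFromOptions(ksp);    /* -ksp_monitor, -pc_mg_log, -log_summary still take effect */
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  MatDestroy(&A);
  VecDestroy(&x);
  VecDestroy(&b);
  PetscFinalize();
  return 0;
}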
> - When the number of processes is very large, are the small distributed matrices on the coarser levels redistributed onto fewer processes to reduce the parallelism?

Yes.
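If you want to experiment with when that redistribution kicks in, the GAMG knobs are (option names quoted from memory, so treat this as a pointer rather than gospel):

  -pc_gamg_process_eq_limit <n>   (roughly, the fewest equations GAMG will leave on a process before folding a coarse level onto fewer processes)
  -pc_gamg_coarse_eq_limit <n>    (limit on the size of the coarsest problem)

and -ksp_view will show you the resulting hierarchy.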
> - What kind of smoother will be used for this command line option?

Cheby(2)/Jacobi is the default in GAMG.
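You can check what is actually being run with -ksp_view, and the per-level defaults can be overridden through the mg_levels_ prefix, for example (a sketch of the option names):

  -mg_levels_ksp_type chebyshev -mg_levels_ksp_max_it 2 -mg_levels_pc_type jacobi

which is what the Cheby(2)/Jacobi default corresponds to.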
> - The -pc_mg_log option shows the MG Apply stage. Does this stage time correspond to the solver time, excluding the multilevel setup time?
> Does it include the time for BiCGSTAB?

Yes, and no.
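If what you want is one timing that covers the whole iteration, BiCGSTAB included, but still excludes the GAMG setup, a simple way (a sketch, assuming you can edit your driver; the stage name is arbitrary) is to register your own log stage and push it around KSPSolve() after forcing the setup with KSPSetUp():

  PetscLogStage solve_stage;

  PetscLogStageRegister("MySolve", &solve_stage);  /* shows up as its own stage in -log_summary */
  KSPSetUp(ksp);                                   /* do the GAMG setup outside the timed stage */
  PetscLogStagePush(solve_stage);
  KSPSolve(ksp, b, x);                             /* Krylov iterations plus MG applies */
  PetscLogStagePop();

-log_summary will then report that stage separately from the rest of the run.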
  Thanks,

     Matt

> Thanks in advance.
>
> Akihiro Fujii

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener