Thanks. <div><br></div><div>On a related note, I tried the ASM version of the same approach, i.e. -pc_type asm -pc_asm_blocks 4 with the remainder of the options unchanged. This fails with an error saying the number of blocks is less than the number of processors (sorry, I no longer have the exact message). I get this error with both mpiaij and mpibaij types.</div>
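(For anyone skimming the thread, here is a toy numpy sketch of what an additive Schwarz sweep with 4 overlapping subdomains does on a small 1-D Laplacian. This is purely illustrative and is not PETSc's PCASM; the matrix, overlap, and sizes are made up for the example.)

```python
import numpy as np

# Toy additive Schwarz sweep (illustration only; not PETSc's PCASM):
# 4 overlapping subdomains on a 16x16 1-D Laplacian.
n, nblocks, overlap = 16, 4, 1
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
r = np.ones(n)  # residual to precondition

z = np.zeros(n)
size = n // nblocks
for b in range(nblocks):
    lo = max(0, b * size - overlap)
    hi = min(n, (b + 1) * size + overlap)
    idx = np.arange(lo, hi)
    # local solve on the overlapping subdomain; contributions add
    z[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
```

Block Jacobi is the overlap = 0 special case of the same loop, which is why I expected the two options to behave analogously.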
<div><br></div><div>Has this been implemented for ASM, and do you think there would be any benefit to the approach?</div><div><br></div><div>Thank you,</div><div><br></div><div>Gaetan<br><div><br></div><br><div class="gmail_quote">
On Mon, May 20, 2013 at 10:34 AM, Hong Zhang <span dir="ltr"><<a href="mailto:hzhang@mcs.anl.gov" target="_blank">hzhang@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div dir="ltr"><div class="gmail_extra">Gaetan :<div class="gmail_quote"><div class="im"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><br></div>It runs if the mattype is mpiaij instead of mpibaij. I gather this is not implemented for the blocked matrix types?</blockquote>
</div><div>It is not tested for mpibaij format yet. I'll check it.</div><div>The paper uses mpiaij format.</div><span class="HOEnZb"><font color="#888888"><div><br></div><div>Hong </div></font></span><div><div class="h5">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<span><font color="#888888"><div><br></div><div>Gaetan</div></font></span><div><div><div><br><div class="gmail_quote">On Mon, May 20, 2013 at 9:26 AM, Gaetan Kenway <span dir="ltr"><<a href="mailto:gaetank@gmail.com" target="_blank">gaetank@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div>Hi again,</div><div><br></div><div>I installed PETSc 3.4.0 and I am still getting the following error when running with the options below (on 64 procs):</div>
<div><div><div><br></div><div># Matrix Options</div><div>-matload_block_size 5 -mat_type mpibaij</div>
<div><br></div><div># KSP solver options</div></div><div>-ksp_type fgmres -ksp_max_it 1000 -ksp_gmres_restart 200 -ksp_monitor -ksp_view -ksp_pc_side right -ksp_rtol 1e-6 </div><div><div><br></div><div># Nested GMRES Options</div>
<div>
-pc_type bjacobi -pc_bjacobi_blocks 4 -sub_ksp_type gmres -sub_ksp_max_it 5 -sub_pc_type bjacobi -sub_sub_pc_type ilu -sub_sub_pc_factor_mat_ordering_type rcm -sub_sub_pc_factor_levels 1</div></div></div><div><br></div><div>
Any thoughts?</div>
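(Side note in case it clarifies the setup: with -pc_bjacobi_blocks 4 on 64 ranks, each block spans a subcommunicator of several processes, which is the PCSetUp_BJacobi_Multiproc path in the trace below. A toy sketch of that rank-to-block mapping, purely illustrative and not PETSc code:)

```python
def block_of_rank(rank, nprocs, nblocks):
    """Toy even-split mapping of an MPI rank to its block Jacobi block.

    Illustrates the nblocks < nprocs "multiproc" case, where one block's
    subsystem is shared by a subcommunicator of several ranks.
    """
    ranks_per_block = nprocs // nblocks
    return min(rank // ranks_per_block, nblocks - 1)

# With 64 ranks and 4 blocks, ranks 0-15 share block 0, 16-31 share
# block 1, and so on; e.g. rank 44 falls in block 2 (ranks 32-47).
```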
<div><br></div><div>Thank you,</div><div><br></div><div>Gaetan</div><div><br></div><div>[44]PETSC ERROR: ------------------------------------------------------------------------</div><div>[44]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range</div>
<div>[44]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger</div><div>[44]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a></div><div>[44]PETSC ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors</div>
<div>[44]PETSC ERROR: likely location of problem given in stack below</div><div>[44]PETSC ERROR: --------------------- Stack Frames ------------------------------------</div><div>[44]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,</div>
<div>[44]PETSC ERROR: INSTEAD the line number of the start of the function</div><div>[44]PETSC ERROR: is given.</div><div>[44]PETSC ERROR: [44] PCSetUp_BJacobi_Multiproc line 1197 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c</div>
<div>[44]PETSC ERROR: [44] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c</div><div>[44]PETSC ERROR: [44] PCSetUp line 868 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/interface/precon.c</div>
<div>[44]PETSC ERROR: [44] KSPSetUp line 192 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c</div><div>[44]PETSC ERROR: [44] KSPSolve line 356 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c</div>
<div>[43]PETSC ERROR: ------------------------------------------------------------------------</div></div><div><br></div><br><div class="gmail_quote"><div>On Sun, May 19, 2013 at 11:15 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br>
</div><div><div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
You should be using PETSc version 3.4, which was recently released and is what the paper is based on.<br>
<span><font color="#888888"><br>
Barry<br>
</font></span><div><div><br>
On May 19, 2013, at 10:11 PM, Gaetan Kenway <<a href="mailto:gaetank@gmail.com" target="_blank">gaetank@gmail.com</a>> wrote:<br>
<br>
> Hi Everyone<br>
><br>
> I am trying to replicate the type of preconditioner described in "Hierarchical and Nested Krylov Methods for Extreme-Scale Computing".<br>
><br>
> I have used the following options (I'm using Fortran, so these come from my petsc_options file):<br>
><br>
> # Matrix Options<br>
> -matload_block_size 5<br>
> -mat_type mpibaij<br>
><br>
> # KSP solver options<br>
> -ksp_type gmres<br>
> -ksp_max_it 1000<br>
> -ksp_gmres_restart 200<br>
> -ksp_monitor<br>
> -ksp_view<br>
> -ksp_pc_side right<br>
> -ksp_rtol 1e-6<br>
><br>
> # Nested GMRES Options<br>
> -pc_type bjacobi<br>
> -pc_bjacobi_blocks 4<br>
> -sub_ksp_type gmres<br>
> -sub_ksp_max_it 5<br>
> -sub_pc_type bjacobi<br>
> -sub_sub_pc_type ilu<br>
> -sub_sub_pc_factor_mat_ordering_type rcm<br>
> -sub_sub_pc_factor_levels 1<br>
><br>
> The test is run on 64 processors with a total of 4 block Jacobi blocks (fewer than nproc). The error I get is:<br>
><br>
> [6]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
> [6]PETSC ERROR: INSTEAD the line number of the start of the function<br>
> [6]PETSC ERROR: is given.<br>
> [6]PETSC ERROR: [6] PCSetUp_BJacobi_Multiproc line 1269 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c<br>
> [6]PETSC ERROR: [6] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c<br>
> [6]PETSC ERROR: [6] PCSetUp line 810 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/interface/precon.c<br>
> [6]PETSC ERROR: [6] KSPSetUp line 182 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c<br>
> [6]PETSC ERROR: [6] KSPSolve line 351 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c<br>
> [6]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
> [6]PETSC ERROR: Signal received!<br>
> [6]PETSC ERROR: ------------------------------------------------------------------------<br>
> [6]PETSC ERROR: Petsc Release Version 3.3.0, Patch 5, Sat Dec 1 15:10:41 CST 2012<br>
> [6]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
> [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
> [6]PETSC ERROR: ------------------------------------------------------------------------<br>
> [6]PETSC ERROR: ------------------------------------------------------------------------<br>
> [6]PETSC ERROR: ./main on a intel-rea named gpc-f109n001 by kenway Sun May 19 23:01:52 2013<br>
> [6]PETSC ERROR: Libraries linked from /home/j/jmartins/kenway/packages/petsc-3.3-p5/intel-real-debug/lib<br>
> [6]PETSC ERROR: Configure run at Sun Jan 20 15:52:20 2013<br>
> [6]PETSC ERROR: Configure options --with-shared-libraries --download-superlu_dist=yes --download-parmetis=yes --download-metis=yes --with-fortran-interfaces=1 --with-debugging=yes --with-scalar-type=real -with-petsc-arch=intel-real-debug --with-blas-lapack-dir= --with-pic<br>
> [6]PETSC ERROR: ------------------------------------------------------------------------<br>
><br>
> If the number of blocks is greater than or equal to the number of processors it runs fine. I'm using version 3.3-p5.<br>
><br>
> The options as listed in the paper are:<br>
> -flow_ksp_type fgmres -flow_ksp_pc_side right -flow_pc_type bjacobi -flow_pc_bjacobi_blocks ngp<br>
> -flow_sub_ksp_type gmres -flow_sub_ksp_max_it 6 -flow_sub_pc_type bjacobi<br>
> -flow_sub_sub_pc_type ilu<br>
><br>
> Any suggestions would be greatly appreciated.<br>
><br>
> Thank you,<br>
><br>
> Gaetan Kenway<br>
><br>
><br>
<br>
</div></div></blockquote></div></div></div><br>
</blockquote></div><br></div>
</div></div></blockquote></div></div></div><br></div></div>
</blockquote></div><br></div>