Hi everyone,<div><br></div><div>I am trying to replicate the type of preconditioner described in "Hierarchical and Nested Krylov Methods for Extreme-Scale Computing".</div><div><br></div><div>I have used the following options (I'm using Fortran, so these come from my petsc_options file):</div>
<div><div><br></div><div># Matrix Options</div><div>-matload_block_size 5</div><div>-mat_type mpibaij</div><div><br></div><div># KSP solver options</div><div>-ksp_type gmres</div><div>-ksp_max_it 1000</div><div>-ksp_gmres_restart 200</div>
<div>-ksp_monitor</div><div>-ksp_view</div><div>-ksp_pc_side right</div><div>-ksp_rtol 1e-6</div><div><br></div><div># Nested GMRES Options</div><div>-pc_type bjacobi</div><div>-pc_bjacobi_blocks 4</div><div>-sub_ksp_type gmres</div>
<div>-sub_ksp_max_it 5</div><div>-sub_pc_type bjacobi</div><div>-sub_sub_pc_type ilu</div><div>-sub_sub_pc_factor_mat_ordering_type rcm</div><div>-sub_sub_pc_factor_levels 1</div></div><div><br></div><div>The test is run on 64 processors with a total of 4 block Jacobi blocks (fewer than nproc, so each block spans 16 processes). The error I get is:</div>
<div><br></div><div><div>[6]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,</div><div>[6]PETSC ERROR: INSTEAD the line number of the start of the function</div><div>[6]PETSC ERROR: is given.</div>
<div>[6]PETSC ERROR: [6] PCSetUp_BJacobi_Multiproc line 1269 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c</div><div>[6]PETSC ERROR: [6] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c</div>
<div>[6]PETSC ERROR: [6] PCSetUp line 810 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/interface/precon.c</div><div>[6]PETSC ERROR: [6] KSPSetUp line 182 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c</div>
<div>[6]PETSC ERROR: [6] KSPSolve line 351 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c</div><div>[6]PETSC ERROR: --------------------- Error Message ------------------------------------</div>
<div>[6]PETSC ERROR: Signal received!</div><div>[6]PETSC ERROR: ------------------------------------------------------------------------</div><div>[6]PETSC ERROR: Petsc Release Version 3.3.0, Patch 5, Sat Dec 1 15:10:41 CST 2012 </div>
<div>[6]PETSC ERROR: See docs/changes/index.html for recent updates.</div><div>[6]PETSC ERROR: See docs/faq.html for hints about trouble shooting.</div><div>[6]PETSC ERROR: ------------------------------------------------------------------------</div>
<div>[6]PETSC ERROR: ------------------------------------------------------------------------</div><div>[6]PETSC ERROR: ./main on a intel-rea named gpc-f109n001 by kenway Sun May 19 23:01:52 2013</div><div>[6]PETSC ERROR: Libraries linked from /home/j/jmartins/kenway/packages/petsc-3.3-p5/intel-real-debug/lib</div>
<div>[6]PETSC ERROR: Configure run at Sun Jan 20 15:52:20 2013</div><div>[6]PETSC ERROR: Configure options --with-shared-libraries --download-superlu_dist=yes --download-parmetis=yes --download-metis=yes --with-fortran-interfaces=1 --with-debugging=yes --with-scalar-type=real -with-petsc-arch=intel-real-debug --with-blas-lapack-dir= --with-pic</div>
<div>[6]PETSC ERROR: ------------------------------------------------------------------------</div></div><div><br></div><div>If the number of blocks is greater than or equal to the number of processors, it runs fine. I'm using PETSc 3.3-p5.</div>
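<div><br></div><div>In other words, the crash only appears when a block has to span multiple processes (the stack goes through PCSetUp_BJacobi_Multiproc). A trivial sketch of the trigger condition, assuming blocks are distributed evenly over processes (the variant names mirror the PCSetUp_BJacobi_* functions in bjacobi.c; the real partitioning logic of course lives there, not here):</div>

```python
# Illustrative only: which block-Jacobi setup path a nproc/nblocks
# combination would take, assuming an even distribution of blocks.

def bjacobi_path(nproc, nblocks):
    """Return the (assumed) PCSetUp_BJacobi_* variant for a run."""
    if nblocks >= nproc:
        # one or more blocks per process: this case runs fine for me
        return "Singleblock/Multiblock"
    # each block spans several processes: this is the case that crashes
    return "Multiproc"

# My failing case: 64 processes, 4 blocks -> 16 processes per block.
print(bjacobi_path(64, 4))   # -> Multiproc
print(bjacobi_path(64, 64))  # -> Singleblock/Multiblock
```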
<div><br></div><div>The options as listed in the paper are:</div><div><div>-flow_ksp_type fgmres -flow_ksp_pc_side right -flow_pc_type bjacobi -flow_pc_bjacobi_blocks ngp</div><div>-flow_sub_ksp_type gmres -flow_sub_ksp_max_it 6 -flow_sub_pc_type bjacobi</div>
<div>-flow_sub_sub_pc_type ilu</div></div><div><br></div><div>Any suggestions would be greatly appreciated. </div><div><br></div><div>Thank you,</div><div><br></div><div>Gaetan Kenway</div><div><br></div><div><br></div>
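<div><br></div><div>For clarity, here is how I mapped the paper's options onto mine (the paper attaches a "flow_" prefix to its KSP; my solver uses the default empty prefix). The only deliberate differences are the outer Krylov method, the inner iteration count, and the fixed block count:</div>

```
# Paper (prefix "flow_")         ->  Mine (no prefix)
-flow_ksp_type fgmres            ->  -ksp_type gmres      # with -ksp_pc_side right
-flow_pc_type bjacobi            ->  -pc_type bjacobi
-flow_pc_bjacobi_blocks ngp      ->  -pc_bjacobi_blocks 4
-flow_sub_ksp_type gmres         ->  -sub_ksp_type gmres
-flow_sub_ksp_max_it 6           ->  -sub_ksp_max_it 5
-flow_sub_pc_type bjacobi        ->  -sub_pc_type bjacobi
-flow_sub_sub_pc_type ilu        ->  -sub_sub_pc_type ilu
```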