[petsc-users] h-FGMRES

Gaetan Kenway gaetank at gmail.com
Mon May 20 10:28:17 CDT 2013


Thanks.

On a related note, I tried the ASM version of the same approach, that is,
-pc_type asm -pc_asm_blocks 4 with the remainder of the options the same.
This gives an error message saying that the number of blocks is less than
the number of processors (sorry, I don't have the exact message anymore). I
get this error with both the mpiaij and mpibaij matrix types.

Has this approach been implemented, and do you think there would be any
benefit from it?
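
For concreteness, here is a minimal C sketch of what I understand the
programmatic equivalent of that attempt to be (matrix loading and the solve
itself are omitted; as far as I know PCASMSetTotalSubdomains corresponds to
-pc_asm_blocks):

#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPFGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
  /* ask for 4 subdomains in total; on 64 ranks this is the case that
     produces the "fewer blocks than processors" complaint */
  ierr = PCASMSetTotalSubdomains(pc, 4, NULL, NULL);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  /* KSPSetOperators()/KSPSolve() with the actual matrix would go here */
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}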

Thank you,

Gaetan


On Mon, May 20, 2013 at 10:34 AM, Hong Zhang <hzhang at mcs.anl.gov> wrote:

> Gaetan :
>
>>
>> It runs if the mat type is mpiaij instead of mpibaij. I gather this is
>> not implemented for the blocked matrix types?
>
> It is not tested for mpibaij format yet. I'll check it.
> The paper uses mpiaij format.
>
> Hong
>
>>
>> Gaetan
>>
>> On Mon, May 20, 2013 at 9:26 AM, Gaetan Kenway <gaetank at gmail.com> wrote:
>>
>>> Hi again
>>>
>>> I installed petsc-3.4.0 and I am still getting the following error when
>>> running with the following options (on 64 procs):
>>>
>>> # Matrix Options
>>> -matload_block_size 5 -mat_type mpibaij
>>>
>>> # KSP solver options
>>> -ksp_type fgmres -ksp_max_it 1000 -ksp_gmres_restart 200 -ksp_monitor
>>> -ksp_view -ksp_pc_side right -ksp_rtol 1e-6
>>>
>>> # Nested GMRES Options
>>>  -pc_type bjacobi -pc_bjacobi_blocks 4 -sub_ksp_type gmres
>>> -sub_ksp_max_it 5 -sub_pc_type bjacobi -sub_sub_pc_type ilu
>>> -sub_sub_pc_factor_mat_ordering_type rcm -sub_sub_pc_factor_levels 1
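>>>
>>> For reference, a rough C sketch of the setup I believe these options
>>> correspond to (sketch only: "matrix.bin" is a stand-in for my actual
>>> matrix file, and KSPSetOperators uses the 3.4 signature):
>>>
>>> #include <petscksp.h>
>>>
>>> int main(int argc, char **argv)
>>> {
>>>   Mat            A;
>>>   Vec            x, b;
>>>   KSP            ksp;
>>>   PC             pc;
>>>   PetscViewer    viewer;
>>>   PetscErrorCode ierr;
>>>
>>>   ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>>>   /* load the matrix; -mat_type mpibaij is applied by MatSetFromOptions
>>>      and -matload_block_size 5 by MatLoad ("matrix.bin" is a placeholder) */
>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.bin",
>>>                                FILE_MODE_READ, &viewer);CHKERRQ(ierr);
>>>   ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>>>   ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>>>   ierr = MatLoad(A, viewer);CHKERRQ(ierr);
>>>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>>>   ierr = MatGetVecs(A, &x, &b);CHKERRQ(ierr);
>>>   ierr = VecSet(b, 1.0);CHKERRQ(ierr);
>>>
>>>   ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
>>>   ierr = KSPSetType(ksp, KSPFGMRES);CHKERRQ(ierr);
>>>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>>   ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
>>>   /* 4 blocks in total on 64 ranks, so each block spans 16 processes;
>>>      this is the multiproc path where the segfault occurs for me */
>>>   ierr = PCBJacobiSetTotalBlocks(pc, 4, NULL);CHKERRQ(ierr);
>>>   /* -sub_ksp_type, -sub_pc_type, -sub_sub_* etc. are applied here */
>>>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>   ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
>>>   ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
>>>
>>>   ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>>   ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>   ierr = VecDestroy(&b);CHKERRQ(ierr);
>>>   ierr = MatDestroy(&A);CHKERRQ(ierr);
>>>   ierr = PetscFinalize();
>>>   return 0;
>>> }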
>>>
>>> Any thoughts?
>>>
>>> Thank you,
>>>
>>> Gaetan
>>>
>>> [44]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [44]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>> probably memory access out of range
>>> [44]PETSC ERROR: Try option -start_in_debugger or
>>> -on_error_attach_debugger
>>> [44]PETSC ERROR: or see
>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> [44]PETSC ERROR: or try
>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
>>> corruption errors
>>> [44]PETSC ERROR: likely location of problem given in stack below
>>> [44]PETSC ERROR: ---------------------  Stack Frames
>>> ------------------------------------
>>> [44]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>> available,
>>> [44]PETSC ERROR:       INSTEAD the line number of the start of the
>>> function
>>> [44]PETSC ERROR:       is given.
>>> [44]PETSC ERROR: [44] PCSetUp_BJacobi_Multiproc line 1197
>>> /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c
>>> [44]PETSC ERROR: [44] PCSetUp_BJacobi line 24
>>> /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c
>>> [44]PETSC ERROR: [44] PCSetUp line 868
>>> /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/interface/precon.c
>>> [44]PETSC ERROR: [44] KSPSetUp line 192
>>> /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c
>>> [44]PETSC ERROR: [44] KSPSolve line 356
>>> /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c
>>> [43]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>>
>>>
>>> On Sun, May 19, 2013 at 11:15 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>
>>>>
>>>>    You should be using PETSc version 3.4 which was recently released
>>>> and is what the paper is based on.
>>>>
>>>>     Barry
>>>>
>>>> On May 19, 2013, at 10:11 PM, Gaetan Kenway <gaetank at gmail.com> wrote:
>>>>
>>>> > Hi Everyone
>>>> >
>>>> > I am trying to replicate the type of preconditioner described in
>>>> "Hierarchical and Nested Krylov Methods for Extreme-Scale Computing".
>>>> >
>>>> > I have used the following options (I'm using Fortran, so the
>>>> > following is my petsc_options file):
>>>> >
>>>> > # Matrix Options
>>>> > -matload_block_size 5
>>>> > -mat_type mpibaij
>>>> >
>>>> > # KSP solver options
>>>> > -ksp_type gmres
>>>> > -ksp_max_it 1000
>>>> > -ksp_gmres_restart 200
>>>> > -ksp_monitor
>>>> > -ksp_view
>>>> > -ksp_pc_side right
>>>> > -ksp_rtol 1e-6
>>>> >
>>>> > # Nested GMRES Options
>>>> > -pc_type bjacobi
>>>> > -pc_bjacobi_blocks 4
>>>> > -sub_ksp_type gmres
>>>> > -sub_ksp_max_it 5
>>>> > -sub_pc_type bjacobi
>>>> > -sub_sub_pc_type ilu
>>>> > -sub_sub_pc_factor_mat_ordering_type rcm
>>>> > -sub_sub_pc_factor_levels 1
>>>> >
>>>> > The test is run on 64 processors and the total number of block
>>>> > Jacobi blocks is 4 (less than nproc). The error I get is:
>>>> >
>>>> > [6]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>>> available,
>>>> > [6]PETSC ERROR:       INSTEAD the line number of the start of the
>>>> function
>>>> > [6]PETSC ERROR:       is given.
>>>> > [6]PETSC ERROR: [6] PCSetUp_BJacobi_Multiproc line 1269
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c
>>>> > [6]PETSC ERROR: [6] PCSetUp_BJacobi line 24
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c
>>>> > [6]PETSC ERROR: [6] PCSetUp line 810
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/interface/precon.c
>>>> > [6]PETSC ERROR: [6] KSPSetUp line 182
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c
>>>> > [6]PETSC ERROR: [6] KSPSolve line 351
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c
>>>> > [6]PETSC ERROR: --------------------- Error Message
>>>> ------------------------------------
>>>> > [6]PETSC ERROR: Signal received!
>>>> > [6]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> > [6]PETSC ERROR: Petsc Release Version 3.3.0, Patch 5, Sat Dec  1
>>>> 15:10:41 CST 2012
>>>> > [6]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>> > [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>> > [6]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> > [6]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> > [6]PETSC ERROR: ./main on a intel-rea named gpc-f109n001 by kenway
>>>> Sun May 19 23:01:52 2013
>>>> > [6]PETSC ERROR: Libraries linked from
>>>> /home/j/jmartins/kenway/packages/petsc-3.3-p5/intel-real-debug/lib
>>>> > [6]PETSC ERROR: Configure run at Sun Jan 20 15:52:20 2013
>>>> > [6]PETSC ERROR: Configure options --with-shared-libraries
>>>> --download-superlu_dist=yes --download-parmetis=yes --download-metis=yes
>>>> --with-fortran-interfaces=1 --with-debugging=yes --with-scalar-type=real
>>>> -with-petsc-arch=intel-real-debug --with-blas-lapack-dir= --with-pic
>>>> > [6]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> >
>>>> > If the number of blocks is greater than or equal to the number of
>>>> > processors, it runs fine. I'm using version 3.3-p5.
>>>> >
>>>> > The options as listed in the paper are:
>>>> > -flow_ksp_type fgmres -flow_ksp_pc_side right -flow_pc_type bjacobi -flow_pc_bjacobi_blocks ngp
>>>> > -flow_sub_ksp_type gmres -flow_sub_ksp_max_it 6 -flow_sub_pc_type bjacobi
>>>> > -flow_sub_sub_pc_type ilu
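>>>> >
>>>> > I assume the flow_ prefix just means that the flow solver attaches an
>>>> > options prefix to its KSP, roughly like this hypothetical helper:
>>>> >
>>>> > #include <petscksp.h>
>>>> >
>>>> > /* Hypothetical sketch: give the flow solver's KSP the "flow_" prefix
>>>> >    so that -flow_ksp_type, -flow_pc_type, -flow_sub_* etc. bind to it */
>>>> > PetscErrorCode FlowKSPSetFromOptions(KSP ksp)
>>>> > {
>>>> >   PetscErrorCode ierr;
>>>> >   ierr = KSPSetOptionsPrefix(ksp, "flow_");CHKERRQ(ierr);
>>>> >   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>> >   return 0;
>>>> > }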
>>>> >
>>>> > Any suggestions would be greatly appreciated.
>>>> >
>>>> > Thank you,
>>>> >
>>>> > Gaetan Kenway
>>>> >
>>>> >
>>>>
>>>>
>>>
>>
>

