[petsc-users] Solving Poisson equation with multigrid
Matthew Knepley
knepley at gmail.com
Fri May 24 16:37:24 CDT 2013
On Fri, May 24, 2013 at 4:35 PM, Michele Rosso <mrosso at uci.edu> wrote:
> I tried
>
> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> -mg_coarse_sub_pc_factor_shift_nonzero
>
> but I still get
>
> [0]PETSC ERROR: Detected zero pivot in LU factorization:
> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
> [0]PETSC ERROR: Zero pivot row 280 value 6.58999e-17 tolerance 2.22045e-14!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24
> CDT 2012
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:08:33
> 2013
> [0]PETSC ERROR: Libraries linked from
> [0]PETSC ERROR: Configure run at
> [0]PETSC ERROR: Configure options
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: MatPivotCheck_none() line 583 in
> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
> [0]PETSC ERROR: MatPivotCheck() line 602 in
> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in
> src/mat/impls/aij/seq/aijfact.c
> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in
> src/mat/interface/matrix.c
> [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in
> src/ksp/pc/impls/bjacobi/bjacobi.c
> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in
> src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
> [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c
>
>
> If instead I use
>
> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> -mg_coarse_pc_type svd
>
> as Matthew suggested, I get an "invalid argument" error.
>
>
1) When you send these in, we need to see the -ksp_view output, so we know what
is actually being used.
2) There is not enough information above. I use this all the time, or I would
not have suggested it.
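For example, rerunning with the same option set Michele used, with -ksp_view
appended, prints the full solver configuration:

  -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 \
      -mg_coarse_sub_pc_factor_shift_nonzero -ksp_view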
Matt
> Michele
>
>
>
>
>
>
>
>
>
>
>
> On 05/24/2013 01:04 PM, Matthew Knepley wrote:
>
> On Fri, May 24, 2013 at 2:55 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> Michele Rosso <mrosso at uci.edu> writes:
>>
>> > Hi Jed,
>> >
>> > I followed your suggestion by using:
>> >
>> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>> >
>> > This works perfectly if I have a non-singular matrix. When instead I use
>> > periodic conditions for my system (I set the nullspace removal correctly),
>> > I receive an error saying a zero pivot is detected in the LU
>> > factorization. So, after some research, I found a fix in the mailing list:
>> >
>> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>> > -mg_coarse_pc_factor_shift_nonzero
>>
>> It'll need to be -mg_coarse_sub_pc_factor_shift_nonzero
>>
>> With petsc-3.4 (which you should upgrade to), use
>> -mg_coarse_sub_pc_factor_shift_type NONZERO
>>
>> The reason you need this "sub" prefix is that the code always restricts
>> using block Jacobi (usually localized so that all the entries are in one
>> block), before applying the direct coarse solver.
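>>
>> Spelled out with all the nested prefixes (these are just the defaults the
>> "sub" prefix refers to, written explicitly, using petsc-3.4 option names):
>>
>>   -pc_type gamg -pc_gamg_agg_nsmooths 1 \
>>       -mg_coarse_pc_type bjacobi -mg_coarse_sub_pc_type lu \
>>       -mg_coarse_sub_pc_factor_shift_type NONZERO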
>
>
> I think this is less elegant than
>
> -mg_coarse_pc_type svd
>
> Matt
>
>
>> > Still I am receiving the following error
>> >
>> >
>> > [0]PETSC ERROR: --------------------- Error Message
>> > ------------------------------------
>> > [0]PETSC ERROR: Detected zero pivot in LU factorization:
>> > see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
>> > [0]PETSC ERROR: Zero pivot row 280 value 6.5908e-17 tolerance
>> 2.22045e-14!
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29
>> > 11:26:24 CDT 2012
>> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> > [0]PETSC ERROR: See docs/index.html for manual pages.
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: ./hit on a named nid09458 by Unknown Fri May 24
>> > 14:40:48 2013
>> > [0]PETSC ERROR: Libraries linked from
>> > [0]PETSC ERROR: Configure run at
>> > [0]PETSC ERROR: Configure options
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: MatPivotCheck_none() line 583 in
>> >
>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>> > [0]PETSC ERROR: MatPivotCheck() line 602 in
>> >
>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>> > [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in
>> > src/mat/impls/aij/seq/aijfact.c
>> > [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in
>> src/mat/interface/matrix.c
>> > [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
>> > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
>> > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
>> > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in
>> > src/ksp/pc/impls/bjacobi/bjacobi.c
>> > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in
>> src/ksp/pc/interface/precon.c
>> > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in
>> > src/ksp/ksp/interface/itfunc.c
>> > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
>> > [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> > [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
>> > [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
>> > [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
>> > [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c
>> >
>> > What could the reason be?
>> > Thank you,
>> >
>> > Michele
>> >
>> >
>> >
>> > On 05/17/2013 07:35 PM, Michele Rosso wrote:
>> >> Thank you very much. I will try and let you know.
>> >>
>> >> Michele
>> >>
>> >> On 05/17/2013 07:01 PM, Jed Brown wrote:
>> >>> Michele Rosso<mrosso at uci.edu> writes:
>> >>>
>> >>>> I noticed that the problem appears even if I use CG with the default
>> >>>> preconditioner: commenting out KSPSetDM() solves the problem.
>> >>> Okay, this issue can't show up if you use SNES, but it's a consequence
>> >>> of making geometric multigrid work with a pure KSP interface. You can
>> >>> either use KSPSetComputeOperators() to put your assembly in a function
>> >>> (which will also be called on coarse levels if you use -pc_type mg
>> >>> without Galerkin coarse operators) or you can provide the Jacobian
>> >>> using KSPSetOperators() as usual, but also call KSPSetDMActive() so that
>> >>> the DM is not used for computing/updating the Jacobian.
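>> >>>
>> >>> A minimal sketch of that second route (petsc-3.3/3.4 calling
>> >>> conventions; "da" and the assembled matrix "A" are placeholders for
>> >>> your own objects):
>> >>>
>> >>>   KSPSetDM(ksp, da);                /* DM supplies the grid hierarchy  */
>> >>>   KSPSetDMActive(ksp, PETSC_FALSE); /* ...but not the operator         */
>> >>>   KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); /* your Jacobian  */
>> >>>   KSPSetFromOptions(ksp);
>> >>>   KSPSolve(ksp, b, x);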
>> >>>
>> >>> The logic is cleaner in petsc-3.4 and I think it just does the right
>> >>> thing in your case.
>> >>>
>> >>>> So basically without a proper grid (it seems no grid with an even
>> >>>> number of nodes qualifies) and with my own system matrix, I cannot
>> >>>> use any type of multigrid preconditioner?
>> >>> You can use all the AMG methods without setting a DM.
>> >>>
>> >>
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener