[petsc-users] Solving Poisson equation with multigrid
Michele Rosso
mrosso at uci.edu
Fri May 24 14:51:12 CDT 2013
Hi Jed,
I followed your suggestion by using:
-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
This works perfectly if I have a non-singular matrix. When instead I use
periodic conditions for my system, the matrix is singular, so I also set
up the nullspace removal correctly, roughly as sketched below.
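For reference, this is approximately what I do (PETSc 3.3-era API; the
variable names here are illustrative, not the exact ones in my code):

   MatNullSpace nullsp;
   MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, PETSC_NULL, &nullsp);
   KSPSetNullSpace(ksp, nullsp);   /* the constant vector spans the null space */
   MatNullSpaceDestroy(&nullsp);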
Even so, I receive an error saying a zero pivot is detected in the LU
factorization. So, after some research, I found a fix in the mailing list:
-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_pc_factor_shift_nonzero
Still, I am receiving the following error:
[0]PETSC ERROR: --------------------- Error Message
------------------------------------
[0]PETSC ERROR: Detected zero pivot in LU factorization:
see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
[0]PETSC ERROR: Zero pivot row 280 value 6.5908e-17 tolerance 2.22045e-14!
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29
11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a named nid09458 by Unknown Fri May 24
14:40:48 2013
[0]PETSC ERROR: Libraries linked from
[0]PETSC ERROR: Configure run at
[0]PETSC ERROR: Configure options
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: MatPivotCheck_none() line 583 in
src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatPivotCheck() line 602 in
src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in
src/mat/impls/aij/seq/aijfact.c
[0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in
src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in
src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
[0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c
What could the reason be?
Thank you,
Michele
On 05/17/2013 07:35 PM, Michele Rosso wrote:
> Thank you very much. I will try and let you know.
>
> Michele
>
> On 05/17/2013 07:01 PM, Jed Brown wrote:
>> Michele Rosso <mrosso at uci.edu> writes:
>>
>>> I noticed that the problem appears even if I use CG with the default
>>> preconditioner: commenting KSPSetDM() solves the problem.
>> Okay, this issue can't show up if you use SNES, but it's a consequence
>> of making geometric multigrid work with a pure KSP interface. You can
>> either use KSPSetComputeOperators() to put your assembly in a function
>> (which will also be called on coarse levels if you use -pc_type mg
>> without Galerkin coarse operators), or you can provide the Jacobian
>> using KSPSetOperators() as usual, but also call KSPSetDMActive() so that
>> the DM is not used for computing/updating the Jacobian.
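>>
>> A minimal sketch of the second route (untested; "dm" and "A" here stand
>> for your DM and your assembled matrix):
>>
>>    KSPSetDM(ksp, dm);                 /* DM supplies the grid hierarchy */
>>    KSPSetDMActive(ksp, PETSC_FALSE);  /* ...but not the operators */
>>    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);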
>>
>> The logic is cleaner in petsc-3.4 and I think it just does the right
>> thing in your case.
>>
>>> So basically, without a proper grid (it seems no grid with an even
>>> number of nodes qualifies) and with my own system matrix, I cannot use
>>> any type of multigrid preconditioner?
>> You can use all the AMG methods without setting a DM.
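>> For example, with the options already given above and no KSPSetDM call:
>>
>>    -ksp_type cg -pc_type gamg -pc_gamg_agg_nsmooths 1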
>>
>