[petsc-dev] Algebraic Multigrid

Stefano Zampini stefano.zampini at gmail.com
Wed Aug 17 17:38:00 CDT 2016


AMS needs the discrete gradient and other information. See the hypre docs.
To use the PETSc interface, see

https://bitbucket.org/fenics-project/dolfin/src/4ad8205f700328117a659b2fca71202ab84d06f8/demo/undocumented/curl-curl/cpp/main.cpp?at=master

Here is another example of how to set up the hypre solver:

https://github.com/mfem/mfem/blob/master/linalg/hypre.cpp
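
For orientation, here is a minimal sketch of the PETSc-side calls (an illustration, not taken verbatim from those examples): it assumes you already have the assembled edge-element matrix A and the discrete gradient matrix G from your discretization, and it omits error checking.

    #include <petscksp.h>

    /* Sketch: use hypre AMS for a curl-curl (edge element) system.
       A = edge-element system matrix, G = discrete gradient (nodes -> edges). */
    PetscErrorCode SetupAMS(KSP ksp, Mat A, Mat G)
    {
      PC pc;
      KSPSetOperators(ksp, A, A);
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCHYPRE);
      PCHYPRESetType(pc, "ams");
      PCHYPRESetDiscreteGradient(pc, G);  /* AMS requires the discrete gradient */
      /* AMS also needs the constant vector fields (1,0,0), (0,1,0), (0,0,1)
         expressed in the edge basis; they can be passed with
         PCHYPRESetEdgeConstantVectors(pc, ozz, zoz, zzo). See the dolfin and
         mfem examples above for how those vectors are built. */
      KSPSetFromOptions(ksp);
      return 0;
    }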

On Aug 18, 2016 00:04, "Mark Adams" <mfadams at lbl.gov> wrote:

>
>
> On Wed, Aug 17, 2016 at 4:33 PM, Ari Rappaport <arir at vosssci.com> wrote:
>
>> Ok, thank you. So for AMS there is the option to use
>> PCHYPRESetEdgeConstantVectors. Is there any example somewhere of how this
>> function is supposed to be used?
>
>
> This is probably a wrapper for a hypre method so I'd check the hypre docs.
>
>
>> Also, is it possible to set the pc type to hypre inside the code? e.g.
>> PCSetType(pc, <HYPRE>)
>>
>>
> yes
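
For illustration, a minimal in-code sketch (it assumes a KSP named ksp already exists; error checking omitted):

    PC pc;
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCHYPRE);           /* same as -pc_type hypre */
    PCHYPRESetType(pc, "boomeramg");  /* same as -pc_hypre_type boomeramg (use "ams" for AMS) */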
>
>
>> -Ari
>>
>> ----- Original Message -----
>> From: "Satish Balay" <balay at mcs.anl.gov>
>> To: "Ari Rappaport" <arir at vosssci.com>
>> Cc: "petsc-dev" <petsc-dev at mcs.anl.gov>
>> Sent: Wednesday, August 17, 2016 1:11:43 PM
>> Subject: Re: [petsc-dev] Algebraic Multigrid
>>
>> balay at asterix /home/balay/petsc/src/ksp/ksp/examples/tutorials (master=)
>> $ ./ex2 -pc_type hypre -h |grep pc_hypre_type
>>   -pc_hypre_type <boomeramg> (choose one of) pilut parasails boomeramg ams (PCHYPRESetType)
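
That is, the runtime option is -pc_hypre_type, for example (a sketch, with ./yourapp standing in for your executable; note that AMS additionally requires the discrete gradient to be supplied in the code, as discussed above):

    ./yourapp -pc_type hypre -pc_hypre_type ams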
>>
>>
>> Satish
>>
>> On Wed, 17 Aug 2016, Ari Rappaport wrote:
>>
>> > Hi again,
>> > I am having a problem with the database option -pc_hypre_set_type ams
>> along with -pc_type hypre. When I run my simulation I get the message
>> "Option left: name:-pc_hypre_set_type value: ams".
>> >
>> > -Ari
>> >
>> > ----- Original Message -----
>> > From: "Satish Balay" <balay at mcs.anl.gov>
>> > To: "petsc-dev" <petsc-dev at mcs.anl.gov>
>> > Cc: "Ari Rappaport" <arir at vosssci.com>
>> > Sent: Tuesday, August 16, 2016 3:31:16 PM
>> > Subject: Re: [petsc-dev] Algebraic Multigrid
>> >
>> > Or don't use '--with-fc=0'. For some reason '-lm' is getting pulled in
>> via gfortran, not gcc [on this machine].
>> >
>> > I've pushed the following fix to maint/3.7:
>> >
>> > https://bitbucket.org/petsc/petsc/commits/2d90af56370c3153360ed340bfa06d4e662fceff
>> >
>> > Satish
>> >
>> > On Tue, 16 Aug 2016, Satish Balay wrote:
>> >
>> > > >>
>> > > /home/mantis/petsc/petsc-3.6.4/linux-gnu/externalpackages/hypre-2.10.0b-p4/src/utilities/exchange_data.c:205: undefined reference to `ceil'
>> > > <<
>> > >
>> > > Try adding configure flag: LIBS=-lm
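
For example, a sketch of the configure invocation (keep whatever options you already pass and just append LIBS=-lm):

    ./configure --download-hypre LIBS=-lm <your existing configure options>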
>> > >
>> > > BTW:
>> > >
>> > > >>>>>>>
>> > > Working directory: /home/mantis/petsc/petsc-3.6.4
>> > >
>> > > Executing: mpicc -show
>> > > stdout: gcc -fPIC -g3 -I/home/mantis/petsc/petsc-3.7.2/linux-gnu/include -L/home/mantis/petsc/petsc-3.7.2/linux-gnu/lib -Wl,-rpath -Wl,/home/mantis/petsc/petsc-3.7.2/linux-gnu/lib -Wl,--enable-new-dtags -lmpi
>> > > <<<<<<<
>> > >
>> > > Mixing PETSc versions might cause issues.
>> > >
>> > > Satish
>> > >
>> > > On Tue, 16 Aug 2016, Ari Rappaport wrote:
>> > >
>> > > > Hi Barry,
>> > > > Sorry for the late reply. Here is my configure.log file. I am
>> assuming you meant the configure.log in the petsc-3.6.4 home directory.
>> > > >
>> > > > -Ari
>> > > >
>> > > > ----- Original Message -----
>> > > > From: "Barry Smith" <bsmith at mcs.anl.gov>
>> > > > To: "Ari Rappaport" <arir at vosssci.com>
>> > > > Cc: "Mark Adams" <mfadams at lbl.gov>, "For users of the development
>> version of PETSc" <petsc-dev at mcs.anl.gov>
>> > > > Sent: Thursday, August 11, 2016 3:59:47 PM
>> > > > Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > >
>> > > >
>> > > > > On Aug 11, 2016, at 4:42 PM, Ari Rappaport <arir at vosssci.com>
>> wrote:
>> > > > >
>> > > > > Hi Guys,
>> > > > > So the algebraic multigrid works great for a Poisson equation.
>> However, it's choking on our Maxwell's equations solver, which is a curl-curl
>> formulation if I'm not mistaken.
>> > > >
>> > > >    Maxwell's equations are tricky for multigrid. hypre has a solver
>> specific to Maxwell's equations (AMS); you need to check it out. Note this is
>> not just running with hypre BoomerAMG. If you run BoomerAMG directly on
>> Maxwell's equations you will not get satisfactory results.
>> > > >
>> > > > > I was going to give hypre AMS a try, but I'm having trouble
>> installing it. I'm getting the error "Downloaded hypre could not be used".
>> > > >
>> > > >   Send configure.log
>> > > >
>> > > >
>> > > > > I'm using Ubuntu 14.04 if that makes a difference. Also, is
>> non-algebraic multigrid a reasonable choice for Maxwell's Equations?
>> > > >
>> > > >    Even with geometric multigrid, Maxwell's equations are
>> nontrivial; you will need to do some literature searching to determine how to
>> handle the restriction/interpolation and smoothing so that they work well for
>> Maxwell's equations.
>> > > >
>> > > >    Barry
>> > > >
>> > > > >
>> > > > > Thanks,
>> > > > > Ari
>> > > > >
>> > > > > ----- Original Message -----
>> > > > > From: "Barry Smith" <bsmith at mcs.anl.gov>
>> > > > > To: "Ari Rappaport" <arir at vosssci.com>
>> > > > > Cc: "Mark Adams" <mfadams at lbl.gov>, "For users of the
>> development version of PETSc" <petsc-dev at mcs.anl.gov>
>> > > > > Sent: Wednesday, August 10, 2016 2:02:10 PM
>> > > > > Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > > >
>> > > > >
>> > > > >> On Aug 10, 2016, at 2:52 PM, Ari Rappaport <arir at vosssci.com>
>> wrote:
>> > > > >>
>> > > > >> Hi Mark,
>> > > > >> There was indeed a bug on our end. Now that we fixed it
>> everything is working correctly even in parallel. I have one question still
>> though. Is there a way to get the mg_levels_pc_type jacobi option
>> > > > >> set in the code without passing it as an argument to
>> PetscInitialize? I'm using PCGAMGSetType(pc, PCJACOBI) but it doesn't seem
>> to be working.
>> > > > >
>> > > > >   You can call PetscOptionsSetValue("-<appropriate
>> prefix>_mg_levels_pc_type","jacobi") in the code anywhere before your
>> code creates the SNES or KSP object you are using. This is equivalent to
>> putting it on the command line.
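
A minimal sketch of that (the three-argument form is the PETSc 3.7 calling sequence, where the first argument is the options database and NULL means the global one; in 3.6 the call takes only the name and value):

    /* Equivalent to putting -mg_levels_pc_type jacobi on the command line.
       Call it before the KSP/SNES is created; prepend the solver's options
       prefix to the name, if it has one. */
    PetscOptionsSetValue(NULL, "-mg_levels_pc_type", "jacobi");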
>> > > > >
>> > > > >  Barry
>> > > > >
>> > > > >>
>> > > > >>
>> > > > >> -Ari
>> > > > >>
>> > > > >> ----- Original Message -----
>> > > > >> From: "Mark Adams" <mfadams at lbl.gov>
>> > > > >> To: "Ari Rappaport" <arir at vosssci.com>
>> > > > >> Cc: "For users of the development version of PETSc" <
>> petsc-dev at mcs.anl.gov>
>> > > > >> Sent: Wednesday, July 27, 2016 7:24:24 PM
>> > > > >> Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > > >>
>> > > > >> There is clearly something very wrong with your code. I would
>> suggest starting with an example code, for instance
>> KSP/examples/tutorials/ex56.c, and incrementally adding your operator to
>> get something that is working.
>> > > > >>
>> > > > >>
>> > > > >> Also if you have multiple degrees of freedom per vertex, for
>> instance elasticity, then you want to set the block size of the matrix
>> accordingly.
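
A one-line sketch of that (assuming, say, 3D elasticity with 3 displacement dofs per vertex; set it before preallocation/assembly):

    MatSetBlockSize(A, 3);   /* 3 dofs per vertex; GAMG uses the block size when aggregating */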
>> > > > >>
>> > > > >> On Wednesday, July 27, 2016, Ari Rappaport < arir at vosssci.com >
>> wrote:
>> > > > >>
>> > > > >>
>> > > > >> Hi Mark,
>> > > > >> We added the Jacobi line and it appears to accept that flag now.
>> However, we are getting all zeros for the solution vector. And PETSc is
>> claiming to have converged in 7 iterations to the relative tolerance.
>> > > > >>
>> > > > >> -Ari
>> > > > >>
>> > > > >> ----- Original Message -----
>> > > > >> From: "Mark Adams" < mfadams at lbl.gov >
>> > > > >> To: "Ari Rappaport" < arir at vosssci.com >, "For users of the
>> development version of PETSc" < petsc-dev at mcs.anl.gov >
>> > > > >> Sent: Wednesday, July 27, 2016 3:26:03 PM
>> > > > >> Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > > >>
>> > > > >>
>> > > > >> Please keep this on the PETSc list.
>> > > > >>
>> > > > >> We seem to have lost the Jacobi smoother again. I suspect you have
>> some funny character in your line with Jacobi that is confusing the parser.
>> Go back to the old file with the two Jacobi entries, delete the other line,
>> and get Jacobi in the KSP view output. There should be no SOR in the output.
>> > > > >>
>> > > > >>
>> > > > >> On Wednesday, July 27, 2016, Ari Rappaport < arir at vosssci.com >
>> wrote:
>> > > > >>
>> > > > >>
>> > > > >> Hi Mark,
>> > > > >> I added all these new things. PCGAMG is now finishing very
>> quickly, but the residual is unreasonably large, by about 10 orders of
>> magnitude. I noticed the line "Linear solve did not converge due to
>> DIVERGED_INDEFINITE_PC iterations 2" in the output; could this be causing
>> the problem? It only appears to run for 2 iterations now.
>> > > > >>
>> > > > >> -Ari
>> > > > >>
>> > > > >> ----- Original Message -----
>> > > > >> From: "Mark Adams" < mfadams at lbl.gov >
>> > > > >> To: "Ari Rappaport" < arir at vosssci.com >
>> > > > >> Cc: "For users of the development version of PETSc" <
>> petsc-dev at mcs.anl.gov >
>> > > > >> Sent: Tuesday, July 26, 2016 5:07:34 PM
>> > > > >> Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > > >>
>> > > > >>
>> > > > >> Ari, I would also check that your operator is not messed up in
>> parallel. The solver is looking pretty solid.
>> > > > >>
>> > > > >>
>> > > > >> Also, you can configure PETSc with hypre and use '-pc_type
>> hypre'. If hypre is also good in serial but hosed on multi-proc then it is
>> most probably your operator.
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> On Tue, Jul 26, 2016 at 6:58 PM, Mark Adams < mfadams at lbl.gov >
>> wrote:
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> So remove one of the -mg_levels_pc_type jacobi and add
>> -mg_coarse_ksp_type preonly, then verify that this works on one proc and
>> then try two procs.
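
For example (a sketch, with ./yourapp standing in for your executable and reusing the options already discussed in this thread):

    mpiexec -n 1 ./yourapp -ksp_view -ksp_converged_reason
    mpiexec -n 2 ./yourapp -ksp_view -ksp_converged_reason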
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> On Tue, Jul 26, 2016 at 6:56 PM, Mark Adams < mfadams at lbl.gov >
>> wrote:
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> Oh, actually this worked. You have this ...pc_type jacobi in
>> there twice, so one of them was "unused".
>> > > > >>
>> > > > >>
>> > > > >> Try this with 2 processors now.
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> On Tue, Jul 26, 2016 at 6:42 PM, Mark Adams < mfadams at lbl.gov >
>> wrote:
>> > > > >>
>> > > > >> On Tue, Jul 26, 2016 at 6:24 PM, Ari Rappaport <
>> arir at vosssci.com > wrote:
>> > > > >>
>> > > > >>
>> > > > >> So I commented out the line PCSetType(pc, PCGAMG). The line
>> KSPSetFromOptions(ksp) was already in the code at the end of our
>> initialization routine. I also added .petscrc to the working dir. Here is
>> the current output. It seems as if "Option left: name:-mg_levels_pc_type
>> jacobi (no value)" is still present in the output... I dunno.
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> Yeah, I dunno either. If you use -help you will get a printout of
>> the available options. If you do this you will see stuff like -mg_levels_1_
>> ... You can also see this in the ksp_view output. There is a shortcut that
>> lets you _not_ put "_1" in. Try putting this in for each level, like so:
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> -mg_levels_1_pc_type jacobi
>> > > > >>
>> > > > >> -mg_levels_2_pc_type jacobi
>> > > > >> -mg_levels_3_pc_type jacobi
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> I also notice that the coarse grid ksp is GMRES. This is our
>> fault. It should be preonly. Add:
>> > > > >>
>> > > > >>
>> > > > >> -mg_coarse_ksp_type preonly
>> > > > >>
>> > > > >> -Ari
>> > > > >>
>> > > > >> ----- Original Message -----
>> > > > >> From: "Mark Adams" < mfadams at lbl.gov >
>> > > > >> To: "Ari Rappaport" < arir at vosssci.com >, "For users of the
>> development version of PETSc" < petsc-dev at mcs.anl.gov >
>> > > > >> Sent: Tuesday, July 26, 2016 4:03:03 PM
>> > > > >> Subject: Re: [petsc-dev] Algebraic Multigrid
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> At the end of this you have:
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> #PETSc Option Table entries:
>> > > > >> -ksp_view
>> > > > >> -mg_levels_pc_type jacobi
>> > > > >> -options_left
>> > > > >> #End of PETSc Option Table entries
>> > > > >> There is one unused database option. It is:
>> > > > >> Option left: name:-mg_levels_pc_type jacobi (no value)
>> > > > >>
>> > > > >>
>> > > > >> So this jacobi parameter is not being used.
>> > > > >>
>> > > > >>
>> > > > >> Do you call KSPSetFromOptions? Do you set solver parameters in
>> the code? Like PCGAMG?
>> > > > >>
>> > > > >>
>> > > > >> You should not set anything in the code; it just confuses things
>> at this point. Use KSPSetFromOptions(). You can hardwire stuff before this
>> call (that just sets the defaults), but you should always call it last so
>> that command line parameters override the defaults.
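
A minimal sketch of that ordering (A, b, and x are assumed to exist; error checking omitted):

    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCGAMG);      /* hardwired default only */
    KSPSetFromOptions(ksp);     /* call last: command line / .petscrc overrides the defaults */
    KSPSolve(ksp, b, x);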
>> > > > >>
>> > > > >>
>> > > > >> You can put this in a .petscrc file in the working directory and
>> try again.
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >> -ksp_type cg
>> > > > >> -ksp_max_it 50
>> > > > >>
>> > > > >> -ksp_rtol 1.e-6
>> > > > >> -ksp_converged_reason
>> > > > >> -pc_type gamg
>> > > > >> -pc_gamg_type agg
>> > > > >> -pc_gamg_agg_nsmooths 1
>> > > > >> -pc_gamg_coarse_eq_limit 10
>> > > > >> -pc_gamg_reuse_interpolation true
>> > > > >> -pc_gamg_square_graph 1
>> > > > >> -pc_gamg_threshold -0.05
>> > > > >> -mg_levels_ksp_max_it 2
>> > > > >> -mg_levels_ksp_type chebyshev
>> > > > >> -mg_levels_esteig_ksp_type cg
>> > > > >> -mg_levels_esteig_ksp_max_it 10
>> > > > >> -mg_levels_ksp_chebyshev_esteig 0,.05,0,1.05
>> > > > >> -mg_levels_pc_type jacobi
>> > > > >> -pc_hypre_type boomeramg
>> > > > >> -pc_hypre_boomeramg_no_CF
>> > > > >> -pc_hypre_boomeramg_agg_nl 1
>> > > > >> -pc_hypre_boomeramg_coarsen_type HMIS
>> > > > >> -pc_hypre_boomeramg_interp_type ext+i
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >>
>> > > > >
>> > > >
>> > > >
>> > >
>> > >
>> >
>> >
>>
>>
>