[petsc-users] unreliable AMG in PETSc

Barry Smith bsmith at mcs.anl.gov
Wed Oct 22 16:17:53 CDT 2014


  Arnem,

   I was able to reproduce your failures. ml is a rather old code from Sandia that is no longer getting new development (they have been working on a complete replacement for several years).

   I ran your tests using the PETSc AMG solver -pc_type gamg (written largely by Mark Adams) with its default options, and it converged for all your cases from 1 to 32 processes with no failure of positive definiteness etc.

   That said, the convergence rate is not great (around 70+ iterations) and I did not run timing comparisons. For reference:

75 iterations on 1 processor
76 iterations on 16 processors
78 iterations on 32 processors

  Do not run gamg with PETSc 3.4, only with PETSc 3.5.

  I would suggest you drop ml, run with gamg, and send us reports on any problems that come up. We are actively interested in improving gamg based on your feedback.
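
   For reference, assuming your driver calls KSPSetFromOptions() so that it picks up runtime options (the binary name below is just a placeholder), the switch amounts to something like

     mpirun -np 16 ./your_app -pc_type gamg -ksp_monitor -ksp_converged_reason

   Adding -ksp_view will print the full solver configuration that gamg ends up using.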

  Barry


> On Oct 22, 2014, at 8:21 AM, Arne Morten Kvarving <arne.morten.kvarving at sintef.no> wrote:
> 
> hi there.
> 
> we (sintef ict) use (parts of) PETSc in our (isogeometric) finite element library.
> sadly i find that the AMG in petsc is somewhat unreliable, and that makes it a pain to support our users (students).
> 
> now, we all know that linear algebra, and AMG in particular, is more art than craft at times, certainly to the uneducated, and you can only do so much to automate it. but segfaults are bad. preconditioners which break the symmetry of the systems are bad. preconditioners that break the definiteness of systems are also bad. and worse: here it depends on the number of MPI processes. meshes are typically targeted at a given world size, and if you are unlucky, that number is one that does not work.
> 
> as an example to illustrate this, consider
> 
> http://www.math.ntnu.no/~arnemort/petsc-amg-bug.tar.bz2
> 
> it's about 30MB due to a somewhat large matrix, hopefully your pipes will survive.
> 
> this is a standard (isogeometric) Poisson problem using cubic splines, with homogeneous Dirichlet boundaries and a uniform source (f = 1), on a mesh intended for simulating flow around wings, but i used the Poisson problem to keep things simple. see
> 
> http://www.math.ntnu.no/~arnemort/wing.png
> 
> if you want to get an idea.
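> 
> (in equation form, that is -Δu = 1 in the domain and u = 0 on the whole boundary.)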
> 
> the tarball has the matrix and vector, as well as a cmake-based build system for a simple test app (hacked ex23.c). it also contains a number of tests (64 in total) to illustrate the unreliable behavior.
> in particular, the tests solve the stored linear system using 1 through 32 processors, with either amg + sor or asm + ilu. the former fails and crashes for several runs, while the latter runs through for all of them (it is included to show that the system is in fact solvable, so that is not in question).
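> 
> for concreteness, the two option sets are roughly along these lines (the exact options used are in the test definitions in the tarball):
> 
>   -pc_type ml  -mg_levels_pc_type sor    (the "amg + sor" runs)
>   -pc_type asm -sub_pc_type ilu          (the "asm + ilu" runs)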
> 
> i have tried this with v3.4.2, v3.4.4 and v3.5.2 - they all fail. in v3.4.2 one more test fails compared to the others. i have also confirmed that i can reproduce this across systems and across toolchains (gcc 4.8 and icc 14.0). the failures i get are
> 
> 27:amg_14
> 39:amg_20
> 41:amg_21
> 43:amg_22
> 45:amg_23
> 49:amg_25
> 51:amg_26
> 53:amg_27
> 55:amg_28
> 57:amg_29
> 61:amg_31
> 63:amg_32
> 
> do not put much thought into the configurations, or the high iteration counts. in the "real" code stronger smoothers etc. are employed, but again i wanted to keep it simple.
> 
> i do suspect that ultimately the cause is ML. disclaimer: i have never been able to (or found the energy to) decode the "slightly unreadable" code of petsc, so i haven't tried to dig.
> 
> i do realize this "bug report" has a number of holes (as far as the information it contains is concerned), and i can provide whatever you need upon request. but i have already sunk too much time into this and am behind on my projects :)
> 
> usage instructions below.
> 
> regards,
> 
> arnem
> 
> ---
> 
> usage: requires petsc >= 3.4, some c++ compiler and cmake.
> 
> export PETSC_DIR=<dir to petsc>
> export PETSC_ARCH=<petsc configuration>
> 
> basic usage:
> tar jxvf petsc-amg-bug.tar.bz2
> cd petsc-amg-bug
> cmake . -DCMAKE_BUILD_TYPE=Release
> make
> make test
> 
> for verbose output of the process, replace 'make test' with 'ctest -V'
> 
> if your system does not provide the mpi spawner as 'mpirun', you can pass -DMPIRUN_COMMAND=<path-to-binary> to cmake. DO NOTE: you cannot use petscmpiexec, as that does not return the error code to the terminal (probably should report that as a separate bug). or well, you can use it, but if you do, a failed test will be reported as a success.
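> 
> for example, to point the tests at a specific launcher (adjust the path to your installation):
> 
> cmake . -DCMAKE_BUILD_TYPE=Release -DMPIRUN_COMMAND=/usr/bin/mpiexec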
> 
> you can execute particular tests only using ctest -R <re to match> -V
> 
> to run only the amg tests do
> 
> ctest -R amg.* -V
> 
> to run only a given amg test do
> 
> ctest -R amg_5 -V


