[petsc-users] Moving from KSPSetNullSpace to MatSetNullSpace

Mark Adams mfadams at lbl.gov
Wed Oct 26 08:38:00 CDT 2016


Please run with -info and grep on GAMG and send that. (-info is very noisy).
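
For example (a sketch using the run command from further down in this
thread; adjust the executable path and process count to your case):

mpiexec -n 2 <path-to-petibm-build>/bin/petibm2d -directory <path-to-example> -info 2>&1 | grep GAMG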

I'm not sure what is going on here: divergence that appears only in
parallel. Here are some suggestions.

Note, you do not need to set the null space for a scalar (Poisson) problem
unless you have some special null space. And not setting it (the 6 rigid
body modes) for the velocity (elasticity) equation will only degrade
convergence rates, not cause divergence.
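
If you do want to provide the rigid body modes to GAMG for an elasticity
solve, a minimal sketch (assuming A is your velocity matrix and coords is a
Vec holding the nodal coordinates; both names are placeholders) is:

MatNullSpace nearnull;
MatNullSpaceCreateRigidBody(coords, &nearnull); /* rigid body modes from nodal coordinates */
MatSetNearNullSpace(A, nearnull);               /* GAMG uses this to build the coarse spaces */
MatNullSpaceDestroy(&nearnull);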

There was a bug for a while (early 3.7 versions) where the coarse grid was
not squeezed onto one processor, which could result in very bad
convergence, but not divergence, on multiple processors (the -info output
will report the number of 'active pes'). Perhaps this bug is causing
divergence for you. We had another subtle bug where the eigenvalue
estimates used a bad seed vector, which gives a bad estimate; this would
cause divergence, but it should not be a parallelism issue. (These two bugs
were both regressions in around 3.7.)

Divergence usually comes from a bad eigenvalue estimate in a Chebyshev
smoother, but this is not highly correlated with parallelism. The -info
output reports the eigenvalue estimates; that is not terribly useful on its
own, but you can check whether the estimate changes (gets larger) with
better parameters. Add these parameters, with the correct prefix (an
example with your prefix follows the list), and use -options_left to make
sure there are no unused options:

-mg_levels_ksp_type chebyshev
-mg_levels_esteig_ksp_type cg
-mg_levels_esteig_ksp_max_it 10
-mg_levels_ksp_chebyshev_esteig 0,.1,0,1.05
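
With the '-poisson_' prefix from your options below, these would be
spelled (a sketch):

-poisson_mg_levels_ksp_type chebyshev
-poisson_mg_levels_esteig_ksp_type cg
-poisson_mg_levels_esteig_ksp_max_it 10
-poisson_mg_levels_ksp_chebyshev_esteig 0,.1,0,1.05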

chebyshev is the default; as Barry suggested, replace it with gmres or
richardson (see below) and verify that this fixes the divergence problem.

If your matrix is symmetric positive definite (SPD), use
'-mg_levels_esteig_ksp_type cg'; if not, use the default gmres.

Increase/decrease '-mg_levels_esteig_ksp_max_it 10'; you should see the
estimates increase and converge as max_it gets larger. Setting this to a
huge number, like 100, should fix the bad-seed-vector problem mentioned
above.

If eigenvalue estimates are a pain, as with non-SPD systems, then
richardson is an option (instead of chebyshev):

-mg_levels_ksp_type richardson
-mg_levels_ksp_richardson_scale 0.6

You then need to play with the scale factor (that is essentially what
chebyshev does for you).
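
For example, if the smoother diverges with a scale of 0.6, try a smaller
value (a sketch; the right scale is problem dependent):

-mg_levels_ksp_richardson_scale 0.3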


On Tue, Oct 25, 2016 at 10:22 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Oct 25, 2016 at 9:20 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>   Olivier,
>>
>>     Ok, so I've run the code in the debugger, but I do not think the
>> problem is with the null space. The code is correctly removing the null
>> space on all the levels of multigrid.
>>
>>     I think the error comes from changes in the behavior of GAMG. GAMG
>> has been moving relatively rapidly, with different defaults and even
>> different code in each release.
>>
>>     To check this I added the option -poisson_mg_levels_pc_sor_lits 2 and
>> it stopped complaining about KSP_DIVERGED_INDEFINITE_PC. I've seen this
>> before: when the smoother is "too weak", the net result is that the action
>> of the preconditioner is indefinite. Mark Adams probably has better
>> suggestions on how to make the preconditioner behave. Note you could also
>> use a KSP of richardson or gmres instead of cg, since they don't care
>> about this indefinite business.
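>>
>>   For example, with the prefixes from Olivier's options:
>>
>>   -poisson_mg_levels_pc_sor_lits 2
>>   -poisson_ksp_type gmres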
>
>
> I think old GAMG squared the graph by default. You can see in the 3.7
> output that it does not.
>
>    Matt
>
>
>>
>>    Barry
>>
>>
>>
>> > On Oct 25, 2016, at 5:39 PM, Olivier Mesnard <
>> olivier.mesnard8 at gmail.com> wrote:
>> >
>> > On 25 October 2016 at 17:51, Barry Smith <bsmith at mcs.anl.gov> wrote:
>> >
>> >   Olivier,
>> >
>>     In theory you do not need to change anything else. Are you using a
>> different matrix object for the velocity_ksp than for the poisson_ksp?
>> >
>> > The matrix is different for the velocity_ksp and the poisson_ksp.
>> >
>>     The code change in PETSc is very small, but we have a report from
>> another CFD user who also had problems with the change, so there may be
>> some subtle bug that we can't figure out causing things to not behave
>> properly.
>> >
>>    First run the 3.7.4 code with -poisson_ksp_view and verify that when
>> it prints the matrix information it prints something like "has attached
>> null space". If it does not print that, it means that somehow the matrix
>> is not properly getting the null space attached.
>> >
>> > When running with 3.7.4 and -poisson_ksp_view, the output shows that
>> the null space is not attached to the KSP (as it was with 3.5.4); however,
>> the print statement now appears under the Mat info (which is expected when
>> moving from KSPSetNullSpace to MatSetNullSpace?).
>> >
>>     Though older versions had MatSetNullSpace(), they didn't necessarily
>> associate it with the KSP, so it was not expected to work as a replacement
>> for KSPSetNullSpace() in those versions.
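>>
>>     A minimal sketch of the new idiom (assuming a matrix A with a
>> constant null space, e.g. a singular Neumann Poisson operator; A and
>> ksp are placeholders):
>>
>>     MatNullSpace nsp;
>>     MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
>>     MatSetNullSpace(A, nsp);  /* replaces KSPSetNullSpace(ksp, nsp) */
>>     MatNullSpaceDestroy(&nsp);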
>> >
>>     Because our other user had great difficulty trying to debug the
>> issue, feel free to send us your code at petsc-maint at mcs.anl.gov with
>> instructions on building and running it, and we can try to track down the
>> problem. Better than hours and hours spent on fruitless email. We will,
>> of course, not distribute the code and will delete it when we are
>> finished with it.
>> >
>> > The code is open-source and hosted on GitHub (
>> https://github.com/barbagroup/PetIBM).
>> > I just pushed the branches `feature-compatible-petsc-3.7` and
>> `revert-compatible-petsc-3.5` that I used to observe this problem.
>> >
>> > PETSc (both 3.5.4 and 3.7.4) was configured as follows:
>> > export PETSC_ARCH="linux-gnu-dbg"
>> > ./configure --PETSC_ARCH=$PETSC_ARCH \
>> >       --with-cc=gcc \
>> >       --with-cxx=g++ \
>> >       --with-fc=gfortran \
>> >       --COPTFLAGS="-O0" \
>> >       --CXXOPTFLAGS="-O0" \
>> >       --FOPTFLAGS="-O0" \
>> >       --with-debugging=1 \
>> >       --download-fblaslapack \
>> >       --download-mpich \
>> >       --download-hypre \
>> >       --download-yaml \
>> >       --with-x=1
>> >
>> > Our code was built using the following commands:
>> > mkdir petibm-build
>> > cd petibm-build
>> > export PETSC_DIR=<directory of PETSc>
>> > export PETSC_ARCH="linux-gnu-dbg"
>> > export PETIBM_DIR=<directory of PetIBM git repo>
>> > $PETIBM_DIR/configure --prefix=$PWD \
>> >       CXX=$PETSC_DIR/$PETSC_ARCH/bin/mpicxx \
>> >       CXXFLAGS="-g -O0 -std=c++11"
>> > make all
>> > make install
>> >
>> > Then:
>> > cd examples
>> > make examples
>> >
>> > The example of the lid-driven cavity I was talking about can be found
>> in the folder `examples/2d/convergence/lidDrivenCavity20/20/`.
>> >
>> > To run it:
>> > mpiexec -n N <path-to-petibm-build>/bin/petibm2d -directory
>> <path-to-example>
>> >
>> > Let me know if you need more info. Thank you.
>> >
>> >    Barry
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>> > > On Oct 25, 2016, at 4:38 PM, Olivier Mesnard <
>> olivier.mesnard8 at gmail.com> wrote:
>> > >
>> > > Hi all,
>> > >
>> > > We develop a CFD code, based on the PETSc library, that solves the
>> Navier-Stokes equations with the fractional-step method of Perot (1993).
>> > > At each time-step, we solve two systems: one for the velocity field,
>> the other, a Poisson system, for the pressure field.
>> > > One of our test-cases is a 2D lid-driven cavity flow (Re=100) on a
>> 20x20 grid using 1 or 2 procs.
>> > > For the Poisson system, we usually use CG preconditioned with GAMG.
>> > >
>> > > So far, we have been using PETSc-3.5.4, and we would like to update
>> the code with the latest release: 3.7.4.
>> > >
>> > > As suggested in the changelog of 3.6, we replaced the routine
>> `KSPSetNullSpace()` with `MatSetNullSpace()`.
>> > >
>> > > Here is the list of options we use to configure the two solvers:
>> > > * Velocity solver: prefix `-velocity_`
>> > >   -velocity_ksp_type bcgs
>> > >   -velocity_ksp_rtol 1.0E-08
>> > >   -velocity_ksp_atol 0.0
>> > >   -velocity_ksp_max_it 10000
>> > >   -velocity_pc_type jacobi
>> > >   -velocity_ksp_view
>> > >   -velocity_ksp_monitor_true_residual
>> > >   -velocity_ksp_converged_reason
>> > > * Poisson solver: prefix `-poisson_`
>> > >   -poisson_ksp_type cg
>> > >   -poisson_ksp_rtol 1.0E-08
>> > >   -poisson_ksp_atol 0.0
>> > >   -poisson_ksp_max_it 20000
>> > >   -poisson_pc_type gamg
>> > >   -poisson_pc_gamg_type agg
>> > >   -poisson_pc_gamg_agg_nsmooths 1
>> > >   -poisson_ksp_view
>> > >   -poisson_ksp_monitor_true_residual
>> > >   -poisson_ksp_converged_reason
>> > >
>> > > With 3.5.4, the case runs normally on 1 or 2 procs.
>> > > With 3.7.4, the case runs normally on 1 proc but not on 2.
>> > > Why? The Poisson solver diverges because of an indefinite
>> preconditioner (only with 2 procs).
>> > >
>> > > We also saw that the routine `MatSetNullSpace()` was already
>> available in 3.5.4.
>> > > With 3.5.4, replacing `KSPSetNullSpace()` with `MatSetNullSpace()`
>> led to the Poisson solver diverging because of an indefinite matrix (on 1
>> and 2 procs).
>> > >
>> > > Thus, we were wondering if we needed to update something else for
>> the KSP, and not just change the name of the routine.
>> > >
>> > > I have attached the output files from the different cases:
>> > > * `run-petsc-3.5.4-n1.log` (3.5.4, `KSPSetNullSpace()`, n=1)
>> > > * `run-petsc-3.5.4-n2.log`
>> > > * `run-petsc-3.5.4-nsp-n1.log` (3.5.4, `MatSetNullSpace()`, n=1)
>> > > * `run-petsc-3.5.4-nsp-n2.log`
>> > > * `run-petsc-3.7.4-n1.log` (3.7.4, `MatSetNullSpace()`, n=1)
>> > > * `run-petsc-3.7.4-n2.log`
>> > >
>> > > Thank you for your help,
>> > > Olivier
>> >
>> >
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>