[petsc-users] Moving from KSPSetNullSpace to MatSetNullSpace

Matthew Knepley knepley at gmail.com
Tue Oct 25 20:28:50 CDT 2016


On Tue, Oct 25, 2016 at 7:57 PM, Olivier Mesnard <olivier.mesnard8 at gmail.com> wrote:

> On 25 October 2016 at 19:59, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Tue, Oct 25, 2016 at 6:22 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>> > On Oct 25, 2016, at 5:39 PM, Olivier Mesnard <
>>> olivier.mesnard8 at gmail.com> wrote:
>>> >
>>> > On 25 October 2016 at 17:51, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>> >
>>> >   Olivier,
>>> >
>>> >     In theory you do not need to change anything else. Are you using a
>>> different matrix object for the velocity_ksp object than the poisson_ksp
>>> object?
>>> >
>>> > The matrix is different for the velocity_ksp and the poisson_ksp.
>>> >
>>> >     The code change in PETSc is very small, but we have a report from
>>> another CFD user who also had problems with the change, so there may be
>>> some subtle bug that we can't figure out that causes things not to behave
>>> properly.
>>> >
>>> >    First run the 3.7.4 code with -poisson_ksp_view and verify that,
>>> when it prints the matrix information, it prints something like "has
>>> attached null space". If it does not print that, it means that somehow
>>> the null space is not properly getting attached to the matrix.
>>> >
>>> > When running with 3.7.4 and -poisson_ksp_view, the output shows that
>>> the null space is not attached to the KSP (as it was with 3.5.4); however,
>>> the print statement is now under the Mat info (which is expected when
>>> moving from KSPSetNullSpace to MatSetNullSpace?).
>>>
>>>    Good, this is how it should be.
>>> >
>>> >     Though older versions had MatSetNullSpace(), they didn't
>>> necessarily associate it with the KSP, so it was not expected to work as
>>> a replacement for KSPSetNullSpace() in older versions.
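
As a minimal sketch, assuming the Poisson matrix's null space is spanned by
the constant vector (the names A and poisson_ksp below are placeholders, not
PetIBM's actual variables), the change amounts to:

  MatNullSpace nsp;
  /* null space spanned by the constant vector */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
  /* PETSc 3.5 and earlier: attach the null space to the solver */
  /*   KSPSetNullSpace(poisson_ksp, nsp); */
  /* PETSc 3.6 and later: attach it to the matrix the KSP solves with */
  MatSetNullSpace(A, nsp);
  MatNullSpaceDestroy(&nsp);
  KSPSetOperators(poisson_ksp, A, A);

In 3.6 and later the KSP picks the null space up from the operator, so no
KSP-level call is needed.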
>>> >
>>> >     Because our other user had great difficulty trying to debug the
>>> issue, feel free to send us at petsc-maint at mcs.anl.gov your code with
>>> instructions on building and running it, and we can try to track down the
>>> problem. That is better than hours and hours spent on fruitless email. We
>>> will, of course, not distribute the code and will delete it when we are
>>> finished with it.
>>> >
>>> > The code is open-source and hosted on GitHub (
>>> https://github.com/barbagroup/PetIBM).
>>> > I just pushed the branches `feature-compatible-petsc-3.7` and
>>> `revert-compatible-petsc-3.5` that I used to observe this problem.
>>> >
>>>     Thanks, I'll get back to you if I discover anything.
>>
>>
>> Obviously GAMG is behaving quite differently (1 vs 2 levels and a much
>> sparser coarse problem in 3.7).
>>
>> Could you try one thing for me before we start running it? Run with
>>
>>   -poisson_mg_coarse_sub_pc_type svd
>>
>> and see what happens on 2 procs for 3.7?
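
Combined with the run command given further down in this thread, that would
look something like:

  mpiexec -n 2 <path-to-petibm-build>/bin/petibm2d -directory <path-to-example> \
      -poisson_mg_coarse_sub_pc_type svd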
>>
> Hi Matt,
>
> With -poisson_mg_coarse_sub_pc_type svd, it ran normally on 1 proc but not
> on 2 procs; the end of the output says:
>
> "** On entry to DGESVD parameter number  6 had an illegal value
> "
>

Something is wrong with your 3.7 installation. That parameter is a simple
size, so I suspect memory corruption. Run with valgrind.
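
One way to do that (a sketch only; the paths are the same placeholders used
in the run command later in this thread, and the valgrind options can be
adjusted):

  mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
      --log-file=valgrind.log.%p \
      <path-to-petibm-build>/bin/petibm2d -directory <path-to-example>

This writes one log file per MPI rank; any invalid reads or writes reported
before the DGESVD error are the first thing to look at.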

  Thanks,

     Matt


> I attached the log file.
>>
>
>>   Thanks,
>>
>>      Matt
>>
>>
>>>
>>> > PETSc (both 3.5.4 and 3.7.4) was configured as follows:
>>> > export PETSC_ARCH="linux-gnu-dbg"
>>> > ./configure --PETSC_ARCH=$PETSC_ARCH \
>>> >       --with-cc=gcc \
>>> >       --with-cxx=g++ \
>>> >       --with-fc=gfortran \
>>> >       --COPTFLAGS="-O0" \
>>> >       --CXXOPTFLAGS="-O0" \
>>> >       --FOPTFLAGS="-O0" \
>>> >       --with-debugging=1 \
>>> >       --download-fblaslapack \
>>> >       --download-mpich \
>>> >       --download-hypre \
>>> >       --download-yaml \
>>> >       --with-x=1
>>> >
>>> > Our code was built using the following commands:
>>> > mkdir petibm-build
>>> > cd petibm-build
>>> > export PETSC_DIR=<directory of PETSc>
>>> > export PETSC_ARCH="linux-gnu-dbg"
>>> > export PETIBM_DIR=<directory of PetIBM git repo>
>>> > $PETIBM_DIR/configure --prefix=$PWD \
>>> >       CXX=$PETSC_DIR/$PETSC_ARCH/bin/mpicxx \
>>> >       CXXFLAGS="-g -O0 -std=c++11"
>>> > make all
>>> > make install
>>> >
>>> > Then
>>> > cd examples
>>> > make examples
>>> >
>>> > The example of the lid-driven cavity I was talking about can be found
>>> in the folder `examples/2d/convergence/lidDrivenCavity20/20/`.
>>> >
>>> > To run it:
>>> > mpiexec -n N <path-to-petibm-build>/bin/petibm2d -directory
>>> <path-to-example>
>>> >
>>> > Let me know if you need more info. Thank you.
>>> >
>>> >    Barry
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > > On Oct 25, 2016, at 4:38 PM, Olivier Mesnard <
>>> olivier.mesnard8 at gmail.com> wrote:
>>> > >
>>> > > Hi all,
>>> > >
>>> > > We develop a CFD code, based on the PETSc library, that solves the
>>> Navier-Stokes equations using the fractional-step method from Perot (1993).
>>> > > At each time-step, we solve two systems: one for the velocity field,
>>> the other, a Poisson system, for the pressure field.
>>> > > One of our test-cases is a 2D lid-driven cavity flow (Re=100) on a
>>> 20x20 grid using 1 or 2 procs.
>>> > > For the Poisson system, we usually use CG preconditioned with GAMG.
>>> > >
>>> > > So far, we have been using PETSc-3.5.4, and we would like to update
>>> the code to the latest release: 3.7.4.
>>> > >
>>> > > As suggested in the changelog of 3.6, we replaced the routine
>>> `KSPSetNullSpace()` with `MatSetNullSpace()`.
>>> > >
>>> > > Here is the list of options we use to configure the two solvers:
>>> > > * Velocity solver: prefix `-velocity_`
>>> > >   -velocity_ksp_type bcgs
>>> > >   -velocity_ksp_rtol 1.0E-08
>>> > >   -velocity_ksp_atol 0.0
>>> > >   -velocity_ksp_max_it 10000
>>> > >   -velocity_pc_type jacobi
>>> > >   -velocity_ksp_view
>>> > >   -velocity_ksp_monitor_true_residual
>>> > >   -velocity_ksp_converged_reason
>>> > > * Poisson solver: prefix `-poisson_`
>>> > >   -poisson_ksp_type cg
>>> > >   -poisson_ksp_rtol 1.0E-08
>>> > >   -poisson_ksp_atol 0.0
>>> > >   -poisson_ksp_max_it 20000
>>> > >   -poisson_pc_type gamg
>>> > >   -poisson_pc_gamg_type agg
>>> > >   -poisson_pc_gamg_agg_nsmooths 1
>>> > >   -poisson_ksp_view
>>> > >   -poisson_ksp_monitor_true_residual
>>> > >   -poisson_ksp_converged_reason
>>> > >
>>> > > With 3.5.4, the case runs normally on 1 or 2 procs.
>>> > > With 3.7.4, the case runs normally on 1 proc but not on 2.
>>> > > Why? The Poisson solver diverges because of an indefinite
>>> preconditioner (only with 2 procs).
>>> > >
>>> > > We also saw that the routine `MatSetNullSpace()` was already
>>> available in 3.5.4.
>>> > > With 3.5.4, replacing `KSPSetNullSpace()` with `MatSetNullSpace()`
>>> led to the Poisson solver diverging because of an indefinite matrix (on 1
>>> and 2 procs).
>>> > >
>>> > > Thus, we were wondering if we needed to update something else for
>>> the KSP, and not just modify the name of the routine?
>>> > >
>>> > > I have attached the output files from the different cases:
>>> > > * `run-petsc-3.5.4-n1.log` (3.5.4, `KSPSetNullSpace()`, n=1)
>>> > > * `run-petsc-3.5.4-n2.log`
>>> > > * `run-petsc-3.5.4-nsp-n1.log` (3.5.4, `MatSetNullSpace()`, n=1)
>>> > > * `run-petsc-3.5.4-nsp-n2.log`
>>> > > * `run-petsc-3.7.4-n1.log` (3.7.4, `MatSetNullSpace()`, n=1)
>>> > > * `run-petsc-3.7.4-n2.log`
>>> > >
>>> > > Thank you for your help,
>>> > > Olivier
>>> > > <run-petsc-3.5.4-n1.log><run-petsc-3.5.4-n2.log><run-petsc-3.5.4-nsp-n1.log>
>>> <run-petsc-3.5.4-nsp-n2.log><run-petsc-3.7.4-n1.log><run-petsc-3.7.4-n2.log>
>>> >
>>> >
>>>
>>>
>>
>>
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


More information about the petsc-users mailing list