[petsc-users] Problem with TS and PCMG

Torquil Macdonald Sørensen torquil at gmail.com
Sat Dec 14 07:46:15 CST 2013


Hi!

I'm getting a segfault when trying to use TSStep or TSSolve with
TSBEULER and TS_LINEAR together with PCMG. I have therefore reduced my
code to a minimal (and mathematically trivial) test case that exhibits
the same error, and I'm hoping someone here will spot the mistake in
my TS setup.

I can use my PCMG restriction and interpolation matrices successfully
in a single linear solve with KSPSolve (and can see from -ksp_view that
MG is actually being used), but for some reason I cannot get PCMG
working with TSStep. Am I forgetting a step when setting up the TS?
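
(For reference, I believe the options-database equivalent of my
preconditioner setup would be roughly "-ksp_type fgmres -pc_type mg
-pc_mg_levels 2 -pc_mg_galerkin", with the interpolation and
restriction matrices still set in code; below I set everything
programmatically instead.)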

My test case tries to solve Id*dU/dt = Id*U, where Id is the 5x5
identity matrix and U is a 5-vector unknown, with initial condition
U = 0. The PCMG construction uses an additional coarse grid of 3
points. The fine grid vertices are labelled 0, 1, 2, 3, 4, and the
coarse grid vertices 0, 1, 2; the coarse vertices coincide with the
fine grid vertices 0, 2 and 4. The interpolation and restriction
matrices are the maps between the FEM spaces defined by piecewise
linear shape functions on these two grids.
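
To make the setup easier to check, these are the interpolation matrix
P (coarse-to-fine) and the injection-type restriction matrix R
(fine-to-coarse) that the code below assembles:

    P = | 1    0    0   |        R = | 1  0  0  0  0 |
        | 0.5  0.5  0   |            | 0  0  1  0  0 |
        | 0    1    0   |            | 0  0  0  0  1 |
        | 0    0.5  0.5 |
        | 0    0    1   |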

The GDB backtrace I'm getting is:

Program received signal SIGSEGV, Segmentation fault.
0x0000000000000000 in ?? ()
(gdb) where
#0  0x0000000000000000 in ?? ()
#1  0x00007fa0e6bbd172 in DMCoarsen (dm=0x28e9170, comm=0x7fa0e638d9c0 <ompi_mpi_comm_null>, dmc=0x295a490) at dm.c:1811
#2  0x00007fa0e6ff1c1e in PCSetUp_MG (pc=0x28d71a0) at mg.c:593
#3  0x00007fa0e72eb6ce in PCSetUp (pc=0x28d71a0) at precon.c:890
#4  0x00007fa0e6e6f7e8 in KSPSetUp (ksp=0x28e1a50) at itfunc.c:278
#5  0x00007fa0e6e70a30 in KSPSolve (ksp=0x28e1a50, b=0x297bea0, x=0x2981970) at itfunc.c:399
#6  0x00007fa0e6e8ae0c in SNESSolve_KSPONLY (snes=0x28c7710) at ksponly.c:44
#7  0x00007fa0e757b642 in SNESSolve (snes=0x28c7710, b=0x0, x=0x296c7d0) at snes.c:3636
#8  0x00007fa0e76280cc in TSStep_Theta (ts=0x2894220) at theta.c:183
#9  0x00007fa0e766a202 in TSStep (ts=0x2894220) at ts.c:2507
#10 0x0000000000403363 in main (argc=3, argv=0x7fff08015378) at /home/tmac/research/schrodinger/source/testcase.cpp:97
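
If I'm reading frame #0 correctly, the jump to address 0x0 looks like
a call through a null function pointer inside DMCoarsen, i.e. the DM
that ends up attached to the KSP in the TS case has no coarsening
routine set. No DM is involved in my plain KSPSolve test below, which
is perhaps why that case works.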

My test case code is:

#include "petscts.h"

int main(int argc, char ** argv)
{
    PetscErrorCode ierr = PetscInitialize(&argc, &argv, 0, 0); CHKERRQ(ierr);

    int const nb_dof_fine = 5;
    int const nb_dof_coarse = 3;

    Vec x; // A vector to hold the solution on the fine grid
    ierr = VecCreateSeq(PETSC_COMM_SELF, nb_dof_fine, &x); CHKERRQ(ierr);

    Mat Id; // Identity matrix on the vector space of the fine grid
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, nb_dof_fine, nb_dof_fine, 1, 0, &Id); CHKERRQ(ierr);
    for(int i = 0; i != nb_dof_fine; i++) { ierr = MatSetValue(Id, i, i, 1.0, INSERT_VALUES); CHKERRQ(ierr); }
    ierr = MatAssemblyBegin(Id, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(Id, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatView(Id, PETSC_VIEWER_STDOUT_SELF); CHKERRQ(ierr);

    Mat interp; // Multigrid interpolation matrix
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, nb_dof_fine, nb_dof_coarse, 2, 0, &interp); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 0, 0, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 1, 0, 0.5, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 1, 1, 0.5, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 2, 1, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 3, 1, 0.5, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 3, 2, 0.5, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(interp, 4, 2, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatAssemblyBegin(interp, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(interp, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatView(interp, PETSC_VIEWER_STDOUT_SELF); CHKERRQ(ierr);

    Mat restrict; // Multigrid restriction matrix, not the transpose of "interp"
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, nb_dof_coarse, nb_dof_fine, 1, 0, &restrict); CHKERRQ(ierr);
    ierr = MatSetValue(restrict, 0, 0, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(restrict, 1, 2, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(restrict, 2, 4, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatAssemblyBegin(restrict, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(restrict, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatView(restrict, PETSC_VIEWER_STDOUT_SELF); CHKERRQ(ierr);

    {    // Solving the linear problem Id*x = 0 works
        KSP ksp;
        ierr = KSPCreate(PETSC_COMM_SELF, &ksp); CHKERRQ(ierr);
        ierr = KSPSetType(ksp, KSPFGMRES); CHKERRQ(ierr);
        ierr = KSPSetOperators(ksp, Id, Id, SAME_PRECONDITIONER); CHKERRQ(ierr);

        PC pc;
        ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
        ierr = PCSetType(pc, PCMG); CHKERRQ(ierr);
        ierr = PCMGSetLevels(pc, 2, 0); CHKERRQ(ierr);
        ierr = PCMGSetGalerkin(pc, PETSC_TRUE); CHKERRQ(ierr);
        ierr = PCMGSetInterpolation(pc, 1, interp); CHKERRQ(ierr);
        ierr = PCMGSetRestriction(pc, 1, restrict); CHKERRQ(ierr);

        ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);

        Vec b; // Trivial RHS for the linear problem
        ierr = VecDuplicate(x, &b); CHKERRQ(ierr);
        ierr = VecSet(b, 0); CHKERRQ(ierr);
        ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
        ierr = VecView(x, PETSC_VIEWER_STDOUT_SELF); CHKERRQ(ierr);

        ierr = VecDestroy(&b); CHKERRQ(ierr);
        ierr = KSPDestroy(&ksp); CHKERRQ(ierr);

        ierr = PetscPrintf(PETSC_COMM_SELF, "KSPSolve worked\n"); CHKERRQ(ierr);
    }

    {    // This block of code causes a segfault
        TS ts;
        ierr = TSCreate(PETSC_COMM_SELF, &ts); CHKERRQ(ierr);
        ierr = TSSetProblemType(ts, TS_LINEAR); CHKERRQ(ierr);
        ierr = TSSetType(ts, TSBEULER); CHKERRQ(ierr); // Use an implicit scheme
        ierr = TSSetInitialTimeStep(ts, 0.0, 0.1); CHKERRQ(ierr);
        ierr = TSSetSolution(ts, x); CHKERRQ(ierr);
        ierr = TSSetRHSFunction(ts, 0, TSComputeRHSFunctionLinear, 0); CHKERRQ(ierr);
        ierr = TSSetRHSJacobian(ts, Id, Id, TSComputeRHSJacobianConstant, 0); CHKERRQ(ierr);
        ierr = TSSetIFunction(ts, 0, TSComputeIFunctionLinear, 0); CHKERRQ(ierr);
        ierr = TSSetIJacobian(ts, Id, Id, TSComputeIJacobianConstant, 0); CHKERRQ(ierr);

        KSP ksp;
        ierr = TSGetKSP(ts, &ksp); CHKERRQ(ierr);
        ierr = KSPSetType(ksp, KSPFGMRES); CHKERRQ(ierr);

        PC pc;
        ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
        ierr = PCSetType(pc, PCMG); CHKERRQ(ierr);
        ierr = PCMGSetLevels(pc, 2, 0); CHKERRQ(ierr);
        ierr = PCMGSetGalerkin(pc, PETSC_TRUE); CHKERRQ(ierr);
        ierr = PCMGSetInterpolation(pc, 1, interp); CHKERRQ(ierr);
        ierr = PCMGSetRestriction(pc, 1, restrict); CHKERRQ(ierr);

        ierr = TSSetUp(ts); CHKERRQ(ierr);
        ierr = TSStep(ts); CHKERRQ(ierr);

        ierr = TSDestroy(&ts); CHKERRQ(ierr);
    }

    ierr = MatDestroy(&restrict); CHKERRQ(ierr);
    ierr = MatDestroy(&interp); CHKERRQ(ierr);
    ierr = VecDestroy(&x); CHKERRQ(ierr);
    ierr = MatDestroy(&Id); CHKERRQ(ierr);
    ierr = PetscFinalize(); CHKERRQ(ierr);

    return 0;
}

The actual output of the program, when run without a debugger, is:

Matrix Object: 1 MPI processes
  type: seqaij
row 0: (0, 1)
row 1: (1, 1)
row 2: (2, 1)
row 3: (3, 1)
row 4: (4, 1)
Matrix Object: 1 MPI processes
  type: seqaij
row 0: (0, 1)
row 1: (0, 0.5)  (1, 0.5)
row 2: (1, 1)
row 3: (1, 0.5)  (2, 0.5)
row 4: (2, 1)
Matrix Object: 1 MPI processes
  type: seqaij
row 0: (0, 1)
row 1: (2, 1)
row 2: (4, 1)
Vector Object: 1 MPI processes
  type: seq
0
0
0
0
0
KSPSolve worked
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] DMCoarsen line 1809 src/dm/interface/dm.c
[0]PETSC ERROR: [0] PCSetUp_MG line 546 src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCSetUp line 868 src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 192 src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 356 src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] SNESSolve_KSPONLY line 14
src/snes/impls/ksponly/ksponly.c
[0]PETSC ERROR: [0] SNESSolve line 3589 src/snes/interface/snes.c
[0]PETSC ERROR: [0] TSStep_Theta line 162
src/ts/impls/implicit/theta/theta.c
[0]PETSC ERROR: [0] TSStep line 2499 src/ts/interface/ts.c
[0]PETSC ERROR: --------------------- Error Message
------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: ./testcase.elf on a linux-gnu-c-debug named lenovo by
tmac Sat Dec 14 14:35:56 2013
[0]PETSC ERROR: Libraries linked from /tmp/petsc-3.4.3/linux-gnu-c-debug/lib
[0]PETSC ERROR: Configure run at Fri Nov 22 11:07:41 2013
[0]PETSC ERROR: Configure options --with-shared-libraries
--with-debugging=1 --useThreads 0 --with-clanguage=C++ --with-c-support
--with-fortran-interfaces=1 --with-scalar-type=complex
--with-mpi-dir=/usr/lib/openmpi --with-blas-lib=-lblas
--with-lapack-lib=-llapack --with-blacs=1
--with-blacs-include=/usr/include
--with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]"
--with-scalapack=1 --with-scalapack-include=/usr/include
--with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1
--with-mumps-include=/usr/include
--with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]"
--with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse
--with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]"
--with-cholmod=1 --with-cholmod-include=/usr/include/suitesparse
--with-cholmod-lib=/usr/lib/libcholmod.so --with-spooles=1
--with-spooles-include=/usr/include/spooles
--with-spooles-lib=/usr/lib/libspooles.so --with-ptscotch=1
--with-ptscotch-include=/usr/include/scotch
--with-ptscotch-lib="[/usr/lib/libptesmumps.so,/usr/lib/libptscotch.so,/usr/lib/libptscotcherr.so]"
--with-fftw=1 --with-fftw-include=/usr/include
--with-fftw-lib="[/usr/lib/x86_64-linux-gnu/libfftw3.so,/usr/lib/x86_64-linux-gnu/libfftw3_mpi.so]"
--with-superlu=1 --with-superlu-include=/usr/include/superlu
--with-superlu-lib=/usr/lib/libsuperlu.so
--CXX_LINKER_FLAGS=-Wl,--no-as-needed
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory
unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Best regards
Torquil Sørensen


