[petsc-users] petsc error RE pc_mg_levels usage

Matthew Knepley knepley at gmail.com
Wed Oct 7 19:20:47 CDT 2015


On Wed, Oct 7, 2015 at 6:04 PM, Boris <borisbou at buffalo.edu> wrote:

> Hello,
>
> I am working on a toy Laplace problem (p1.C attached) investigating some
> multigrid projection matrices when using finite differences. Things seem to
> work as expected when running with :
>
> ./p1 -da_grid_x 21 -da_grid_y 21 -mat_view -pc_type mg -pc_mg_levels 1
>
> But if I change pc_mg_levels to 2 or 3, I generate the error below.
>
> Any insights as to what I could be doing wrong would be much appreciated.
>

It looks like your FormJacobian() is being called with a DM that you do not
expect. I think it is being called for a coarse grid, but you are pulling the
same fine-grid DM out of your user context.

Take a look at SNES ex5, where we solve exactly this problem.
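
Roughly, the pattern looks like the sketch below (this is not your p1.C or the
ex5 code; the stencil values and Dirichlet boundary handling are placeholders).
The point is that FormJacobian() asks the SNES for its DM, so it assembles on
whatever level PCMG hands it instead of always using the fine-grid DMDA:

#include <petscsnes.h>
#include <petscdmda.h>

PetscErrorCode FormJacobian(SNES snes,Vec x,Mat J,Mat B,void *ctx)
{
  DM             da;
  DMDALocalInfo  info;
  MatStencil     row,col[5];
  PetscScalar    v[5];
  PetscInt       i,j;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESGetDM(snes,&da);CHKERRQ(ierr);        /* DM for the level being assembled */
  ierr = DMDAGetLocalInfo(da,&info);CHKERRQ(ierr); /* sizes/ranges of THAT level */
  for (j=info.ys; j<info.ys+info.ym; j++) {
    for (i=info.xs; i<info.xs+info.xm; i++) {
      row.i = i; row.j = j;
      if (i==0 || j==0 || i==info.mx-1 || j==info.my-1) {
        v[0] = 1.0;                                /* Dirichlet boundary row */
        ierr = MatSetValuesStencil(B,1,&row,1,&row,v,INSERT_VALUES);CHKERRQ(ierr);
      } else {
        /* 5-point Laplacian stencil; grid-spacing scaling omitted for brevity */
        col[0].i = i;   col[0].j = j-1; v[0] = -1.0;
        col[1].i = i-1; col[1].j = j;   v[1] = -1.0;
        col[2].i = i;   col[2].j = j;   v[2] =  4.0;
        col[3].i = i+1; col[3].j = j;   v[3] = -1.0;
        col[4].i = i;   col[4].j = j+1; v[4] = -1.0;
        ierr = MatSetValuesStencil(B,1,&row,5,col,v,INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  if (J != B) {
    ierr = MatAssemblyBegin(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

With the DM taken from the SNES, the local-to-global mapping used by
MatSetValuesStencil() matches the matrix for that level, which is what the
"Local index 169 too large 168 (max)" error is complaining about.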

  Thanks,

    Matt


> Thanks,
> Boris
>
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Local index 169 too large 168 (max) at 4
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.5.4, May, 23, 2015
> [0]PETSC ERROR: ./p1 on a gcc-4.8.4-mpich-3.1.4-openblas-0.2.14-opt named
> bender.eng.buffalo.edu by borisbou Wed Oct  7 18:06:35 2015
> [0]PETSC ERROR: Configure options
> --prefix=/bender1/data/shared/software/libs/petsc/3.5.4/gcc/4.8.4/mpich/3.1.4/openblas/0.2.14/opt
> --with-debugging=false --COPTFLAGS="-O3 -mavx" --CXXOPTFLAGS="-O3 -mavx"
> --FOPTFLAGS=-O3 --with-shared-libraries=1
> --with-mpi-dir=/bender1/data/shared/software/libs/mpich/3.1.4/gcc/4.8.4
> --with-mumps=true --download-mumps=1 --with-metis=true --download-metis=1
> --with-parmetis=true --download-parmetis=1 --with-superlu=true
> --download-superlu=1 --with-superludir=true --download-superlu_dist=1
> --with-blacs=true --download-blacs=1 --with-scalapack=true
> --download-scalapack=1 --with-hypre=true --download-hypre=1
> --with-blas-lib="[/bender1/data/shared/software/libs/openblas/0.2.14/gcc/4.8.4/lib/libopenblas.so]"
> --with-lapack-lib="[/bender1/data/shared/software/libs/openblas/0.2.14/gcc/4.8.4/lib/libopenblas.so]"
> [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 401 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/vec/is/utils/isltog.c
> [0]PETSC ERROR: #2 MatSetValuesLocal() line 1982 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 MatSetValuesStencil() line 1383 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/mat/interface/matrix.c
> [0]PETSC ERROR: #4 FormJacobian() line 366 in p1.C
> [0]PETSC ERROR: #5 SNESComputeJacobian() line 2193 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/snes/interface/snes.c
> [0]PETSC ERROR: #6 KSPComputeOperators_SNES() line 527 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/snes/interface/snes.c
> [0]PETSC ERROR: #7 KSPSetUp() line 256 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #8 PCSetUp_MG() line 803 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: #9 PCSetUp() line 902 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #10 KSPSetUp() line 306 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #11 KSPSolve() line 418 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #12 SNESSolve_NEWTONLS() line 232 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #13 SNESSolve() line 3743 in
> /bender1/data/shared/software/builddir/petsc-1HCwjW/petsc-3.5.4/src/snes/interface/snes.c
> [0]PETSC ERROR: #14 main() line 147 in p1.C
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener