[petsc-dev] -pc_asm_blocks

Matthew Knepley knepley at gmail.com
Mon Jan 18 07:53:51 CST 2010


On Mon, Jan 18, 2010 at 7:45 AM, Jed Brown <jed at 59a2.org> wrote:

> I thought this used to work:
>
> $ cd petsc/src/ksp/ksp/examples/tutorials/
> $ ./ex2 -pc_type asm -pc_asm_blocks 2
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data!
> [0]PETSC ERROR: Specified ASM subdomain sizes were invalid: 112 != 56!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision:
> c48026146083ef288c8218540e6f61b678c1c226 HG Date: Sun Jan 17 17:16:57 2010
> +0100
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: ./ex2 on a ompi named kunyang by jed Mon Jan 18 14:39:15
> 2010
> [0]PETSC ERROR: Libraries linked from /home/jed/petsc/ompi/lib
> [0]PETSC ERROR: Configure run at Sun Jan 17 17:20:32 2010
> [0]PETSC ERROR: Configure options --with-zoltan-dir=/usr --download-ml
> --with-blas-lapack-dir=/usr --download-blacs --download-chaco
> --with-hdf5-dir=/usr --download-mumps --download-superlu_dist
> --download-spooles --download-sundials --download-hypre --with-c2html
> --with-mpi-dir=/usr --with-umfpack-dir=/usr --with-parmetis-dir=/usr
> --download-scalapack --with-lgrind --with-shared --with-sowing
> -PETSC_ARCH=ompi --download-superlu --download-spai
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: PCSetUp_ASM() line 239 in src/ksp/pc/impls/asm/asm.c
> [0]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: main() line 196 in src/ksp/ksp/examples/tutorials/ex2.c
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 77.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
> Looking at the loop that is generating this inconsistency, osm->is_local
> is NULL, so we take the second branch for each block:
>
>      if (osm->is_local) {
>        ...
>      } else {
>        ierr = VecGetLocalSize(vec,&m_local);CHKERRQ(ierr);
>        osm->y_local[i] = osm->y[i];
>        ierr = PetscObjectReference((PetscObject) osm->y[i]);CHKERRQ(ierr);
>        osm->prolongation[i] = osm->restriction[i];
>        ierr = PetscObjectReference((PetscObject) osm->restriction[i]);CHKERRQ(ierr);
>      }
>
> But vec is everything owned by that process so m_local is too big unless
> there is only one subdomain.  What was this code supposed to be doing?
>

I had to change ASM back in July (I think) to get multiple overlapping blocks
on a given parallel subdomain.

Why?

  We used a particularly simple mechanism to implement RASM, namely dropping
off-process values. That does not work when you have two or more domains on
one process.
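
To make that concrete, here is a toy standalone C program (not PETSc code; the
indices and the two-block layout are invented for illustration) showing why
dropping off-process values no longer restricts anything once two overlapping
blocks live on the same process:

#include <stdio.h>

int main(void)
{
  /* One process owns indices 0..5; two overlapping blocks share {2,3}. */
  double correction[6] = {0.0};
  int    block0[4] = {0, 1, 2, 3};
  int    block1[4] = {2, 3, 4, 5};
  int    i;

  /* Pretend each block computes a correction of 1.0 on all of its indices.
     "Dropping off-process values" keeps every entry here, because all six
     indices are on-process, so the shared indices {2,3} are added twice. */
  for (i = 0; i < 4; i++) correction[block0[i]] += 1.0;
  for (i = 0; i < 4; i++) correction[block1[i]] += 1.0;

  for (i = 0; i < 6; i++) printf("index %d: %g\n", i, correction[i]);

  /* Restricted ASM would add each block's correction only on its
     non-overlapping part (say {0,1,2} and {3,4,5}), which requires knowing
     that part explicitly; hence the extra IS described below. */
  return 0;
}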

  I needed this functionality for our RBF preconditioner.

How?

  I added another IS that gives the local part of each block, since we
already know the blocks themselves. The code takes the right branch here
because no extra IS is specified.
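
As a rough sketch of the user-facing side (the helper name and the block
layout are made up for illustration; PCASMSetLocalSubdomains() and
ISCreateStride() are the actual calls involved), a caller could hand ASM both
the overlapping blocks on a process and their non-overlapping local parts
like this:

#include "petscpc.h"

/* Hypothetical helper: split the m rows owned by this process into two
   overlapping blocks and also give ASM the non-overlapping local part of
   each, so PCSetUp_ASM() can take the is_local branch quoted above. */
PetscErrorCode SetTwoLocalBlocks(PC pc, PetscInt m, PetscInt overlap)
{
  IS             is[2], is_local[2];
  PetscInt       half = m/2;
  PetscErrorCode ierr;

  /* Overlapping blocks: [0, half+overlap) and [half-overlap, m) */
  ierr = ISCreateStride(PETSC_COMM_SELF, half+overlap, 0, 1, &is[0]);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, m-half+overlap, half-overlap, 1, &is[1]);CHKERRQ(ierr);
  /* Non-overlapping local parts: [0, half) and [half, m) */
  ierr = ISCreateStride(PETSC_COMM_SELF, half, 0, 1, &is_local[0]);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, m-half, half, 1, &is_local[1]);CHKERRQ(ierr);

  ierr = PCASMSetLocalSubdomains(pc, 2, is, is_local);CHKERRQ(ierr);
  /* ISDestroy() of the four index sets is omitted here; its calling
     sequence differs between PETSc versions. */
  return 0;
}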

What is wrong?

  I do not understand the comment. The local size should be right.

  Matt


> Jed
>
-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener