[petsc-users] general VecScatter from MPI to MPI
Matthew Knepley
knepley at gmail.com
Thu Feb 17 11:22:46 CST 2011
On Thu, Feb 17, 2011 at 11:18 AM, Ethan Coon <ecoon at lanl.gov> wrote:
> On Thu, 2011-02-17 at 10:35 -0600, Matthew Knepley wrote:
> > On Thu, Feb 17, 2011 at 10:06 AM, Ethan Coon <ecoon at lanl.gov> wrote:
> > So I thought I understood how VecScatters worked, but apparently not.
> > Is it possible to create a general VecScatter from an arbitrarily
> > partitioned (MPI) Vec to another arbitrarily partitioned (MPI) Vec with
> > the same global sizes (or same global IS sizes) but different local
> > sizes? Shouldn't this just be a matter of relying upon the implied
> > LocalToGlobalMapping?
> >
> >
> > No, the way you have to do this is to map a global Vec to a bunch of
> > sequential local Vecs with the sizes you want. This is also how we map
> > to overlapping arrays.
> >
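To make that concrete: one common way to set this up is with explicit index sets and a sequential destination Vec of whatever size each process wants. The following is only a rough C sketch written against a recent PETSc (the calling sequences in the 2011 petsc-dev used later in this thread differ slightly); the function name GatherToLocal and its arguments are illustrative, not from the thread.

  #include <petscvec.h>

  /* Gather an arbitrary set of entries of a parallel (MPI) Vec into a
   * sequential Vec on each process.  Sizes and indices are illustrative. */
  static PetscErrorCode GatherToLocal(Vec vglobal, PetscInt nwant, const PetscInt wanted[], Vec *vlocal)
  {
    IS         is_from, is_to;
    VecScatter scatter;

    PetscFunctionBeginUser;
    /* destination: a sequential Vec with exactly the local size we want */
    PetscCall(VecCreateSeq(PETSC_COMM_SELF, nwant, vlocal));
    /* source indices are in the global numbering of vglobal; destination
     * indices are simply 0..nwant-1 in the local Vec */
    PetscCall(ISCreateGeneral(PETSC_COMM_SELF, nwant, wanted, PETSC_COPY_VALUES, &is_from));
    PetscCall(ISCreateStride(PETSC_COMM_SELF, nwant, 0, 1, &is_to));
    PetscCall(VecScatterCreate(vglobal, is_from, *vlocal, is_to, &scatter));
    PetscCall(VecScatterBegin(scatter, vglobal, *vlocal, INSERT_VALUES, SCATTER_FORWARD));
    PetscCall(VecScatterEnd(scatter, vglobal, *vlocal, INSERT_VALUES, SCATTER_FORWARD));
    PetscCall(VecScatterDestroy(&scatter));
    PetscCall(ISDestroy(&is_from));
    PetscCall(ISDestroy(&is_to));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

The key point is that the mapping is spelled out with an IS rather than inferred from the two layouts.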
>
> So effectively I need two scatters -- a scatter from the global Vec to
> the sequential local Vecs, then a scatter (which requires no
> communication) to inject the sequential Vecs into the new global Vec?
>
No, just wrap up the pieces of your global Vec as local Vecs and scatter
straight into that storage using VecCreateSeqWithArray().
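Roughly, that pattern looks like the following in C (a sketch against a recent PETSc, where VecCreateSeqWithArray() takes a leading block-size argument that the petsc-dev in this thread does not have; the function name ScatterIntoStorage and the choice of indices are illustrative only).

  #include <petscvec.h>

  /* Scatter from a parallel Vec vA straight into the local storage of a
   * differently partitioned parallel Vec vB, by wrapping vB's local piece
   * as a sequential Vec.  The index choice (vB's ownership range) is
   * illustrative. */
  static PetscErrorCode ScatterIntoStorage(Vec vA, Vec vB)
  {
    PetscScalar *barray;
    PetscInt     nlocal, low, high;
    Vec          vB_local;
    IS           is_from;
    VecScatter   scatter;

    PetscFunctionBeginUser;
    PetscCall(VecGetOwnershipRange(vB, &low, &high));
    nlocal = high - low;
    /* wrap vB's local array as a sequential Vec -- no copy is made */
    PetscCall(VecGetArray(vB, &barray));
    PetscCall(VecCreateSeqWithArray(PETSC_COMM_SELF, 1, nlocal, barray, &vB_local));
    /* pull the entries of vA corresponding to vB's local range;
     * a NULL destination IS means all of vB_local, in order */
    PetscCall(ISCreateStride(PETSC_COMM_SELF, nlocal, low, 1, &is_from));
    PetscCall(VecScatterCreate(vA, is_from, vB_local, NULL, &scatter));
    PetscCall(VecScatterBegin(scatter, vA, vB_local, INSERT_VALUES, SCATTER_FORWARD));
    PetscCall(VecScatterEnd(scatter, vA, vB_local, INSERT_VALUES, SCATTER_FORWARD));
    PetscCall(VecScatterDestroy(&scatter));
    PetscCall(ISDestroy(&is_from));
    PetscCall(VecDestroy(&vB_local));
    PetscCall(VecRestoreArray(vB, &barray));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

Since vB_local only borrows vB's array, the scatter writes directly into vB's storage and no second scatter is needed.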
Matt
> Why? Am I missing something that makes the MPI to MPI scatter ill-posed
> as long as the global sizes (but not local sizes) are equal?
>
> This is mostly curiosity on my part... I think I have to do two scatters
> anyway since I'm working with multiple comms -- scatter from an MPI Vec
> on one sub-comm into local, sequential Vecs, then scatter those
> sequential Vecs into an MPI Vec on PETSC_COMM_WORLD. That's the correct
> model for injecting an MPI Vec on one comm into an MPI Vec on
> PETSC_COMM_WORLD, correct?
>
> Ethan
>
> >
> > Matt
> >
> > See the snippet below (and the errors it produces):
> >
> > Ethan
> >
> >
> >
> >       Vec            vA
> >       Vec            vB
> >       VecScatter     scatter_AB
> >
> >       PetscInt       np
> >       PetscInt       rank
> >       PetscErrorCode ierr
> >
> >       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
> >       call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
> >
> >       ! different local sizes on the two processes
> >       if (rank.eq.0) np = 3
> >       if (rank.eq.1) np = 1
> >
> >       ! vA has local size 2 on every process; vB has local size np
> >       call VecCreateMPI(PETSC_COMM_WORLD, 2, PETSC_DETERMINE, vA, ierr)
> >       call VecCreateMPI(PETSC_COMM_WORLD, np, PETSC_DETERMINE, vB, ierr)
> >
> >       ! no index sets given -- this call raises the error below
> >       call VecScatterCreate(vA, PETSC_NULL_OBJECT, vB, PETSC_NULL_OBJECT, scatter_AB, ierr)
> >
> > ...
> >
> > $> mpiexec -n 2 ./test
> >
> > [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [0]PETSC ERROR: Nonconforming object sizes!
> > [0]PETSC ERROR: Local scatter sizes don't match!
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [1]PETSC ERROR: Nonconforming object sizes!
> > [1]PETSC ERROR: Local scatter sizes don't match!
> > [0]PETSC ERROR: Petsc Development HG revision: 5dbe1264252fb9cb5d8e033d620d18f7b0e9111f  HG Date: Fri Feb 11 15:44:04 2011 -0600
> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> > [0]PETSC ERROR: See docs/index.html for manual pages.
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: ./test on a linux-gnu named tama1 by ecoon Thu Feb 17 08:14:57 2011
> > [0]PETSC ERROR: Libraries linked from /packages/petsc/petsc-dev3.0-mpich2-local-gcc-4.3.3/debug-shared/lib
> > [0]PETSC ERROR: Configure run at Fri Feb 11 16:15:14 2011
> > [0]PETSC ERROR: Configure options --with-debugging=1 --prefix=/packages/petsc/petsc-dev3.0-mpich2-local-gcc-4.3.3/debug-shared --download-mpich=1 --download-ml=1 --download-umfpack=1 --with-blas-lapack-dir=/usr/lib --download-parmetis=yes PETSC_ARCH=linux-gnu-c-debug-shared --with-clanguage=c --download-hypre=1 --with-shared-libraries=1 --download-hdf5=1
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: VecScatterCreate() line 1432 in src/vec/vec/utils/vscat.c
> > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0
> > [1]PETSC ERROR: APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
> >
> >
> >
> >
>
> --
> ------------------------------------
> Ethan Coon
> Post-Doctoral Researcher
> Applied Mathematics - T-5
> Los Alamos National Laboratory
> 505-665-8289
>
> http://www.ldeo.columbia.edu/~ecoon/
> ------------------------------------
>
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener