On Thu, Feb 17, 2011 at 11:18 AM, Ethan Coon <ecoon@lanl.gov> wrote:
<div class="im">On Thu, 2011-02-17 at 10:35 -0600, Matthew Knepley wrote:<br>
> On Thu, Feb 17, 2011 at 10:06 AM, Ethan Coon <<a href="mailto:ecoon@lanl.gov">ecoon@lanl.gov</a>> wrote:<br>
> So I thought I understood how VecScatters worked, but<br>
> apparently not.<br>
> Is it possible to create a general VecScatter from an<br>
> arbitrarily<br>
> partitioned (MPI) Vec to another arbitrarily partitioned (MPI)<br>
> Vec with<br>
> the same global sizes (or same global IS sizes) but different<br>
> local<br>
> sizes? Shouldn't this just be a matter of relying upon the<br>
> implied<br>
> LocalToGlobalMapping?<br>
><br>
><br>
> No, the way you have to do this is to map a global Vec to a bunch of<br>
> sequential local Vecs with the sizes you want. This is also how we map<br>
> to overlapping arrays.<br>
><br>
<br>
</div>So effectively I need two scatters -- a scatter from the global Vec to<br>
the sequential local Vecs, then a scatter (which requires no<br>
communication) to inject the sequential Vecs into the new global Vec?<br></blockquote><div><br></div><div>No, just wrap up the pieces of your global Vec as local Vecs and scatter</div><div>straight into that storage using VecCreateSeqWithArray().</div>
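
For concreteness, a minimal C sketch of that approach (untested; the function
name ScatterAtoB and the identity global ordering are illustrative, and it is
written against the circa-2011 API, where VecCreateSeqWithArray() has no
block-size argument and the destroy routines take the object rather than a
pointer):

#include <petscvec.h>

/* Sketch: scatter vA (one parallel layout) into vB (another parallel layout,
   same global size) by wrapping this process's piece of vB as a sequential
   Vec that shares vB's storage and scattering straight into it. */
PetscErrorCode ScatterAtoB(Vec vA, Vec vB)
{
  PetscScalar    *b;
  PetscInt       n, rstart, rend;
  IS             from;
  Vec            vBlocal;
  VecScatter     scatter;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Global indices [rstart, rend) of vB live on this process. */
  ierr = VecGetOwnershipRange(vB, &rstart, &rend);CHKERRQ(ierr);
  n    = rend - rstart;

  /* Wrap vB's local storage as a sequential Vec (no copy). */
  ierr = VecGetArray(vB, &b);CHKERRQ(ierr);
  ierr = VecCreateSeqWithArray(PETSC_COMM_SELF, n, b, &vBlocal);CHKERRQ(ierr);

  /* Gather the same global indices from vA, wherever they are owned. */
  ierr = ISCreateStride(PETSC_COMM_SELF, n, rstart, 1, &from);CHKERRQ(ierr);
  ierr = VecScatterCreate(vA, from, vBlocal, PETSC_NULL, &scatter);CHKERRQ(ierr);
  ierr = VecScatterBegin(scatter, vA, vBlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scatter, vA, vBlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  ierr = VecScatterDestroy(scatter);CHKERRQ(ierr);
  ierr = ISDestroy(from);CHKERRQ(ierr);
  ierr = VecDestroy(vBlocal);CHKERRQ(ierr);
  ierr = VecRestoreArray(vB, &b);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Because vBlocal shares vB's array, the scattered values land in vB directly
and no second scatter is needed.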

   Matt

> Why? Am I missing something that makes the MPI to MPI scatter ill-posed
> as long as the global sizes (but not local sizes) are equal?
>
> This is mostly curiosity on my part... I think I have to do two scatters
> anyway since I'm working with multiple comms -- scatter from an MPI Vec
> on one sub-comm into local, sequential Vecs, then scatter those
> sequential Vecs into an MPI Vec on PETSC_COMM_WORLD. That's the correct
> model for injecting an MPI Vec on one comm into an MPI Vec on
> PETSC_COMM_WORLD, correct?
>
> Ethan
>
> >
> >    Matt
> >
> > See below snippet (and its errors):
> >
> > Ethan
> >
> > Vec vA
> > Vec vB
> > VecScatter scatter_AB
> >
> > PetscInt np
> > PetscInt rank
> > PetscErrorCode ierr
> >
> > if (rank.eq.0) np = 3
> > if (rank.eq.1) np = 1
> >
> > call VecCreateMPI(PETSC_COMM_WORLD, 2, PETSC_DETERMINE, vA, ierr)
> > call VecCreateMPI(PETSC_COMM_WORLD, np, PETSC_DETERMINE, vB, ierr)
> >
> > call VecScatterCreate(vA, PETSC_NULL_OBJECT, vB, PETSC_NULL_OBJECT,
> >                       scatter_AB, ierr)
> >
> > ...
> >
> > $> mpiexec -n 2 ./test
> >
> > [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [0]PETSC ERROR: Nonconforming object sizes!
> > [0]PETSC ERROR: Local scatter sizes don't match!
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [1]PETSC ERROR: Nonconforming object sizes!
> > [1]PETSC ERROR: Local scatter sizes don't match!
> > [0]PETSC ERROR: Petsc Development HG revision: 5dbe1264252fb9cb5d8e033d620d18f7b0e9111f  HG Date: Fri Feb 11 15:44:04 2011 -0600
> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> > [0]PETSC ERROR: See docs/index.html for manual pages.
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: ./test on a linux-gnu named tama1 by ecoon Thu Feb 17 08:14:57 2011
> > [0]PETSC ERROR: Libraries linked from /packages/petsc/petsc-dev3.0-mpich2-local-gcc-4.3.3/debug-shared/lib
> > [0]PETSC ERROR: Configure run at Fri Feb 11 16:15:14 2011
> > [0]PETSC ERROR: Configure options --with-debugging=1 --prefix=/packages/petsc/petsc-dev3.0-mpich2-local-gcc-4.3.3/debug-shared --download-mpich=1 --download-ml=1 --download-umfpack=1 --with-blas-lapack-dir=/usr/lib --download-parmetis=yes PETSC_ARCH=linux-gnu-c-debug-shared --with-clanguage=c --download-hypre=1 --with-shared-libraries=1 --download-hdf5=1
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: VecScatterCreate() line 1432 in src/vec/vec/utils/vscat.c
> > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0
> > [1]PETSC ERROR: APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
> >
> > --
> > ------------------------------------
> > Ethan Coon
> > Post-Doctoral Researcher
> > Applied Mathematics - T-5
> > Los Alamos National Laboratory
> > 505-665-8289
> >
> > http://www.ldeo.columbia.edu/~ecoon/
> > ------------------------------------
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
>
> --
> ------------------------------------
> Ethan Coon
> Post-Doctoral Researcher
> Applied Mathematics - T-5
> Los Alamos National Laboratory
> 505-665-8289
>
> http://www.ldeo.columbia.edu/~ecoon/
> ------------------------------------

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener