[petsc-users] question about VecScatter from one global vector to another

Randall Mackie rlmackie862 at gmail.com
Fri Feb 19 19:00:20 CST 2016


Yes, it makes perfect sense. 

The issue is that the global sizes of my ISs are the same, but the local sizes are different (because of the different DMDAs).

Is there an easy way to put both ISs on the same parallel layout? ISCopy doesn’t work in that case.

Randy
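
For reference, Matt’s explanation below suggests one resolution: VecScatterCreate() requires only that the two ISs have the same local length on each process; that length does not have to match either vector’s layout, because the ISs hold global indices that may refer to entries owned by any rank. So both ISs can simply be built with the same split, whatever it is. A minimal sketch under that reading, where nLocal, fromIdx, toIdx, and BuildScatter are hypothetical names for a pairing the caller has already computed:

#include <petscvec.h>

/* Build a scatter between global vectors on two differently sized DMDAs.
 * nLocal index pairs live on this rank; the split of pairs across ranks
 * is arbitrary, but isFrom and isTo must use the same split. */
PetscErrorCode BuildScatter(Vec vFrom,Vec vTo,PetscInt nLocal,
                            const PetscInt fromIdx[],const PetscInt toIdx[],
                            VecScatter *scat)
{
  IS             isFrom,isTo;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Both ISs have local length nLocal on this rank; the global indices
     they contain may point at entries owned by any rank */
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,nLocal,fromIdx,PETSC_COPY_VALUES,&isFrom);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,nLocal,toIdx,PETSC_COPY_VALUES,&isTo);CHKERRQ(ierr);
  ierr = VecScatterCreate(vFrom,isFrom,vTo,isTo,scat);CHKERRQ(ierr);
  ierr = ISDestroy(&isFrom);CHKERRQ(ierr);
  ierr = ISDestroy(&isTo);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The resulting context is then driven with VecScatterBegin()/VecScatterEnd(), e.g. with INSERT_VALUES and SCATTER_FORWARD.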
 
> On Feb 19, 2016, at 4:39 PM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Fri, Feb 19, 2016 at 6:33 PM, Randall Mackie <rlmackie862 at gmail.com <mailto:rlmackie862 at gmail.com>> wrote:
> I am trying to do a VecScatter of a subset of elements from a global vector on one DMDA to a global vector on a different DMDA (different sized DMDAs).
> 
> I thought what made sense was to create a parallel IS using the local to global mapping obtained from the two DMDAs so that the local portion of each IS contained only the values on that processor.
> 
> This works fine as long as the local sizes of the two ISs are the same.
> 
> But, say I have a situation like this:
> 
> DMDA 1:
>   proc 1: 1 value to scatter
>   proc 2: 7 values to scatter
> 
> 
> DMDA 2:
>   proc 1: 4 values to get
>   proc 2: 4 values to get
> 
> This doesn’t work because VecScatterCreate says the local sizes of the ISs must be the same.
> 
> Is the only way to set up this VecScatter to create a sequential IS on each processor of length 8 and duplicate the values?
> 
> The two inputs are
> 
>   index of input vector --> index of output vector
> 
> and you can have as many entries of this map on any process as you want. Clearly, you
> need the same number of entries in each IS to make this map meaningful. Does that make
> sense?
> 
>   Thanks,
> 
>     Matt
>  
> Thanks,
> 
> Randy M.
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
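
To make the pairing concrete for the layout in the quoted message (a hedged illustration; the indices f0..f7 and t0..t7 are invented), the 8 pairs could keep DMDA 1's 1/7 split, as long as both ISs use that same split on each process:

  proc 1:  isFrom = {f0}                isTo = {t0}
  proc 2:  isFrom = {f1 ... f7}         isTo = {t1 ... t7}

Here each fk is a global index into the source vector and each tk a global index into the destination vector; the tk held on proc 2 may refer to entries that proc 1 owns, since the scatter handles the communication.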
