[petsc-users] Repacking/scattering of multi-dimensional arrays
Matthew Knepley
knepley at gmail.com
Mon Mar 26 14:18:36 CDT 2012
On Mon, Mar 26, 2012 at 2:15 PM, Dag Sverre Seljebotn <d.s.seljebotn at astro.uio.no> wrote:
> On 03/26/2012 12:06 PM, Jed Brown wrote:
>
>> Matt, Jack, and myself have discussed adding proper support for
>> Elemental with PETSc, but we just haven't had the time. Matt and I don't
>> have a direct use case and we all have plenty of other things going.
>>
>> On Mon, Mar 26, 2012 at 13:53, Dag Sverre Seljebotn
>> <d.s.seljebotn at astro.uio.no> wrote:
>>
>> Hi list,
>>
>> I'm wondering whether there's any code in PETSc that I can either
>> use directly or lift out and adapt for my purposes (which are
>> described in footnote [1]).
>>
>> What I need to do is a number of different 2D array ("multi-vector",
>> "dense matrix") repacking operations, in order to make the data
>> available to different operations (spherical harmonic transforms,
>> dense linear algebra, sparse linear algebra, etc.).
>>
>> There are a number of repacking operations I need:
>>
>> - From each process having a given contiguous set of rows of a 2D
>> array, to an element-cyclic distribution where one distributes over
>> both rows and columns (the distribution of Elemental [2])
>>
>> - Arbitrary redistribution of rows between processes (but each row
>> stays within a single process), potentially with overlaps.
>>
>> - More structured redistribution of rows between processes where
>> indices can be easily computed on the fly (but still a pattern quite
>> specific to my application).
>>
>> While I found PETSc routines for arbitrary redistribution of 1D
>> arrays, I couldn't find anything for 2D arrays. Any pointers into
>> documentation or papers etc. explaining these aspects of PETSc are
>> very welcome.
>>
>>
>> It sounds like you just want help with indexing. VecScatter can move
>> data of any dimensionality; you just have to know how to index it. (You
>> could ask for something more specialized if you were using the topology
>> routines in MPI and wanted control over how messages are routed.)
>>
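For concreteness, a minimal sketch of what such indexing can look like when gathering whole rows of a row-major 2D array stored as a parallel Vec. The helper name, sizes, and row list are invented for illustration; y is assumed to be a sequential Vec of local length nrecv*ncols.

#include <petscvec.h>

/* Sketch only: gather the global rows listed in recvrows[] of a row-major 2D
 * array (ncols columns), stored as the parallel Vec x, into the sequential
 * Vec y, by listing every element of every wanted row in a general IS. */
PetscErrorCode GatherRowsByElement(Vec x, PetscInt ncols, PetscInt nrecv,
                                   const PetscInt recvrows[], Vec y)
{
  IS             from, to;
  VecScatter     scatter;
  PetscInt       *idx, r, j;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscMalloc(nrecv*ncols*sizeof(PetscInt), &idx);CHKERRQ(ierr);
  for (r = 0; r < nrecv; r++)
    for (j = 0; j < ncols; j++) idx[r*ncols + j] = recvrows[r]*ncols + j;
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, nrecv*ncols, idx, PETSC_OWN_POINTER, &from);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, nrecv*ncols, 0, 1, &to);CHKERRQ(ierr);
  ierr = VecScatterCreate(x, from, y, to, &scatter);CHKERRQ(ierr);
  ierr = VecScatterBegin(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&scatter);CHKERRQ(ierr);
  ierr = ISDestroy(&from);CHKERRQ(ierr);
  ierr = ISDestroy(&to);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}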
>
> Thanks for your response. Now I know what question to ask:
>
> The problem with VecScatter is that it requires indexing every element,
> which can be rather wasteful in some situations. So I'm wondering whether
> VecScatter could work with elements of any size, so that one could send a
> bunch of elements at a time (picture indexing only the rows of a row-major
> 2D array, thus amortizing the indexing overhead and memory use by the
> number of columns).
>
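A blocked index set already gives part of that amortization: with ISCreateBlock only one index per row is stored, and the per-element expansion happens internally. A minimal sketch along the same lines as above, with the same invented names and the same assumption that y is a sequential Vec of length nrecv*ncols:

#include <petscvec.h>

/* Sketch only: same gather as before, but with a block index set of block
 * size ncols, so only one stored index per row is needed. */
PetscErrorCode GatherRowsBlocked(Vec x, PetscInt ncols, PetscInt nrecv,
                                 const PetscInt recvrows[], Vec y)
{
  IS             from, to;
  VecScatter     scatter;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = ISCreateBlock(PETSC_COMM_WORLD, ncols, nrecv, recvrows, PETSC_COPY_VALUES, &from);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, nrecv*ncols, 0, 1, &to);CHKERRQ(ierr);
  ierr = VecScatterCreate(x, from, y, to, &scatter);CHKERRQ(ierr);
  ierr = VecScatterBegin(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&scatter);CHKERRQ(ierr);
  ierr = ISDestroy(&from);CHKERRQ(ierr);
  ierr = ISDestroy(&to);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}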
Use PetscSF instead, which can use any MPI datatype. It's the VecScatter
generalization.
Matt
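A rough sketch of that idea, fetching whole rows by using a row-sized MPI datatype as the PetscSF unit. The helper name and argument layout are invented, and PetscSF calling sequences have changed across releases (newer versions add an MPI_Op argument to PetscSFBcastBegin/End), so treat this as approximate:

#include <petscsf.h>

/* Sketch only: fetch whole rows of a row-major 2D array with PetscSF.
 * remote[i] = (owner rank, row number on that rank) for the i-th row this
 * process wants; rootrows holds the nroots locally owned rows and leafrows
 * receives the nleaves wanted rows, both row-major with ncols columns. */
PetscErrorCode BcastRows(MPI_Comm comm, PetscInt ncols,
                         PetscInt nroots, PetscInt nleaves,
                         PetscSFNode remote[],
                         const PetscScalar *rootrows, PetscScalar *leafrows)
{
  PetscSF        sf;
  MPI_Datatype   rowtype;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscSFCreate(comm, &sf);CHKERRQ(ierr);
  /* NULL ilocal: leaves are stored contiguously in leafrows, in order. */
  ierr = PetscSFSetGraph(sf, nroots, nleaves, NULL, PETSC_COPY_VALUES,
                         remote, PETSC_COPY_VALUES);CHKERRQ(ierr);
  /* The unit moved per graph edge is a whole row of ncols scalars. */
  ierr = MPI_Type_contiguous((PetscMPIInt)ncols, MPIU_SCALAR, &rowtype);CHKERRQ(ierr);
  ierr = MPI_Type_commit(&rowtype);CHKERRQ(ierr);
  ierr = PetscSFBcastBegin(sf, rowtype, rootrows, leafrows);CHKERRQ(ierr);
  ierr = PetscSFBcastEnd(sf, rowtype, rootrows, leafrows);CHKERRQ(ierr);
  ierr = MPI_Type_free(&rowtype);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With nleaves graph edges instead of nleaves*ncols, the setup cost scales with the number of rows rather than the number of elements.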
> (That doesn't cover the Elemental case, where one would need to index
> every element (rather wasteful), but it does cover some of my other needs).
>
> Dag
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener