On Mon, Mar 26, 2012 at 2:15 PM, Dag Sverre Seljebotn
<d.s.seljebotn@astro.uio.no> wrote:
<div class="im">On 03/26/2012 12:06 PM, Jed Brown wrote:<br>
</div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">
>> Matt, Jack, and I have discussed adding proper support for Elemental
>> in PETSc, but we just haven't had the time. Matt and I don't have a
>> direct use case, and we all have plenty of other things going on.
>>
>> On Mon, Mar 26, 2012 at 13:53, Dag Sverre Seljebotn
>> <d.s.seljebotn@astro.uio.no> wrote:
>>
>>> Hi list,
>>>
>>> I'm wondering whether there's any code in PETSc that I can either
>>> use directly or lift out and adapt for my purposes (which are
>>> described in footnote [1]).
>>>
>>> What I need to do is a number of different 2D array ("multi-vector",
>>> "dense matrix") repacking operations, in order to make the data
>>> available to different operations (spherical harmonic transforms,
>>> dense linear algebra, sparse linear algebra, etc.).
>>>
>>> There are a number of repacking operations I need:
>>>
>>> - From each process holding a given contiguous set of rows of a 2D
>>>   array, to an element-cyclic distribution over both rows and
>>>   columns (the distribution used by Elemental [2]).
>>>
>>> - Arbitrary redistribution of rows between processes (each row kept
>>>   whole within a process), potentially with overlaps.
>>>
>>> - More structured redistribution of rows between processes, where
>>>   indices can be easily computed on the fly (but still in a pattern
>>>   quite specific to my application).
>>>
>>> While I found PETSc routines for arbitrary redistribution of 1D
>>> arrays, I couldn't find anything for 2D arrays. Any pointers to
>>> documentation, papers, etc. explaining these aspects of PETSc are
>>> very welcome.
>>
>> It sounds like you just want help with indexing. VecScatter can move
>> data of any dimensionality; you just have to know how to index it.
>> (You could ask for something more specialized if you were using the
>> topology routines in MPI and wanted control over how messages are
>> routed.)
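
To make the indexing concrete: for a row-major 2D array you enumerate
the element indices row by row. A rough, untested sketch (GatherRows is
a made-up helper, not a PETSc routine; error checking omitted):

  #include <petscvec.h>

  /* Hypothetical helper: gather the listed rows of a row-major 2D
     array stored in Vec x (ncols columns) into Vec y, by listing every
     element index of every requested row.  Assumes x lives on
     PETSC_COMM_WORLD. */
  PetscErrorCode GatherRows(Vec x, PetscInt ncols, PetscInt nrows,
                            const PetscInt *rows, Vec y)
  {
    IS         from;
    VecScatter scatter;
    PetscInt   i, j, *idx;

    PetscMalloc(nrows * ncols * sizeof(PetscInt), &idx);
    for (i = 0; i < nrows; i++)
      for (j = 0; j < ncols; j++)
        idx[i*ncols + j] = rows[i]*ncols + j;  /* global element index */
    ISCreateGeneral(PETSC_COMM_WORLD, nrows*ncols, idx,
                    PETSC_OWN_POINTER, &from);

    /* NULL "to" index set: y is filled contiguously from the start */
    VecScatterCreate(x, from, y, NULL, &scatter);
    VecScatterBegin(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterEnd(scatter, x, y, INSERT_VALUES, SCATTER_FORWARD);

    VecScatterDestroy(&scatter);
    ISDestroy(&from);
    return 0;
  }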
>
> Thanks for your response. Now I know what question to ask:
>
> The problem with VecScatter is that it requires indexing every
> element, which can be rather wasteful in some situations. So I'm
> wondering whether VecScatter could work with elements of any size, so
> that each index moves a whole block of elements (picture indexing
> only the rows of a row-major 2D array, amortizing the indexing
> overhead and memory use over the number of columns).

Use PetscSF instead, which can take any MPI datatype. It's the
generalization of VecScatter.
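
A rough, untested sketch of the row case (RedistributeRows is a made-up
helper, not a PETSc routine; error checking omitted, and note that
later PETSc versions add an MPI_Op argument to PetscSFBcastBegin/End):

  #include <petscsf.h>

  /* Hypothetical helper: move whole rows of a row-major 2D array with
     one graph edge per row rather than per element.  remote[i] names
     the (rank, row index) feeding leaf row i; leaf rows are stored
     contiguously in leafrows because ilocal is NULL. */
  PetscErrorCode RedistributeRows(MPI_Comm comm, PetscInt ncols,
                                  PetscInt nroots,
                                  const PetscScalar *rootrows,
                                  PetscInt nleaves, PetscSFNode *remote,
                                  PetscScalar *leafrows)
  {
    PetscSF      sf;
    MPI_Datatype rowtype;  /* one unit = one row of ncols scalars */

    MPI_Type_contiguous((int)ncols, MPIU_SCALAR, &rowtype);
    MPI_Type_commit(&rowtype);

    PetscSFCreate(comm, &sf);
    PetscSFSetGraph(sf, nroots, nleaves, NULL, PETSC_COPY_VALUES,
                    remote, PETSC_COPY_VALUES);

    PetscSFBcastBegin(sf, rowtype, rootrows, leafrows);
    PetscSFBcastEnd(sf, rowtype, rootrows, leafrows);

    PetscSFDestroy(&sf);
    MPI_Type_free(&rowtype);
    return 0;
  }

   Matt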

> (That doesn't cover the Elemental case, where one would need to index
> every element, which is rather wasteful, but it does cover some of my
> other needs.)
>
> Dag

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
   -- Norbert Wiener