[petsc-dev] DMDA_*PERIODIC and DMDA_XYZGHOSTED

Barry Smith bsmith at mcs.anl.gov
Wed Mar 2 13:06:18 CST 2011


On Mar 2, 2011, at 11:29 AM, Ethan Coon wrote:

> Ok, I've gotten frustrated enough to generalize the DMDA Periodicity to
> allow for periodic, ghosted, or nonghosted individually in each
> dimension.
> 
> The 1D case is done, but there is one design choice regarding the
> higher-dimensional cases that I wanted some advice on.  Is there a
> reason not to use negative indices in the construction of the index
> sets that specify the global-to-local scatter?  In the construction of
> the "to" (local) index set, the STAR_STENCIL case specifically
> enumerates the pattern in both the to and from index sets.  However,
> it would be much cleaner to:
> 
> 1. specify the full local domain in the "to" local IS
> 2. generate the "from" IS with -1 in the corners (and in the ghost cells
> that have no corresponding global entry, as they are off the domain, in
> the DMDA_XYZGHOSTED case).
> 
> With -1 in the IS, there would be no communication done, but the IS
> would be slightly larger.  Is the overhead associated with negative
> indices nontrivial?  
> 

   Ethan,

   MatSetValues() and VecSetValues() handle negative indices as "ignore these entries".  Currently VecScatterCreate() does not handle negative indices that way (at least it is not documented, and I did not write it); it will likely either crash or generate an error.
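
   For reference, a minimal sketch of that documented VecSetValues() behavior (hedged: written against a roughly contemporary PETSc API; exact signatures such as VecDestroy() differ slightly between versions):

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec            x;
      PetscInt       ix[2]   = {0, -1};   /* the -1 index is silently ignored */
      PetscScalar    vals[2] = {1.0, 2.0};
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
      ierr = VecCreateSeq(PETSC_COMM_SELF, 4, &x);CHKERRQ(ierr);
      ierr = VecSet(x, 0.0);CHKERRQ(ierr);
      ierr = VecSetValues(x, 2, ix, vals, INSERT_VALUES);CHKERRQ(ierr);
      ierr = VecAssemblyBegin(x);CHKERRQ(ierr);
      ierr = VecAssemblyEnd(x);CHKERRQ(ierr);
      ierr = VecView(x, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);  /* only entry 0 changed */
      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return 0;
    }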

   It sounds like you are proposing that if the from or to entry in a particular "slot" is negative, VecScatterCreate() would just ignore that slot? This seems like an ok proposal if you are willing to update VecScatterCreate() to handle it and to document the feature on the VecScatterCreate() manual page. If this truly simplifies all the horrible if () code that handles the corner stuff in the DA construction, then it would be worth doing.

   Performance is not an issue since you would just discard those slots in the VecScatterCreate() phase and they would never appear in the actual scatter operations.
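
   To make the proposal concrete, a hypothetical sketch of the construction being discussed (none of this is current VecScatterCreate() behavior; HasGlobalSource() and GlobalIndexOf() are made-up placeholder helpers, and the copy-mode argument to ISCreateGeneral() is the later spelling of that API):

    /* Proposed: the "to" IS covers the full local (ghosted) domain; slots
       with no global source get -1 in the "from" IS.  VecScatterCreate()
       would need to be taught to drop any slot whose index is negative. */
    PetscInt   i, *fromidx, *toidx;
    IS         from, to;
    VecScatter scatter;

    ierr = PetscMalloc2(nghosted, &fromidx, nghosted, &toidx);CHKERRQ(ierr);
    for (i = 0; i < nghosted; i++) {
      toidx[i]   = i;                                   /* every local slot */
      fromidx[i] = HasGlobalSource(i) ? GlobalIndexOf(i) : -1;
    }
    ierr = ISCreateGeneral(comm, nghosted, fromidx, PETSC_COPY_VALUES, &from);CHKERRQ(ierr);
    ierr = ISCreateGeneral(comm, nghosted, toidx,   PETSC_COPY_VALUES, &to);CHKERRQ(ierr);
    ierr = VecScatterCreate(gvec, from, lvec, to, &scatter);CHKERRQ(ierr); /* would skip -1 slots */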


   Barry




> Ethan
> 
> 
> 
> On Tue, 2010-12-07 at 15:41 -0700, Ethan Coon wrote:
>> On Tue, 2010-12-07 at 13:45 -0600, Barry Smith wrote:
>>> DMDA_XYZGHOSTED does not exist for 2d and 3d; it was added, I'm guessing, as an experiment and was never in the initial design of DMDA. To fully support it, one needs to go back to the design of DMDA and see how to do it properly, not just bolt it on.  Some people like to use these types of ghost nodes, so I agree it is a useful thing to have, but who is going to properly add it?
>>> 
>> 
>> At some point in the not-too-distant future I'll get frustrated enough
>> to look into this, but I don't have the time at the moment.  At first
>> glance it looks like:
>> 
>> - Ensure DMDA{X,Y,Z}Periodic() macros are used everywhere instead of
>> direct comparisons to dd->wrap (they aren't used everywhere currently).
>> 
>> - Define macros DMDA{X,Y,Z}Ghosted() to (in some places) replace
>> DMDA{X,Y,Z}Periodic(), and then choose the appropriate macro in the
>> right places (a sketch of how this might look follows below this
>> message).
>> 
>> - This probably doesn't merit a change in the DMDACreate* API (it would
>> affect a very large amount of user code).  The most obvious alternative
>> to an API change would be a larger, somewhat convoluted enum for the
>> PeriodicType (DMDA_XPERIODIC_YGHOSTED, DMDA_XYGHOSTED, etc) which could
>> at least be made backward compatible.
>> 
>> At least all of the functionality should be there already (since it's
>> needed in the periodic case)... it's just higher level code that would
>> need to change.
>> 
>> Ethan
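
A sketch of the macro-plus-extended-enum idea from the list above (illustrative only; none of these names exist in PETSc at the time of this thread):

    /* Extend the periodicity enum with mixed values and hide the
       combinatorics behind per-dimension query macros. */
    typedef enum {DMDA_NONPERIODIC, DMDA_XPERIODIC, DMDA_XGHOSTED,
                  DMDA_XPERIODIC_YGHOSTED, DMDA_XYGHOSTED,
                  DMDA_XYZGHOSTED} DMDAPeriodicTypeEx;

    #define DMDAXGhosted(pt)  ((pt) == DMDA_XGHOSTED || (pt) == DMDA_XYGHOSTED || \
                               (pt) == DMDA_XYZGHOSTED)
    #define DMDAXPeriodic(pt) ((pt) == DMDA_XPERIODIC || \
                               (pt) == DMDA_XPERIODIC_YGHOSTED)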
>> 
>>> 
>>> On Dec 7, 2010, at 1:30 PM, Jed Brown wrote:
>>> 
>>>> On Tue, Dec 7, 2010 at 20:21, Ethan Coon <ecoon at lanl.gov> wrote:
>>>> I'd like a DA where there are ghost cells on every boundary, and some of
>>>> those ghost cells (but not all) are filled in with periodic values.
>>>> 
>>>> It would be useful to people doing explicit stuff if there was a way to get ghost nodes in the local vector without implying periodic communication (and weird coordinate management).
>>>> 
>>>> A related issue for purely explicit methods is having a way to VecAXPY without needing to copy to and from a global vector.  (TSSSP has low-memory schemes; paying for an extra vector or two is actually significant in that context, and (less significantly) I'm certain I can cook up a realistic benchmark where the memcpy costs more than 10%.)  I think I know how to implement this sharing transparently (more or less using VecNest), so we could make it non-default but be able to activate it at runtime.
>>> 
>>>   Why can you not use VecAXPY() on the local Vecs?
>>> 
>>>   Barry
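
A minimal sketch of what Barry is suggesting here (assumes a DMDA da and filled local vectors already exist; the DM-prefixed names below are the later spelling of the DA* routines, so adjust for your PETSc version):

    /* Local Vecs obtained from a DMDA are ordinary Vecs, so vector
       operations apply to them directly, ghost points included. */
    Vec            xl, yl;
    PetscScalar    alpha = 2.0;
    PetscErrorCode ierr;

    ierr = DMGetLocalVector(da, &xl);CHKERRQ(ierr);
    ierr = DMGetLocalVector(da, &yl);CHKERRQ(ierr);
    /* ... fill xl and yl, e.g. with DMGlobalToLocalBegin/End() ... */
    ierr = VecAXPY(yl, alpha, xl);CHKERRQ(ierr);  /* yl += alpha * xl */
    ierr = DMRestoreLocalVector(da, &xl);CHKERRQ(ierr);
    ierr = DMRestoreLocalVector(da, &yl);CHKERRQ(ierr);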
>>> 
>>> 
>>> 
>>>> 
>>>> Jed
>>> 
>> 
> 
> -- 
> ------------------------------------
> Ethan Coon
> Post-Doctoral Researcher
> Applied Mathematics - T-5
> Los Alamos National Laboratory
> 505-665-8289
> 
> http://www.ldeo.columbia.edu/~ecoon/
> ------------------------------------
> 



