[petsc-users] DMDA and local-to-global scatters

Coon, Ethan ecoon at lanl.gov
Tue Nov 3 11:22:37 CST 2015


Thanks much for your thoughts Barry, I’ll holler again as I have more.

Ethan

------------------------------------------------------------------------
Ethan Coon
Research Scientist
Computational Earth Science -- EES-16
Los Alamos National Laboratory
505-665-8289

http://www.lanl.gov/expertise/profiles/view/ethan-coon
------------------------------------------------------------------------

> On Oct 29, 2015, at 4:37 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
> 
>  Ethan,
> 
>   The truth is I had to introduce the restriction because the previous code was a memory hog and difficult to maintain. I couldn't figure out any way to eliminate the memory-hog business except by eliminating the local-to-local feature.
> 
>   My recommendation is not to try to rewrite the local-to-local map, which is horrible, but instead to use a global vector for the computed result. Hopefully you can do this in a way that doesn't mess up the entire code.  In the simplest case one would have something like 
> 
>   for t = 0 to many time steps
>      DMGlobalToLocal(xglobal, xlocal)
>      DMDAVecGetArray on xlocal and xglobal
>      update the xglobal array based on the xlocal array
>      restore arrays
>   end
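> 
>  For concreteness, a rough sketch of that loop for a 2D DMDA with one degree of freedom is below; da, nsteps, and Update() are placeholders, not anything in PETSc or in your code.
> 
>    Vec            xglobal, xlocal;
>    PetscScalar  **g, **l;
>    PetscInt       i, j, xs, ys, xm, ym, step;
>    PetscErrorCode ierr;
> 
>    ierr = DMCreateGlobalVector(da, &xglobal);CHKERRQ(ierr);
>    ierr = DMCreateLocalVector(da, &xlocal);CHKERRQ(ierr);
>    ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
>    /* xglobal is assumed to already hold the initial condition */
>    for (step = 0; step < nsteps; step++) {
>      /* fill owned and ghost entries of xlocal from the current global state */
>      ierr = DMGlobalToLocalBegin(da, xglobal, INSERT_VALUES, xlocal);CHKERRQ(ierr);
>      ierr = DMGlobalToLocalEnd(da, xglobal, INSERT_VALUES, xlocal);CHKERRQ(ierr);
>      ierr = DMDAVecGetArray(da, xlocal, &l);CHKERRQ(ierr);
>      ierr = DMDAVecGetArray(da, xglobal, &g);CHKERRQ(ierr);
>      for (j = ys; j < ys + ym; j++)
>        for (i = xs; i < xs + xm; i++)
>          g[j][i] = Update(l, i, j);   /* read the ghosted l, write only the owned g */
>      ierr = DMDAVecRestoreArray(da, xlocal, &l);CHKERRQ(ierr);
>      ierr = DMDAVecRestoreArray(da, xglobal, &g);CHKERRQ(ierr);
>    }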
> 
>  If the timestepping is controlled elsewhere, so your routine can only do a single time step at a time, then something like
> 
>    function whose input is a local (ghosted) vector
>       DMGetGlobalVector( ... &xglobal)
>       DMDAVecGetArray on xlocal and xglobal
>       update the xglobal array based on the xlocal array
>       restore arrays
>       DMGlobalToLocal(xglobal, xlocal)
>       DMRestoreGlobalVector(&xglobal)
> 
>   thus you can "hide" the global vector in the routine that does only the update.
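> 
>   In code, such a routine might look roughly like the sketch below (again for a 2D, single-dof DMDA; UpdateOneStep() and Update() are placeholder names, not existing functions):
> 
>    PetscErrorCode UpdateOneStep(DM da, Vec xlocal)
>    {
>      Vec            xglobal;
>      PetscScalar  **g, **l;
>      PetscInt       i, j, xs, ys, xm, ym;
>      PetscErrorCode ierr;
> 
>      PetscFunctionBeginUser;
>      /* the global work vector lives only inside this routine */
>      ierr = DMGetGlobalVector(da, &xglobal);CHKERRQ(ierr);
>      ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
>      ierr = DMDAVecGetArray(da, xlocal, &l);CHKERRQ(ierr);
>      ierr = DMDAVecGetArray(da, xglobal, &g);CHKERRQ(ierr);
>      for (j = ys; j < ys + ym; j++)
>        for (i = xs; i < xs + xm; i++)
>          g[j][i] = Update(l, i, j);   /* placeholder stencil update */
>      ierr = DMDAVecRestoreArray(da, xlocal, &l);CHKERRQ(ierr);
>      ierr = DMDAVecRestoreArray(da, xglobal, &g);CHKERRQ(ierr);
>      /* refresh owned and ghost values of xlocal from the updated global vector */
>      ierr = DMGlobalToLocalBegin(da, xglobal, INSERT_VALUES, xlocal);CHKERRQ(ierr);
>      ierr = DMGlobalToLocalEnd(da, xglobal, INSERT_VALUES, xlocal);CHKERRQ(ierr);
>      ierr = DMRestoreGlobalVector(da, &xglobal);CHKERRQ(ierr);
>      PetscFunctionReturn(0);
>    }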
> 
>   If this doesn't help, then send more specifics of exactly where your localtolocal() map is used and I may have suggestions. I convinced myself that one could
> always use an intermediate global vector in the computation, without additional communication, to replace the localtolocal, but I could be wrong.
> 
>   Sorry about the change
> 
> 
>  Barry
> 
> 
>> On Oct 29, 2015, at 5:18 PM, Coon, Ethan <ecoon at lanl.gov> wrote:
>> 
>> Hi Barry, all,
>> 
>> I’m trying to understand some extremely old code (4 years now!) that people insist on using, and therefore on having me support, despite the fact that the DOE doesn’t think it is important enough to pay me to support.
>> 
>> Barry — can you explain to me the history and logic in your change:
>> 
>> https://bitbucket.org/petsc/petsc/commits/bd1fc5ae41626b6cf1674a6070035cfd93e0c1dd
>> 
>> that removed DMDA’s local-to-global map, in favor of doing the reverse scatter on global-to-local?  When I wrote the aforementioned LBM code, the local-to-global scatter had different semantics than the reverse global-to-local scatter.  The latter merged owned and ghosted values from the local Vec into the owned global Vec, while the former ignored ghost values and did a direct copy from owned local values to owned global values.  Under INSERT_VALUES, this was important as the latter was a race condition while the former was a well-posed operation.
>> 
>> Now I see that this L2G scatter has been removed (in 2014), which introduced a race condition.  It was tweaked a bit for 1 process by this commit:
>> 
>> https://bitbucket.org/petsc/petsc/commits/1eb28f2e8c580cb49316c983b5b6ec6c58d77ab8
>> 
>> which refers to a Lisandro email that I’m having trouble finding.  Fortunately this caused some errors that I did see, as opposed to the previous race conditions which I didn’t see.
>> 
>> Additionally, at some point documentation was added saying not to use INSERT_VALUES with DMDAs.  (Maybe because it causes race conditions!)  This documentation suggests to "simply compute the values directly into a global vector instead of a local one."  That isn’t a good choice in my application, where I take many time steps in local vectors, using repeated calls to DMLocalToLocalBegin(), and only go back to the global vector when I want to do i/o.  Computing into global vectors would require managing two Vecs throughout the entire code, when otherwise I manage only one (except in the i/o portion of the code).  I guess the answer is to create and use a forward L2G scatter myself?  Is there a better solution I’m not thinking of?
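>> 
>> If that is the way to go, the sketch below is roughly what I have in mind: build a VecScatter once that copies only the owned entries of the ghosted local Vec into the global Vec, so INSERT_VALUES never touches ghost points.  (Just a sketch for a 2D DMDA; none of these names are existing code.)
>> 
>>   PetscErrorCode BuildOwnedL2GScatter(DM da, Vec xlocal, Vec xglobal, VecScatter *l2g)
>>   {
>>     PetscInt       xs, ys, xm, ym, gxs, gys, gxm, gym, dof;
>>     PetscInt       i, j, d, n, rstart, *idx;
>>     IS             is_local, is_global;
>>     PetscErrorCode ierr;
>> 
>>     PetscFunctionBeginUser;
>>     ierr = DMDAGetInfo(da, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &dof, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
>>     ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
>>     ierr = DMDAGetGhostCorners(da, &gxs, &gys, NULL, &gxm, &gym, NULL);CHKERRQ(ierr);
>>     /* indices, within the ghosted local Vec, of the owned points only */
>>     ierr = PetscMalloc1(xm*ym*dof, &idx);CHKERRQ(ierr);
>>     n = 0;
>>     for (j = ys; j < ys+ym; j++)
>>       for (i = xs; i < xs+xm; i++)
>>         for (d = 0; d < dof; d++)
>>           idx[n++] = ((j-gys)*gxm + (i-gxs))*dof + d;
>>     ierr = ISCreateGeneral(PETSC_COMM_SELF, n, idx, PETSC_OWN_POINTER, &is_local);CHKERRQ(ierr);
>>     /* the owned block of a DMDA global Vec is contiguous in PETSc ordering, starting at rstart */
>>     ierr = VecGetOwnershipRange(xglobal, &rstart, NULL);CHKERRQ(ierr);
>>     ierr = ISCreateStride(PETSC_COMM_SELF, n, rstart, 1, &is_global);CHKERRQ(ierr);
>>     ierr = VecScatterCreate(xlocal, is_local, xglobal, is_global, l2g);CHKERRQ(ierr);
>>     ierr = ISDestroy(&is_local);CHKERRQ(ierr);
>>     ierr = ISDestroy(&is_global);CHKERRQ(ierr);
>>     PetscFunctionReturn(0);
>>   }
>> 
>>   /* then, after each local-vector update and before i/o: */
>>   ierr = VecScatterBegin(l2g, xlocal, xglobal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
>>   ierr = VecScatterEnd(l2g, xlocal, xglobal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);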
>> 
>> Thanks,
>> 
>> Ethan
>> 
>> 
>> 
>> ------------------------------------------------------------------------
>> Ethan Coon
>> Research Scientist
>> Computational Earth Science -- EES-16
>> Los Alamos National Laboratory
>> 505-665-8289
>> 
>> http://www.lanl.gov/expertise/profiles/view/ethan-coon
>> ------------------------------------------------------------------------
>> 
> 


