[petsc-dev] Add natural-to-global and global-to-natural operations for DM?

Barry Smith bsmith at mcs.anl.gov
Wed Mar 20 08:49:32 CDT 2013


   Richard,

      Once you have the vector in the "natural order" what do you do with it? 

1) visualize it?
2) store to disk?
3) send to one process so you can do something on it?
4) ???

     Note that VecView() on DMDA vectors automatically manages the mapping to natural ordering when saving to disk, so the user never needs to apply the global-to-natural scatter themselves.
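
     For example, a minimal sketch (assuming a DMDA da, a global vector g obtained from it, and a PetscErrorCode ierr; the file name is made up):

        /* VecView() performs the global-to-natural permutation internally,
           so no explicit DMDAGlobalToNaturalBegin/End() calls are needed. */
        PetscViewer viewer;
        ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"state.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
        ierr = VecView(g,viewer);CHKERRQ(ierr);     /* written to disk in natural ordering */
        ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);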

     I would argue that the only "legitimate" use of DMDAGlobalToNatural/NaturalToGlobal() is for IO (or, similarly, transferring a DMDA vector between two MPI communicators of different sizes), so my inclination is to make DMDAGlobalToNatural() less public, not to bring it down to the DM level.

    If you are using GlobalToNatural in your IO, I would argue that a better approach is to embed the mapping inside the VecLoad() and VecView() for your unstructured DM; then you do not need a universal DMGlobalToNatural().
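
    One way to do that embedding (a rough, untested sketch; it assumes the application has stashed a global-to-natural VecScatter on the DM under the made-up name "GlobalToNaturalScatter", the function name MyVecView_Natural is likewise made up, and the natural vector is assumed to share the global vector's layout):

        static PetscErrorCode MyVecView_Natural(Vec g,PetscViewer viewer)
        {
          DM             dm;
          VecScatter     gton;
          Vec            natural;
          MPI_Comm       comm;
          PetscInt       n,N;
          PetscErrorCode ierr;

          PetscFunctionBegin;
          ierr = VecGetDM(g,&dm);CHKERRQ(ierr);
          ierr = PetscObjectQuery((PetscObject)dm,"GlobalToNaturalScatter",(PetscObject*)&gton);CHKERRQ(ierr);
          ierr = PetscObjectGetComm((PetscObject)g,&comm);CHKERRQ(ierr);
          ierr = VecGetLocalSize(g,&n);CHKERRQ(ierr);
          ierr = VecGetSize(g,&N);CHKERRQ(ierr);
          ierr = VecCreateMPI(comm,n,N,&natural);CHKERRQ(ierr);     /* assumes the scatter targets this layout */
          ierr = VecScatterBegin(gton,g,natural,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
          ierr = VecScatterEnd(gton,g,natural,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
          ierr = VecView(natural,viewer);CHKERRQ(ierr);             /* default view of the permuted vector */
          ierr = VecDestroy(&natural);CHKERRQ(ierr);
          PetscFunctionReturn(0);
        }

        /* after DMCreateGlobalVector(dm,&g): */
        ierr = VecSetOperation(g,VECOP_VIEW,(void (*)(void))MyVecView_Natural);CHKERRQ(ierr);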

    I do not object to a DMLocalToLocalBegin/End(); though "local" is not universal, we do have the local concept at the DM level already, and it will remain there.
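
    For reference, the existing DMDA-level calls that would presumably become DMLocalToLocalBegin/End() are used like this (a minimal sketch, assuming a DMDA da and a PetscErrorCode ierr):

        /* Update the ghost points of one local vector directly from another
           local vector, without going through a global vector. */
        Vec lfrom,lto;
        ierr = DMCreateLocalVector(da,&lfrom);CHKERRQ(ierr);
        ierr = DMCreateLocalVector(da,&lto);CHKERRQ(ierr);
        /* ... fill lfrom ... */
        ierr = DMDALocalToLocalBegin(da,lfrom,INSERT_VALUES,lto);CHKERRQ(ierr);
        ierr = DMDALocalToLocalEnd(da,lfrom,INSERT_VALUES,lto);CHKERRQ(ierr);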

   Barry


      
On Mar 19, 2013, at 9:54 PM, Richard Tran Mills <rtm at eecs.utk.edu> wrote:

> Hi Barry,
> 
> What we call the "natural" ordering in PFLOTRAN is what is referred to as the "application" ordering in PETSc parlance.  In the structured grid case this "application" ordering is the natural ordering used by the DMDA routines like DMDANaturalToGlobalBegin()/End(), and in the unstructured case it is whatever ordering was used to assign cell IDs when the unstructured grid was created.  Rather than having code for two cases, I would prefer to be able to just call DMNaturalToGlobalBegin()/End() for either case.  I do realize, of course, that for the DMDA case there is always a default meaning for the "natural" ordering, whereas for the case of a general DM the notion of "natural" will be application dependent and will have to be specified by user code (which I'd like to do in this case by having a DMShellSetNaturalToGlobalVecScatter()).
> 
> Obviously, there may be subtypes of DM for which no "natural to global" operation makes sense.  What I'm saying is that I nonetheless would like to have a general DMNaturalToGlobalBegin()/End() that will do the appropriate thing if it makes sense for that DM and complain otherwise.  I guess this is a question of what the PETSc philosophy is regarding such things.  I note that there is at least one example of a current DMDA method that ought to just be a DM one: I don't see why we should have DMDALocalToLocalBegin()/End() and not just DMLocalToLocalBegin()/End(), and I'd like to go ahead and change this.  But LocalToLocal makes sense for lots of DMs, and NaturalToGlobal may not.
> 
> --Richard
> 
> On 3/7/13 10:51 PM, Barry Smith wrote:
>> On Mar 7, 2013, at 9:46 PM, Richard Tran Mills <rtm at eecs.utk.edu> wrote:
>> 
>>> Continuing on my quest to be able to wrap all of the unstructured grid operations in PFLOTRAN inside a DM: Would it be OK to add DMGlobalToNaturalBegin()/End() and DMNaturalToGlobalBegin()/End()? These are currently DMDA routines, not DM ones.  We do something that we call natural-to-global and global-to-natural operations with our unstructured grid cases in PFLOTRAN (though I'm not sure if "natural" is quite the thing to call it... I always think of that in terms of a structured grid layout).  I would like to just call DMGlobalToNaturalBegin()/End(), etc., on our DM, whether it is a DMDA or a DMShell.
>>> 
>>> I had already mentioned that I plan to add DM interface routines for local-to-local operations.  I am thinking that global-to-natural, etc., also ought to be added, though I'm not sure if the concept of "natural" ordering necessarily makes sense in some cases.
>>    What does "natural" mean for you on anything but a DA in what you do with PFLOTRAN? Once we all understand that, we can come up with ideas on how it fits with DM.
>> 
>>    Barry
>> 
>>> --Richard
>>> 
>>> -- 
>>> Richard Tran Mills, Ph.D.
>>> Computational Earth Scientist      | Joint Assistant Professor
>>> Hydrogeochemical Dynamics Team     | EECS and Earth & Planetary Sciences
>>> Oak Ridge National Laboratory      | University of Tennessee, Knoxville
>>> E-mail: rmills at ornl.gov  V: 865-241-3198 http://climate.ornl.gov/~rmills
>>> 
> 
> 
> -- 
> Richard Tran Mills, Ph.D.
> Computational Earth Scientist      | Joint Assistant Professor
> Hydrogeochemical Dynamics Team     | EECS and Earth & Planetary Sciences
> Oak Ridge National Laboratory      | University of Tennessee, Knoxville
> E-mail: rmills at ornl.gov  V: 865-241-3198 http://climate.ornl.gov/~rmills
> 



