[petsc-users] partial stencil in DMDA?

Fischer, Greg A. fischega at westinghouse.com
Fri Feb 21 13:12:10 CST 2014


Thanks!  I have another question. The user manual says:

                PETSc currently provides no container for multiple arrays sharing the same distributed array communication; note, however, that the dof parameter handles many cases of interest.

In my application, each spatial location will have on the order of hundreds of values associated with it (which I believe translates to dof = O(100); I don't see "degrees of freedom" explicitly defined anywhere). However, when communicating ghost values, not all of those degrees of freedom should be exchanged at once. I need to be able to exchange them one at a time.
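
For concreteness, here is roughly the setup I have in mind. This is just a sketch with error checking omitted; the grid size, stencil width, and ndof below are placeholder values, and the DMDA_BOUNDARY_* names are the 3.4-era spellings (newer releases rename them to DM_BOUNDARY_* and require DMSetUp()):

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM       da;
      Vec      global, local;
      PetscInt ndof = 100;   /* O(100) values per spatial location (placeholder) */

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* 2D star-stencil DMDA; every grid point carries ndof components */
      DMDACreate2d(PETSC_COMM_WORLD,
                   DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR,
                   128, 128,                   /* global grid size (placeholder) */
                   PETSC_DECIDE, PETSC_DECIDE, /* process layout */
                   ndof,                       /* dof: values stored at each grid point */
                   1,                          /* stencil width */
                   NULL, NULL, &da);

      DMCreateGlobalVector(da, &global);
      DMCreateLocalVector(da, &local);

      /* ghost exchange moves all ndof components of every ghost point at once */
      DMGlobalToLocalBegin(da, global, INSERT_VALUES, local);
      DMGlobalToLocalEnd(da, global, INSERT_VALUES, local);

      VecDestroy(&local);
      VecDestroy(&global);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }

This works, but each DMGlobalToLocalBegin/End() call moves every component of every ghost point, which is the traffic I would like to avoid.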

It sounds like what I may want to do is use VecCreateGhost(), which would let me define exactly where the ghost points are, and then duplicate that vector with VecDuplicateVecs() to get one vector per DOF. I could then scatter the vectors individually as the need arises. Does that sound reasonable?
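
Something along these lines is what I am picturing; again just a sketch with error checking omitted, where the function name is only for illustration and nlocal, nghost, and ghost_idx are placeholders that would come from my own domain decomposition:

    #include <petscvec.h>

    /* one ghosted Vec per degree of freedom, updated independently */
    PetscErrorCode ghost_per_dof(MPI_Comm comm, PetscInt nlocal, PetscInt nghost,
                                 const PetscInt ghost_idx[], PetscInt ndof)
    {
      Vec      base, *dofvecs;
      PetscInt i = 0;   /* whichever DOF is currently being worked on */

      /* one ghosted vector defines the local size and the ghost point indices */
      VecCreateGhost(comm, nlocal, PETSC_DETERMINE, nghost, ghost_idx, &base);

      /* duplicates share the same layout and ghosting, one vector per DOF */
      VecDuplicateVecs(base, ndof, &dofvecs);

      /* exchange ghost values for a single DOF only when it is needed */
      VecGhostUpdateBegin(dofvecs[i], INSERT_VALUES, SCATTER_FORWARD);
      VecGhostUpdateEnd(dofvecs[i], INSERT_VALUES, SCATTER_FORWARD);

      VecDestroyVecs(ndof, &dofvecs);
      VecDestroy(&base);
      return 0;
    }

The appeal is that each VecGhostUpdateBegin/End() call only moves the one DOF I ask for, rather than all of them.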


>-----Original Message-----
>From: Barry Smith [mailto:bsmith at mcs.anl.gov]
>Sent: Friday, February 21, 2014 12:21 PM
>To: Fischer, Greg A.
>Cc: petsc-users at mcs.anl.gov
>Subject: Re: [petsc-users] partial stencil in DMDA?
>
>On Feb 21, 2014, at 11:02 AM, Fischer, Greg A. <fischega at westinghouse.com> wrote:
>
>> Hello,
>>
>> I'm interested in using PETSc to manage distributed arrays. Based on my
>> reading about the DMDA objects, I see that ghost points can be
>> communicated in box-type stencils or star-type stencils.
>>
>> For my application, assume that I have a 2D DMDA object with star-type
>> stencils. For a typical local calculation, I only need to access ghost values
>> from two of the four directions at a time. For example, I'd like access to ghost
>> values in the South and East directions, but not in the North or West
>> directions. Communicating North and West data would seem to be wasting
>> bandwidth. Is there any way to accomplish this?
>
>   Greg,
>
>   There is not anything built in. Here is what I suggest:
>
>1) write your application code not worrying about the fact that the
>DMGlobalToLocalBegin/End() is moving values you don't need.
>
>2) when your code is running correctly for your problem and giving useful
>results, if the communication times are impacting how long it takes to run, you
>can provide a custom communication pattern. It would involve little
>additional coding, essentially taking DMSetUp_DA_2D(), which creates the list
>of ghost points, and removing the unneeded ghost points. But it would be
>premature to do this optimization until you have a full working application.
>
>   Barry
>
>> Thanks,
>> Greg


