[petsc-users] partition of DM Vec entries

Sang pham van pvsang002 at gmail.com
Fri Oct 14 22:20:37 CDT 2016


Hi Barry,

Thank you very much for your suggestions and comments. I really appreciate
them!

With my best regards,


On Sat, Oct 15, 2016 at 10:13 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   Unless the particles are more or less equally distributed over the
> entire domain, any kind of "domain decomposition" approach is questionable
> for managing the particles. Otherwise, certain processes whose domains
> contain most of the particles will have a great deal of work, for all of
> their particles, while domains with few particles will have little work. I
> can see two approaches to alleviate this problem.
>
> 1) constantly adjust the sizes/locations of the domains to load balance
> the particles per domain or
>
> 2) parallelize the particles (somehow) instead of just the geometry.
>
> Anyway, there is a preliminary DMSWARM class, provided by Dave May, in the
> development version of PETSc for helping to work with particles. You might
> look at it. I don't know if it would be useful for you or not. IMHO software
> library support for particle methods is still very primitive compared to
> finite difference/element support; in other words, we still have a lot to do.
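>
>   As a rough, untested sketch, attaching a swarm to your background DMDA
> might look something like the code below; the interface is still evolving,
> so the exact names may differ, and the fields here are made up for
> illustration.
>
> #include <petscdmda.h>
> #include <petscdmswarm.h>
>
> /* Create a particle swarm that uses an existing DMDA "da" as its cell DM. */
> PetscErrorCode CreateParticleSwarm(MPI_Comm comm, DM da, DM *swarm)
> {
>   PetscErrorCode ierr;
>
>   PetscFunctionBegin;
>   ierr = DMCreate(comm, swarm);CHKERRQ(ierr);
>   ierr = DMSetType(*swarm, DMSWARM);CHKERRQ(ierr);
>   ierr = DMSetDimension(*swarm, 3);CHKERRQ(ierr);
>   ierr = DMSwarmSetType(*swarm, DMSWARM_PIC);CHKERRQ(ierr);
>   ierr = DMSwarmSetCellDM(*swarm, da);CHKERRQ(ierr);
>   /* per-particle fields; particle coordinates are provided by DMSWARM itself */
>   ierr = DMSwarmRegisterPetscDatatypeField(*swarm, "velocity", 3, PETSC_REAL);CHKERRQ(ierr);
>   ierr = DMSwarmRegisterPetscDatatypeField(*swarm, "mass", 1, PETSC_REAL);CHKERRQ(ierr);
>   ierr = DMSwarmFinalizeFieldRegister(*swarm);CHKERRQ(ierr);
>   /* initial local capacity: an estimated particle count plus some slack */
>   ierr = DMSwarmSetLocalSizes(*swarm, 1000, 100);CHKERRQ(ierr);
>   PetscFunctionReturn(0);
> }
>
> After the particles move each time step, DMSwarmMigrate() is meant to ship
> the particles that left the local subdomain over to the rank that now owns
> them, which is exactly the kind of bookkeeping you would otherwise write by
> hand.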
>
>
>   Barry
>
>
>
>
>
> > On Oct 14, 2016, at 9:54 PM, Sang pham van <pvsang002 at gmail.com> wrote:
> >
> > Hi Barry,
> >
> > Thank you for your answer. I am writing a parallel code for
> > smoothed-particle hydrodynamics; in this code I use a DMDA background mesh
> > to manage the particles. Each DMDA cell manages a number of particles, and
> > the number can change in time and from cell to cell. In each time step, I
> > need to send updated positions and velocities of particles in border cells
> > to the neighboring partitions. I think I cannot use a DMDA Vec to do this
> > because the number of particles is not the same in all ghost cells.
> >
> > I think I am able to write a routine to do this work, but the code may be
> > quite complicated and not so "formal". I would be very grateful if you
> > could suggest a method to solve my problem.
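> >
> > Roughly, what I have in mind is something like the sketch below (untested;
> > the Particle struct is a placeholder, and the neighbor rank would come from
> > the background DMDA): for each neighboring rank, first exchange the byte
> > counts, then the particle data itself.
> >
> > #include <petscsys.h>
> >
> > typedef struct {
> >   PetscReal x[3], v[3];            /* position and velocity of one particle */
> > } Particle;
> >
> > /* Exchange a variable number of border-cell particles with one neighboring
> >    rank "nbr"; the counts differ from rank to rank and from step to step. */
> > PetscErrorCode ExchangeBorderParticles(MPI_Comm comm, PetscMPIInt nbr,
> >                                        Particle *send, PetscInt nsend,
> >                                        Particle **recv, PetscInt *nrecv)
> > {
> >   PetscErrorCode ierr;
> >   MPI_Status     status;
> >   PetscMPIInt    scount = (PetscMPIInt)(nsend*sizeof(Particle)), rcount;
> >
> >   PetscFunctionBegin;
> >   /* first exchange the message sizes, since every rank sends a different number */
> >   ierr = MPI_Sendrecv(&scount, 1, MPI_INT, nbr, 0,
> >                       &rcount, 1, MPI_INT, nbr, 0, comm, &status);CHKERRQ(ierr);
> >   *nrecv = rcount/(PetscInt)sizeof(Particle);
> >   ierr = PetscMalloc1(*nrecv, recv);CHKERRQ(ierr);
> >   /* then exchange the particle data itself as raw bytes */
> >   ierr = MPI_Sendrecv(send, scount, MPI_BYTE, nbr, 1,
> >                       *recv, rcount, MPI_BYTE, nbr, 1, comm, &status);CHKERRQ(ierr);
> >   PetscFunctionReturn(0);
> > }
> >
> > Calling this once per neighbor in every time step is what feels "not so
> > formal" to me, which is why I am asking whether PETSc has a better way.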
> >
> > Many thanks.
> >
> >
> >
> >
> > On Sat, Oct 15, 2016 at 9:40 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >   Thanks, the question is very clear now.
> >
> >   For DMDA you can use DMDAGetNeighborsRank() to get the list of the (up
> > to) 9 neighbors of a processor. (Sadly this routine does not have a manual
> > page, but the arguments are obvious.) For other DMs I don't think there is
> > any simple way to get this information. For none of the DMs is there a way
> > to get information about which process is providing a specific ghost cell.
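> >
> >   Something like the following, for example (untested; depending on your
> > PETSc version the routine may be spelled DMDAGetNeighbors(), so use
> > whatever name your version provides):
> >
> > #include <petscdmda.h>
> >
> > /* Print the ranks that own subdomains adjacent to this one; assumes a 2d
> >    DMDA (9 entries including self), a 3d DMDA returns 27 entries. */
> > PetscErrorCode ListNeighborRanks(DM da)
> > {
> >   PetscErrorCode     ierr;
> >   const PetscMPIInt *nbrs;
> >   PetscMPIInt        rank;
> >   PetscInt           i;
> >
> >   PetscFunctionBegin;
> >   ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)da), &rank);CHKERRQ(ierr);
> >   ierr = DMDAGetNeighbors(da, &nbrs);CHKERRQ(ierr);
> >   for (i = 0; i < 9; i++) {
> >     /* skip our own rank and slots with no neighbor (check how your version
> >        marks nonexistent neighbors, e.g. at a non-periodic boundary) */
> >     if (nbrs[i] == rank || nbrs[i] < 0) continue;
> >     ierr = PetscPrintf(PETSC_COMM_SELF, "[%d] neighbor rank %d\n", rank, nbrs[i]);CHKERRQ(ierr);
> >   }
> >   PetscFunctionReturn(0);
> > }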
> >
> >   It is the "hope" of PETSc (and I would think most parallel computing
> > models) that the details of exactly what process is computing neighbor
> > values should not matter for your own computation. Maybe if you provide
> > more details on how you wish to use this information, we may have
> > suggestions on how to proceed.
> >
> >   Barry
> >
> >
> >
> > > On Oct 14, 2016, at 9:23 PM, Sang pham van <pvsang002 at gmail.com> wrote:
> > >
> > > Hi Barry,
> > >
> > > In the two-process case the problem is simple, as I know all ghost cells
> > > of partition 0 are updated from partition 1. However, in the case of many
> > > processes, how do I know from which partitions the ghost cells of
> > > partition 0 are updated? In other words, how can I know the neighboring
> > > partitions of partition 0, and can I get a list of the ghost cells managed
> > > by a neighboring partition?
> > > Please let me know if my question is still not clear.
> > >
> > > Many thanks.
> > >
> > >
> > > On Sat, Oct 15, 2016 at 8:59 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > >
> > > > On Oct 14, 2016, at 8:50 PM, Sang pham van <pvsang002 at gmail.com> wrote:
> > > >
> > > > Hi,
> > > >
> > > > I am using a DM Vec for an FV code. For some reasons, I want to know the
> > > > partition of all the ghost cells of a specific partition. Is there a way
> > > > to do that?
> > >
> > >   Could you please explain in more detail what you want? I don't
> > > understand. Perhaps give a specific example with 2 processes?
> > >
> > >  Barry
> > >
> > >
> > >
> > > >
> > > > Many thanks.
> > > >
> > > > Best,
> > > >
> > >
> > >
> >
> >
>
>