[petsc-users] I suggest creating a demo code for the problem of "getting a value from a global vector to a local processor"
Matthew Knepley
knepley at gmail.com
Thu Jan 17 20:20:09 CST 2019
On Thu, Jan 17, 2019 at 9:04 PM leejearl <leejearl at mail.nwpu.edu.cn> wrote:
> Hi Matt,
> Thanks for your reply. We have reached a consensus. I searched the mailing
> list for help with this problem and found a great many misleading answers;
> I think some of them may work only for DMDA. I have now overcome the
> problem by creating a PetscSF and using PetscSFBcast, with the procedure
> described as follows.
>
>
> 1. Use DMGlobalToLocal to update the local vector (the ghost cells might
> get the wrong result).
> 2. Create a PetscSF that matches each local ghost cell to its global
> donor cell across processes.
> 3. Use PetscSFBcast to update the local ghost cells.
>
> Now, my code can work. Can you give me a more scalable procedure?
>
That is a scalable procedure. I do not do it that way because it requires
more steps than just meshing the structure directly.
Thanks,
Matt
> Thanks.
>
> On Thu, 2019-01-17 at 06:43 -0500, Matthew Knepley wrote:
>
> On Thu, Jan 17, 2019 at 3:34 AM leejearl via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hi all PETSc users,
> I have asked for help with several questions, and thanks to the
> developers' replies I now know more about PETSc. I have also searched the
> mailing lists and found many threads focused on this problem. Some of
> them are listed as follows:
>
>
> https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037425.html
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2015-January/024068.html
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2008-November/003633.html
> https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037436.html
> ......
>
> The problem can be summarized as:
> 1. Getting the value of a vector entry owned by another processor.
> 2. Setting the value of a ghost cell, which may need to equal the value
> of a cell on another processor.
>
> I think this problem is very common. I ran into it when treating periodic
> boundaries for the FVM method.
>
>
> Ah, I think I am now seeing the problem. In DMPlex, we do not implement
> periodicity of the mesh by putting in extra communication. We make a
> periodic mesh directly. For example, in 1D what you describe would mesh
> a line segment, and then communicate the values from one end to the other.
> Instead, in Plex we just mesh a circle.
>
> Thanks,
>
> Matt
>
>
> After the DM object is distributed, the donor cell of a boundary cell
> might live on another processor. Since donor cells must be matched
> correctly in pairs, the routines DMGlobalToLocal and DMLocalToGlobal
> alone cannot produce the expected results.
>
> In fact, such a problem is not very difficult in a plain MPI program.
> But since we are coding with PETSc, we always wonder whether it can be
> implemented more easily.
>
> Could the developers create a demo for us to follow? I think such a demo
> would be very useful for users.
>
> Thanks
>
> leejearl
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/