[petsc-users] DMLocalToLocal with DMPlex in Fortran

Mike Michell mi.mike1021 at gmail.com
Wed Oct 12 11:02:18 CDT 2022


Thanks a lot for the reply.

It is possible that it will make a difference. An explicit code, running
completely from local vectors, which assembles its own residuals without
mapping to global vectors,
could probably realize some gain. It would be interesting to see the
numbers.
=> Agree with this. Even with an implicit scheme, I need local-to-local
halo exchange while constructing the local matrix object (for example,
for summation over control-volume components). Thus I expect some
computational gain from this.

In December, I will try to remind you via this email chain.

Thanks,
Mike


> On Tue, Oct 11, 2022 at 6:04 PM Mike Michell <mi.mike1021 at gmail.com>
> wrote:
>
>> Thank you for the reply and checking.
>> Indeed, it seems that local-to-local halo exchange is still not implemented
>> for DMPlex. But I believe this is a much-needed feature for large 3D
>> simulations with DMPlex.
>>
>
> It is possible that it will make a difference. An explicit code, running
> completely from local vectors, which assembles its own residuals without
> mapping to global vectors,
> could probably realize some gain. It would be interesting to see the
> numbers.
>
>
>> Would you mind if I ask for an estimated timeline to fix this issue and
>> include it in the official version of PETSc? If I remember correctly, we had
>> a similar discussion a few months ago.
>>
>
> I cannot do it until November since we have a big review coming up.
>
>   Thanks,
>
>      Matt
>
>
>> Thanks,
>> Mike
>>
>> On Mon, Oct 10, 2022 at 10:41 PM Mike Michell <mi.mike1021 at gmail.com>
>>> wrote:
>>>
>>>> Hi, I was wondering if there is any comment on the example file that I
>>>> can refer to.
>>>>
>>>
>>> I see the problem. Local2Local is not implemented for Plex. I thought we
>>> had this automated, but it was only
>>> coded for DMDA. It is a fairly mechanical transformation of the
>>> Global2Local, just remapping indices, but it
>>> will take some time since there is a lot of backlog this semester.
>>>
>>> I have fixed the error message so now it is obvious what the problem is.
>>>
>>>   Thanks,
>>>
>>>       Matt
>>>
>>>
>>>> Thanks,
>>>> Mike
>>>>
>>>>
>>>>> Thank you for the reply.
>>>>> Sure, a short example code is attached here with a square box mesh and
>>>>> a run script.
>>>>> Inside the source, you may find two versions of halo exchange; one is
>>>>> for local to global (Version-1) and another one is for local to local
>>>>> (Version-2), which is not working in my case.
>>>>> In output.vtu, you will see the halo-exchanged vector resolved to
>>>>> each vertex with (myrank + 1), so if the code runs with 2 procs, you
>>>>> will see 3 at the parallel boundary. In this example, there is no ghost
>>>>> layer.
>>>>>
>>>>> Thanks,
>>>>> Mike
>>>>>
>>>>>
>>>>>> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell <mi.mike1021 at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Thank you for the reply. For me, that file is in
>>>>>>> src/dm/interface/ftn-auto/ instead of the path you mentioned.
>>>>>>>
>>>>>>> After "make allfortranstubs" was done, PETSc was reconfigured and
>>>>>>> reinstalled.
>>>>>>>
>>>>>>> However, I still have the same problem at the line where
>>>>>>> DMLocalToLocalBegin() is used. What I am doing to set up the halo
>>>>>>> exchange is as follows:
>>>>>>> - declare DMPlex
>>>>>>> - PetscSectionCreate()
>>>>>>> - PetscSectionSetNumFields()
>>>>>>> - PetscSectionSetFieldComponents()
>>>>>>> - PetscSectionSetChart()
>>>>>>> - do loop over dofs: PetscSectionSetDof() and
>>>>>>> PetscSectionSetFieldDof()
>>>>>>> - PetscSectionSetUp()
>>>>>>> - DMSetLocalSection()
>>>>>>> - PetscSectionDestroy()
>>>>>>> - DMGetSectionSF()
>>>>>>> - PetscSFSetUp()
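>>>>>>>
>>>>>>> In Fortran, the setup steps above might look roughly like the
>>>>>>> following sketch (assuming a vertex-based section with one field of
>>>>>>> one component; the names dm, section, sf, vStart, vEnd, and p are
>>>>>>> hypothetical placeholders, not from the original code):
>>>>>>>
```fortran
! Sketch only: assumes "dm" is an already-distributed DMPlex.
PetscSection   :: section
PetscSF        :: sf
PetscInt       :: vStart, vEnd, p
PetscErrorCode :: ierr

call PetscSectionCreate(PETSC_COMM_WORLD, section, ierr)
call PetscSectionSetNumFields(section, 1, ierr)
call PetscSectionSetFieldComponents(section, 0, 1, ierr)
! chart over the vertices (depth-0 stratum of the Plex)
call DMPlexGetDepthStratum(dm, 0, vStart, vEnd, ierr)
call PetscSectionSetChart(section, vStart, vEnd, ierr)
do p = vStart, vEnd - 1
  call PetscSectionSetDof(section, p, 1, ierr)
  call PetscSectionSetFieldDof(section, p, 0, 1, ierr)
end do
call PetscSectionSetUp(section, ierr)
call DMSetLocalSection(dm, section, ierr)
call PetscSectionDestroy(section, ierr)
call DMGetSectionSF(dm, sf, ierr)
call PetscSFSetUp(sf, ierr)
```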
>>>>>>>
>>>>>>> Then, the halo exchange is performed as follows:
>>>>>>> - DMGetLocalVector()
>>>>>>> - Fill the local vector
>>>>>>> - DMLocalToLocalBegin() --(*)
>>>>>>> - DMLocalToLocalEnd()
>>>>>>> - DMRestoreLocalVector()
>>>>>>>
>>>>>>> Then, the code crashes at (*).
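>>>>>>>
>>>>>>> For reference, that exchange sequence in Fortran would be something
>>>>>>> like the sketch below (lv is a hypothetical name; INSERT_VALUES is
>>>>>>> assumed here, though ADD_VALUES would be used to sum overlap
>>>>>>> contributions instead of overwriting them):
>>>>>>>
```fortran
call DMGetLocalVector(dm, lv, ierr)
! ... fill the owned entries of lv ...
call DMLocalToLocalBegin(dm, lv, INSERT_VALUES, lv, ierr)   ! -- (*)
call DMLocalToLocalEnd(dm, lv, INSERT_VALUES, lv, ierr)
call DMRestoreLocalVector(dm, lv, ierr)
```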
>>>>>>>
>>>>>>
>>>>>> Can you send something I can run? Then I will find the problem and
>>>>>> fix it.
>>>>>>
>>>>>>   Thanks,
>>>>>>
>>>>>>      Matt
>>>>>>
>>>>>>
>>>>>>> Previously (when PETSc did not support LocalToLocal for DMPlex
>>>>>>> in Fortran), the part above, DMLocalToLocalBegin() and
>>>>>>> DMLocalToLocalEnd(), consisted of:
>>>>>>> - DMLocalToGlobalBegin()
>>>>>>> - DMLocalToGlobalEnd()
>>>>>>> - DMGlobalToLocalBegin()
>>>>>>> - DMGlobalToLocalEnd()
>>>>>>> and it worked okay.
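>>>>>>>
>>>>>>> That workaround round-trips through a global vector, e.g. (gv and
>>>>>>> lv are hypothetical names; the ADD_VALUES/INSERT_VALUES modes shown
>>>>>>> are the common choice for residual assembly and may need adjusting):
>>>>>>>
```fortran
call DMGetGlobalVector(dm, gv, ierr)
! scatter local contributions into the global vector, summing overlaps
call DMLocalToGlobalBegin(dm, lv, ADD_VALUES, gv, ierr)
call DMLocalToGlobalEnd(dm, lv, ADD_VALUES, gv, ierr)
! scatter back to refresh the halo (ghost) entries of the local vector
call DMGlobalToLocalBegin(dm, gv, INSERT_VALUES, lv, ierr)
call DMGlobalToLocalEnd(dm, gv, INSERT_VALUES, lv, ierr)
call DMRestoreGlobalVector(dm, gv, ierr)
```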
>>>>>>>
>>>>>>> I am unclear about which part is causing the problem. Should I define
>>>>>>> the PetscSection and PetscSF differently for local-to-local halo
>>>>>>> exchange?
>>>>>>> Any comment will be appreciated.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Mike
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell <mi.mike1021 at gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> As a follow-up to this email thread,
>>>>>>>>> https://www.mail-archive.com/petsc-users@mcs.anl.gov/msg44070.html
>>>>>>>>>
>>>>>>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available
>>>>>>>>> for DMPlex with Fortran in the latest version of PETSc (3.17.99 from
>>>>>>>>> GitLab)? Matt commented that the Fortran bindings were updated so that
>>>>>>>>> those functions should be available in the latest version of PETSc;
>>>>>>>>> however, they still do not seem to work in my test with DMPlex in
>>>>>>>>> Fortran. Can anyone provide some comments? Perhaps I am missing a
>>>>>>>>> mandatory header file? Currently, I have these headers:
>>>>>>>>>
>>>>>>>>> #include "petsc/finclude/petscvec.h"
>>>>>>>>> #include "petsc/finclude/petscdmplex.h"
>>>>>>>>> #include "petsc/finclude/petscdmlabel.h"
>>>>>>>>> #include "petsc/finclude/petscdm.h"
>>>>>>>>>
>>>>>>>>
>>>>>>>> The source for these functions is in
>>>>>>>>
>>>>>>>>   src/dm/ftn-auto/dmf.c
>>>>>>>>
>>>>>>>> Is it there for you? If not, you can run
>>>>>>>>
>>>>>>>>   make allfortranstubs
>>>>>>>>
>>>>>>>> Fortran functions are not declared, so the header should not matter
>>>>>>>> for compilation, just the libraries for linking.
>>>>>>>>
>>>>>>>>   Thanks,
>>>>>>>>
>>>>>>>>      Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Mike
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> What most experimenters take for granted before they begin their
>>>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>>>> experiments lead.
>>>>>>>> -- Norbert Wiener
>>>>>>>>
>>>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>>>> <http://www.cse.buffalo.edu/~knepley/>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>
>>>
>>
>
>
