[petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

Mike Michell mi.mike1021 at gmail.com
Sat Feb 25 14:11:10 CST 2023


My apologies for the late follow-up; there was a scheduling conflict.

A simple example code related to the issue I mentioned is attached. The
sample code (1) loads the grid into a DM, (2) computes a vertex-wise control
volume for each node in a median-dual fashion, (3) performs a halo exchange
among the processes so that every rank has complete control volume values,
and (4) writes the field to a .vtu file. As a sanity check, the computed
control volumes are also compared against the volumes PETSc computes via
DMPlexComputeCellGeometryFVM() (see lines 771-793 of the attached code).
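
For reference, here is a minimal sketch of that cross-check, written in C
rather than the Fortran my code uses and not taken from the attached code;
the mesh file name "mesh.msh" is a placeholder, and the call signatures are
those of recent PETSc (3.18/main). It sums the cell volumes reported by
DMPlexComputeCellGeometryFVM(), whose total should match the sum of the
median-dual control volumes over all vertices:

    #include <petscdmplex.h>

    int main(int argc, char **argv)
    {
      DM        dm, dmDist = NULL;
      PetscInt  cStart, cEnd, c;
      PetscReal vol, centroid[3], normal[3], localVol = 0.0, totalVol = 0.0;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      /* "mesh.msh" is a placeholder file name */
      PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "mesh", PETSC_TRUE, &dm));
      PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
      if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }

      /* height 0 = cells; with overlap 0 every cell is owned by exactly one rank */
      PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
      for (c = cStart; c < cEnd; ++c) {
        PetscCall(DMPlexComputeCellGeometryFVM(dm, c, &vol, centroid, normal));
        localVol += vol;
      }
      PetscCall(MPIU_Allreduce(&localVol, &totalVol, 1, MPIU_REAL, MPIU_SUM, PETSC_COMM_WORLD));
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "total volume from cell geometry: %g\n", (double)totalVol));

      PetscCall(DMDestroy(&dm));
      PetscCall(PetscFinalize());
      return 0;
    }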

Back to the original problem: I get a correct control volume field with
PETSc 3.18.4, the latest stable release. However, with PETSc from the main
branch I get a strange control volume field. The errors are clearly
concentrated around the parallel (partition) boundaries, so I suspect the
halo communication is going wrong. A snapshot comparing the two results is
also attached. I suspect some part of my code is no longer compatible with
current PETSc, unless there is a bug in the library. Could I get comments
on this?
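
For context, the kind of halo completion I mean is the usual two-phase
DMPlex pattern sketched below. This is an illustrative C sketch under
assumptions, not the attached Fortran code: it assumes the DM already
carries a PetscSection with one dof per vertex, and the rank-local
median-dual accumulation itself is left as a placeholder comment. Shared
vertex contributions are summed with a local-to-global scatter using
ADD_VALUES, and the completed sums are then pushed back to every rank's
halo copy with a global-to-local scatter using INSERT_VALUES:

    #include <petscdm.h>

    static PetscErrorCode CompleteControlVolumes(DM dm, Vec locCV)
    {
      Vec globCV;

      PetscFunctionBeginUser;
      PetscCall(VecZeroEntries(locCV));
      /* ... accumulate this rank's partial median-dual volumes into locCV ... */
      PetscCall(DMGetGlobalVector(dm, &globCV));
      PetscCall(VecZeroEntries(globCV));
      /* phase 1: add up the contributions of every rank sharing a vertex */
      PetscCall(DMLocalToGlobalBegin(dm, locCV, ADD_VALUES, globCV));
      PetscCall(DMLocalToGlobalEnd(dm, locCV, ADD_VALUES, globCV));
      /* phase 2: push the completed sums back into each rank's halo copy */
      PetscCall(DMGlobalToLocalBegin(dm, globCV, INSERT_VALUES, locCV));
      PetscCall(DMGlobalToLocalEnd(dm, globCV, INSERT_VALUES, locCV));
      PetscCall(DMRestoreGlobalVector(dm, &globCV));
      PetscFunctionReturn(0);
    }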

Thanks,
Mike


> On Mon, Feb 20, 2023 at 12:05 PM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Sat, Feb 18, 2023 at 12:00 PM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>> As a follow-up, I tested:
>>>
>>> (1) The tarball for v3.18.4 from the PETSc GitLab
>>> (https://gitlab.com/petsc/petsc/-/tree/v3.18.4) has no issue with DMPlex
>>> halo exchange. This version works as I expect.
>>> (2) A clone of the main branch (git clone https://gitlab.com/petsc/petsc.git)
>>> has issues with DMPlex halo exchange. Something in the main branch is
>>> suspicious with respect to the DMPlex halo: the solution field I get is
>>> not correct, although it works fine with 1 proc.
>>>
>>> Does anyone have any comments on this issue? I am curious whether other
>>> DMPlex users see any problems with halo exchange. FYI, I do not declare
>>> ghost layers for the halo exchange.
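
By "no ghost layers" I mean distributing with overlap 0 and relying only on
the point SF that DMPlexDistribute() creates. The fragment below is an
illustrative C sketch (my actual code is Fortran, and this is not taken
from it) showing that setup and, for contrast, how a one-cell-deep ghost
layer could be added with DMPlexDistributeOverlap():

    #include <petscdmplex.h>

    static PetscErrorCode DistributeAndInspectHalo(DM *dm, PetscBool addGhostLayer)
    {
      DM      dmDist = NULL;
      PetscSF pointSF;

      PetscFunctionBeginUser;
      /* overlap = 0: ranks share only partition-boundary faces/edges/vertices */
      PetscCall(DMPlexDistribute(*dm, 0, NULL, &dmDist));
      if (dmDist) { PetscCall(DMDestroy(dm)); *dm = dmDist; }
      /* the point SF is what DMGlobalToLocal/DMLocalToGlobal traverse */
      PetscCall(DMGetPointSF(*dm, &pointSF));
      PetscCall(PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD));
      if (addGhostLayer) { /* explicit one-cell-deep ghost layer instead */
        DM dmOverlap = NULL;
        PetscCall(DMPlexDistributeOverlap(*dm, 1, NULL, &dmOverlap));
        if (dmOverlap) { PetscCall(DMDestroy(dm)); *dm = dmOverlap; }
      }
      PetscFunctionReturn(0);
    }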
>>>
>>
>> There should not have been any changes there, and there are definitely
>> tests for this.
>>
>> It would be great if you could send something that fails; I could fix it
>> and add it as a test.
>>
>
> Just to follow up, we have tests of the low-level communication (Plex
> tests ex1, ex12, ex18, ex29, and ex31), and then we have tests that use
> halo exchange for PDE calculations, for example SNES tutorials ex12,
> ex13, and ex62. The convergence rates would be off if the halo exchange
> were wrong. Is there any example similar to your code that is failing on
> your installation? Or is there a way to run your code?
>
>   Thanks,
>
>      Matt
>
>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks,
>>> Mike
>>>
>>>
>>>> Dear PETSc team,
>>>>
>>>> I am using PETSc for Fortran with DMPlex. I have been using this
>>>> version of PETSc:
>>>> >>git rev-parse origin
>>>> >>995ec06f924a86c4d28df68d1fdd6572768b0de1
>>>> >>git rev-parse FETCH_HEAD
>>>> >>9a04a86bf40bf893fb82f466a1bc8943d9bc2a6b
>>>>
>>>> There had been no issues until the one with the VTK viewer, which Jed
>>>> fixed today (
>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6081/diffs?commit_id=27ba695b8b62ee2bef0e5776c33883276a2a1735
>>>> ).
>>>>
>>>> Since that MR has been merged into the main repo, I pulled the latest
>>>> version of PETSc (in fact, I cloned it from scratch). But if I build
>>>> with the same configure options as before and run my code, there is an
>>>> issue with the halo exchange. The code runs without any error message,
>>>> but it produces a wrong solution field. I suspect the issue is related
>>>> to the graph partitioner or to the halo exchange, because the solution
>>>> is correct when I run the code on 1 proc. I only updated the PETSc
>>>> version; there was no change in my own code. Could I get any comments
>>>> on this? I was wondering whether there have been significant changes in
>>>> the halo exchange or in the graph partitioning and distribution parts
>>>> of DMPlex.
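
Looking back at this, one way to separate those two suspects is to pin the
partitioner to the deterministic "simple" type before distributing, either
with the command-line option -petscpartitioner_type simple or as in the
illustrative C sketch below (not my actual Fortran code):

    #include <petscdmplex.h>

    static PetscErrorCode DistributeWithSimplePartitioner(DM *dm)
    {
      PetscPartitioner part;
      DM               dmDist = NULL;

      PetscFunctionBeginUser;
      PetscCall(DMPlexGetPartitioner(*dm, &part));
      PetscCall(PetscPartitionerSetType(part, PETSCPARTITIONERSIMPLE));
      PetscCall(PetscPartitionerSetFromOptions(part)); /* still overridable at run time */
      PetscCall(DMPlexDistribute(*dm, 0, NULL, &dmDist));
      if (dmDist) { PetscCall(DMDestroy(dm)); *dm = dmDist; }
      PetscFunctionReturn(0);
    }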
>>>>
>>>> Thanks,
>>>> Mike
>>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Test_Version.tar
Type: application/x-tar
Size: 665600 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20230225/945f79ff/attachment-0001.tar>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: output_example_snapshot.png
Type: image/png
Size: 249452 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20230225/945f79ff/attachment-0001.png>

