[petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile Is Used
Matthew Knepley
knepley at gmail.com
Tue Apr 26 14:26:15 CDT 2022
On Tue, Apr 26, 2022 at 9:33 AM Mike Michell <mi.mike1021 at gmail.com> wrote:
> Thank you for the answers.
> For the first question: basically, I cannot run the
> "/dm/impls/plex/ex1f90.F90" example with more than one process. I removed
> DMPlexDistribute() following your comment, and what I tried is:
>
> - no modification to ex1f90.F90 (as it is)
> - make "ex1f90"
> - mpirun -np 2 ./ex1f90
>
> It gives me "Bad termination of one of ..." for Rank 1. The code runs okay
> with "mpirun -np 1 ./ex1f90".
>
You are correct. It evidently never worked in parallel, since those checks
will only work in serial.
I have fixed the code and added a parallel test. The new file is
attached, and it is also in this merge request:
https://gitlab.com/petsc/petsc/-/merge_requests/5173
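For context, the failing checks in ex1f90.F90 only hold in serial,
presumably because they compare local mesh sizes against values that are
valid on a single rank. A minimal sketch of a rank-safe check instead
(assuming a DM named dm is already set up; the variable names are
illustrative, not taken from the actual fix):

      PetscInt       :: cStart, cEnd, nLocal, nGlobal
      PetscErrorCode :: ierr

      ! Cells are the height-0 stratum of the Plex
      call DMPlexGetHeightStratum(dm, 0, cStart, cEnd, ierr)
      nLocal = cEnd - cStart
      ! Sum the per-rank counts so every rank sees the global total
      call MPI_Allreduce(nLocal, nGlobal, 1, MPIU_INTEGER, MPI_SUM, &
                         PETSC_COMM_WORLD, ierr)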
Thanks,
Matt
> Thanks,
> Mike
>
> On Tue, Apr 26, 2022 at 3:56 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Mon, Apr 25, 2022 at 9:41 PM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>> Dear PETSc developer team,
>>>
>>> I'm trying to learn DMPlex to build a parallel finite volume code in 2D
>>> & 3D. More specifically, I want to read a grid from a .msh file generated
>>> by Gmsh. For practice, I modified the /dm/impls/plex/ex1f90.F90 case to
>>> read & distribute my sample 2D grid, which is attached. I have two
>>> questions, as below:
>>>
>>> (1) First, if I do not use my grid, but use the default box grid built
>>> by ex1f90.F90, and if I try "DMPlexDistribute" over MPI processes, the
>>> output file (sol.vtk) contains only a portion of the entire mesh. How can
>>> I print the entire thing into a single file? Is there any example of
>>> parallel output? (Related files attached under "/Question_1/") ParaView
>>> gives me error messages about mismatched data sizes.
>>>
>>
>> For the last release, we made parallel distribution the default. Thus,
>> you do not need to call DMPlexDistribute() explicitly here. Taking it out,
>> I can run your example.
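>> In case a sketch helps, the creation sequence then reduces to something
>> like the following (error checking omitted; as I understand it, the
>> distribution now happens while DMSetFromOptions() processes the options):
>>
>>   call DMCreate(PETSC_COMM_WORLD, dm, ierr)
>>   call DMSetType(dm, DMPLEX, ierr)
>>   ! Applies -dm_plex_* options and distributes the mesh over the ranks
>>   call DMSetFromOptions(dm, ierr)
>>   call DMView(dm, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>
>> For the single-file output, the legacy VTK viewer gathers the
>> distributed mesh before writing, so viewing the DM with a viewer set up
>> for "sol.vtk" should produce one file containing the whole mesh.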
>>
>>
>>> (2) If I create the DMPlex object through "DMPlexCreateGmshFromFile",
>>> the "DMPlexCreateSection" call crashes. I do not understand why my example
>>> code does not work, because the only change was switching from "DMCreate"
>>> to "DMPlexCreateGmshFromFile" and providing the "new.msh" file. Without the
>>> PetscSection object, the code works fine. Any comments on this? (Related
>>> files attached under "/Question_2/")
>>>
>>
>> If I remove DMPlexDistribute() from this code, it is clear that the
>> problem is with the "marker" label. We do not create this by default from
>> Gmsh, since we assume people have defined their own labels. You can pass
>>
>> -dm_plex_gmsh_use_marker
>>
>> to your code. When I do this, your example runs for me.
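>> For example (the executable name is just a placeholder):
>>
>>   mpirun -np 2 ./ex1f90 -dm_plex_gmsh_use_marker
>>
>> If you want to confirm from the code that the label came through before
>> building the PetscSection, a sketch:
>>
>>   DMLabel :: label
>>   call DMGetLabel(dm, 'marker', label, ierr)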
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Thanks,
>>> Mike
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
Attachment: ex1f90.F90 (application/octet-stream, 4705 bytes)
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20220426/b4eecf2c/attachment.obj>