[petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

Matthew Knepley knepley at gmail.com
Fri Apr 29 15:49:10 CDT 2022


On Fri, Apr 29, 2022 at 12:40 PM Mike Michell <mi.mike1021 at gmail.com> wrote:

> Thanks for the helpful idea. For the "median-dual" control volume, which is
> shaped by connecting {cell centroids - face centroids - edge midpoints}, I
> realized the following:
> (a) Vertices in the original DMPlex (read from a mesh file) will become the
> list of cells in the reconstructed median-dual DMPlex.
> (b) Cells, and the interpolated edges and faces, in the original DMPlex will
> become the list of vertices in the median-dual DMPlex.
> (c) Once the median-dual DMPlex is built by (a) and (b), new faces and edges
> can be interpolated.
>
> Is there a smart way to create the new DMPlex by doing (a) and (b)? I was
> thinking about writing the original topology into a .dat file and then using
> DMPlexCreateCellVertexFromFile() to create the new plex, but that does not
> look like an efficient way. How can I tell the new DMPlex to map all the
> original vertices into the new cell list in the DAG? Any recommendations?
>

I think we could do at least two things:

  1) Just swap all calls DMPlexGetCone() <-> DMPlexGetSupport(). This could
happen just by wrapping up the calls, so no data structures change. However,
we would have to be careful that nothing looks directly at the data. A
minimal sketch of this wrapping is below.

  2) We could reverse the point storage in-place. This is a little more
intrusive, but everything would work seamlessly. It would take more work to
do this in parallel, but not all that much.
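
For 1), here is a minimal sketch of the kind of wrapping I mean. The wrapper
names DualGetCone()/DualGetSupport() are just illustrative; only the
DMPlexGet{Cone,Support}[Size]() calls are actual Plex API:

  #include <petscdmplex.h>

  /* The cone of a point in the dual is the support of that point in the
     original mesh, and vice versa, so the wrappers just swap the queries. */
  static PetscErrorCode DualGetCone(DM dm, PetscInt p, PetscInt *size, const PetscInt **cone)
  {
    PetscFunctionBeginUser;
    PetscCall(DMPlexGetSupportSize(dm, p, size));
    PetscCall(DMPlexGetSupport(dm, p, cone));
    PetscFunctionReturn(0);
  }

  static PetscErrorCode DualGetSupport(DM dm, PetscInt p, PetscInt *size, const PetscInt **supp)
  {
    PetscFunctionBeginUser;
    PetscCall(DMPlexGetConeSize(dm, p, size));
    PetscCall(DMPlexGetCone(dm, p, supp));
    PetscFunctionReturn(0);
  }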

  Thanks,

     Matt


> Thanks,
> Mike
>
> On Fri, Apr 29, 2022 at 7:51 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Apr 29, 2022 at 8:27 AM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>>
>>> Thanks for the answers, and I agree. Creating the dual mesh when the code
>>> starts and using that dm will be the easiest way.
>>> But I am confused about how to achieve that. The entire DAG of the
>>> original mesh should change, except for the vertices. How can PETSc
>>> rebuild the DAG for cells and edges (and faces in 3D)?
>>>
>>
>> In this paper (https://arxiv.org/pdf/0908.4427.pdf), Section 3.1, we
>> show that the dual is just the original DAG with the arrows
>> reversed, so it is trivial to construct. There are two issues:
>>
>>   1) You have to compute the geometry once you have the topology, but you
>> already do that (a sketch is below).
>>
>>   2) In parallel you will produce a mesh with cell overlap, but I think
>> this is normal for dual methods. We have special methods
>>       to get around this (we use them in mesh partitioning), which we can
>> use to parallelize this step.
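>>
>> For 1), a minimal sketch of that geometry pass over an interpolated DMPlex
>> might look like the following (DMPlexComputeCellGeometryFVM() is the actual
>> Plex call; how you store the results is up to you):
>>
>>   PetscInt  cStart, cEnd, fStart, fEnd, p;
>>   PetscReal vol, centroid[3], normal[3];
>>
>>   PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd)); /* cells */
>>   PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd)); /* faces */
>>   for (p = cStart; p < cEnd; ++p) {
>>     PetscCall(DMPlexComputeCellGeometryFVM(dm, p, &vol, centroid, normal));
>>     /* cache the cell volume and centroid here */
>>   }
>>   for (p = fStart; p < fEnd; ++p) {
>>     PetscCall(DMPlexComputeCellGeometryFVM(dm, p, &vol, centroid, normal));
>>     /* cache the face area (vol), centroid, and unit normal here */
>>   }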
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> In 2D for example, I need to reconstruct median-dual edges and cells by
>>> connecting the cell-centroids and edge-centers in the original mesh.
>>>
>>> Thanks,
>>> Mike
>>>
>>> On Wed, Apr 27, 2022 at 11:31 PM Mike Michell <mi.mike1021 at gmail.com>
>>>> wrote:
>>>>
>>>>> Thanks for the answers. The major reason not to store those dual cells
>>>>> is to share the same grid file with other codes that are required for
>>>>> coupled physics modeling.
>>>>>
>>>>
>>>> Yes, I would create the dual on startup and throw away the original
>>>> mesh.
>>>>
>>>>
>>>>> As a follow-up question, I am struggling to use PetscSection. I attached
>>>>> my example code.
>>>>> The PETSc-provided example "ex1f90.F90" uses DMPlexCreateSection() to
>>>>> make a PetscSection object.
>>>>> However, after reading the .msh file and creating a DMPlex, none of the
>>>>> labels works with it, except for the 'marker' label, which must be enabled
>>>>> through the "-dm_plex_gmsh_use_marker" option when a Gmsh file is used.
>>>>>
>>>>
>>>> I have replaced your startup code with the canonical code we use in C
>>>> examples. It should be more flexible and robust.
>>>> Here is the concise output for your mesh:
>>>>
>>>> master *:~/Downloads/tmp/Mitchell/Question_3$ ./test -dm_plex_filename
>>>> new.msh -dm_view
>>>> DM Object: 1 MPI processes
>>>>   type: plex
>>>> DM_0x84000000_1 in 2 dimensions:
>>>>   Number of 0-cells per rank: 44
>>>>   Number of 1-cells per rank: 109
>>>>   Number of 2-cells per rank: 66
>>>> Labels:
>>>>   celltype: 3 strata with value/size (0 (44), 3 (66), 1 (109))
>>>>   depth: 3 strata with value/size (0 (44), 1 (109), 2 (66))
>>>>   Face Sets: 4 strata with value/size (5 (5), 7 (5), 6 (5), 8 (5))
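>>>>
>>>> In case it is useful, a minimal sketch of that canonical startup sequence
>>>> (shown in C; the Fortran interface mirrors it) is:
>>>>
>>>>   DM dm;
>>>>
>>>>   PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
>>>>   PetscCall(DMSetType(dm, DMPLEX));
>>>>   PetscCall(DMSetFromOptions(dm)); /* handles -dm_plex_filename, distribution, ... */
>>>>   PetscCall(DMViewFromOptions(dm, NULL, "-dm_view"));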
>>>>
>>>> The original code was built to map GMsh markers to canonical names
>>>> because users wanted that. However, if you want your names preserved, you
>>>> can use
>>>>
>>>> master *:~/Downloads/tmp/Mitchell/Question_3$ ./test -dm_plex_filename
>>>> new.msh -dm_view -dm_plex_gmsh_use_regions
>>>>  MPI information ...            0           1
>>>> DM Object: 1 MPI processes
>>>>   type: plex
>>>> DM_0x84000000_1 in 2 dimensions:
>>>>   Number of 0-cells per rank: 44
>>>>   Number of 1-cells per rank: 109
>>>>   Number of 2-cells per rank: 66
>>>> Labels:
>>>>   celltype: 3 strata with value/size (0 (44), 3 (66), 1 (109))
>>>>   depth: 3 strata with value/size (0 (44), 1 (109), 2 (66))
>>>>   inlet: 1 strata with value/size (5 (5))
>>>>   upper: 1 strata with value/size (7 (5))
>>>>   outlet: 1 strata with value/size (6 (5))
>>>>   lower: 1 strata with value/size (8 (5))
>>>>
>>>> There is also code to mark vertices, which some users do not want, that
>>>> you can turn on using -dm_plex_gmsh_mark_vertices. I use both of these
>>>> options with PyLith.
>>>>
>>>>
>>>>> In my .msh file there are several boundary conditions, and if I declare
>>>>> labels using them and try to pass them to DMGetStratumIS(), the code
>>>>> crashes. It only works with the 'marker' label. Any comments?
>>>>> More importantly, trying to use DMPlexCreateSection() is quite painful.
>>>>>
>>>>
>>>> What is not working for you?
>>>>
>>>>
>>>>> Therefore, I tried to create the PetscSection object through
>>>>> PetscSectionCreate(), as attached. But the declared vectors are not
>>>>> printed out when I use this approach.
>>>>>
>>>>
>>>> If you use the default viewer (PETSC_VIEWER_STDOUT_WORLD), you can see
>>>> them printed. You are using VTK with a Section that puts values on all
>>>> mesh points. VTK only understands cell fields and vertex fields, so the
>>>> code does not know how to output this vector. If you define values only on
>>>> cells or vertices, you will see the output, I believe.
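>>>>
>>>> For example, a minimal sketch of a Section with one degree of freedom per
>>>> vertex only (so that VTK can interpret the Vec as a vertex field; the dof
>>>> count is just illustrative) would be:
>>>>
>>>>   PetscSection s;
>>>>   PetscInt     vStart, vEnd, v;
>>>>
>>>>   PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* vertices */
>>>>   PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &s));
>>>>   PetscCall(PetscSectionSetChart(s, vStart, vEnd));
>>>>   for (v = vStart; v < vEnd; ++v) PetscCall(PetscSectionSetDof(s, v, 1));
>>>>   PetscCall(PetscSectionSetUp(s));
>>>>   PetscCall(DMSetLocalSection(dm, s));
>>>>   PetscCall(PetscSectionDestroy(&s));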
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> I guess the section object is not properly linked to the dm object.
>>>>> Basically, my vector objects (x, y, vel) are not seen by the dm viewer or
>>>>> in the relevant output file. Do you have any recommendations?
>>>>>
>>>>> Thanks,
>>>>> Mike
>>>>>
>>>>>
>>>>> On Tue, Apr 26, 2022 at 7:27 PM Mike Michell <mi.mike1021 at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Thanks for the answers. One last question related to designing my
>>>>>>> code.
>>>>>>> What I want to do is build a finite-volume code discretized on a
>>>>>>> "median dual" unstructured mesh. So basically it will use a
>>>>>>> vertex-centered discretization scheme, looping over "median dual faces",
>>>>>>> which do not physically exist in my mesh file.
>>>>>>>
>>>>>>> I can achieve this by allocating some vectors to compute the required
>>>>>>> geometrical metrics (e.g., the face normal vectors, face areas, etc. that
>>>>>>> are required to integrate over the median dual control volumes), as I did
>>>>>>> before even without PETSc. However, I think this cannot be an optimal way,
>>>>>>> and I believe a much larger benefit could be obtained if I design my
>>>>>>> DMPlex data structures (e.g., PetscSection) well.
>>>>>>> Is there any example with this median dual control volume case? Any
>>>>>>> guideline would be helpful.
>>>>>>> I was thinking of borrowing some functions designed for finite
>>>>>>> elements.
>>>>>>
>>>>>> I have never done a dual scheme before, so let me ask some questions
>>>>>> to better understand the underlying ideas.
>>>>>>
>>>>>> 1) Is there a reason not to store the dual directly? That seems like
>>>>>> the easiest approach.
>>>>>>
>>>>>> 2) You can use Plex to layout auxiliary geometric data. I do it the
>>>>>> following way:
>>>>>>
>>>>>>   DMClone(dm, &dmGeom);
>>>>>>   <Create the Section s you want>
>>>>>>   DMSetLocalSection(dmGeom, s);
>>>>>>   PetscSectionDestroy(&s);
>>>>>>
>>>>>> then
>>>>>>
>>>>>>   DMCreateLocalVector(dmGeom, &lv);
>>>>>>
>>>>>> will give you a Vec with the local data in it that can be addressed
>>>>>> by mesh point (global vectors too). Also you would be able
>>>>>> to communicate this data if the mesh is redistributed, or replicate
>>>>>> this data if you overlap cells in parallel.
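>>>>>>
>>>>>> For example, a minimal sketch of addressing that local Vec by mesh point
>>>>>> (assuming the Section stored, say, 3 geometric scalars per face; the
>>>>>> numbers here are just illustrative):
>>>>>>
>>>>>>   PetscSection s;
>>>>>>   PetscScalar *a;
>>>>>>   PetscInt     fStart, fEnd, f, off;
>>>>>>
>>>>>>   DMGetLocalSection(dmGeom, &s);
>>>>>>   DMPlexGetHeightStratum(dmGeom, 1, &fStart, &fEnd); /* faces */
>>>>>>   VecGetArray(lv, &a);
>>>>>>   for (f = fStart; f < fEnd; ++f) {
>>>>>>     PetscSectionGetOffset(s, f, &off);
>>>>>>     a[off + 0] = 0.0; /* e.g., face area     */
>>>>>>     a[off + 1] = 0.0; /* e.g., unit normal x */
>>>>>>     a[off + 2] = 0.0; /* e.g., unit normal y */
>>>>>>   }
>>>>>>   VecRestoreArray(lv, &a);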
>>>>>>
>>>>>>   Thanks,
>>>>>>
>>>>>>      Matt
>>>>>>
>>>>>>
>>>>>>> Thanks,
>>>>>>> Mike
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Apr 26, 2022 at 4:48 PM Mike Michell <mi.mike1021 at gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> I found two interesting things, listed below:
>>>>>>>>> - If DMPlexCreateGmshFromFile() is used, DMPlexDistribute() is not
>>>>>>>>> done by default. I have to call DMPlexDistribute() to distribute the
>>>>>>>>> DMPlex over the procs.
>>>>>>>>>
>>>>>>>>
>>>>>>>> Here is the explanation. The default distribution comes from
>>>>>>>> DMSetFromOptions(), which I am guessing was not called after
>>>>>>>> DMPlexCreateGmshFromFile().
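>>>>>>>>
>>>>>>>> If you stay with DMPlexCreateGmshFromFile(), a minimal sketch of the
>>>>>>>> explicit distribution step (assuming zero cell overlap) would be:
>>>>>>>>
>>>>>>>>   DM dm, dmDist;
>>>>>>>>
>>>>>>>>   PetscCall(DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, "new.msh", PETSC_TRUE, &dm));
>>>>>>>>   PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
>>>>>>>>   if (dmDist) { /* NULL on a single process, where no distribution happens */
>>>>>>>>     PetscCall(DMDestroy(&dm));
>>>>>>>>     dm = dmDist;
>>>>>>>>   }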
>>>>>>>>
>>>>>>>>
>>>>>>>>> - If {DMCreate(), DMSetType(), DMSetFromOptions()} are used in
>>>>>>>>> sequence with the "-dm_plex_filename ./new.msh" option on the command
>>>>>>>>> line, DMPlexDistribute() is done by default, even though the DMPlex
>>>>>>>>> object is created by reading the same .msh file.
>>>>>>>>>
>>>>>>>>> Thanks for the modification to the example ex1f90.F90; now it works
>>>>>>>>> with 2 procs.
>>>>>>>>> But still, if I print my output to "sol.vtk", the file has only a
>>>>>>>>> part of the whole mesh. However, the file is okay if I print to
>>>>>>>>> "sol.vtu". From "sol.vtu" I can see the entire field with ranks. Is
>>>>>>>>> the .vtu format preferred by PETSc?
>>>>>>>>>
>>>>>>>>
>>>>>>>> VTK is generally for debugging, but it should work. I will take a
>>>>>>>> look.
>>>>>>>>
>>>>>>>> VTU and HDF5 are the preferred formats.
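>>>>>>>>
>>>>>>>> For example, a minimal sketch of writing the .vtu file (assuming a Vec
>>>>>>>> v that carries a Section on the DM):
>>>>>>>>
>>>>>>>>   PetscViewer viewer;
>>>>>>>>
>>>>>>>>   PetscCall(PetscViewerVTKOpen(PETSC_COMM_WORLD, "sol.vtu", FILE_MODE_WRITE, &viewer));
>>>>>>>>   PetscCall(VecView(v, viewer));
>>>>>>>>   PetscCall(PetscViewerDestroy(&viewer));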
>>>>>>>>
>>>>>>>>   Thanks,
>>>>>>>>
>>>>>>>>      Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Apr 26, 2022 at 9:33 AM Mike Michell <
>>>>>>>>>> mi.mike1021 at gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Thank you for the answers.
>>>>>>>>>>> For the first question, basically, I cannot run the
>>>>>>>>>>> "/dm/impls/plex/ex1f90.F90" example with more than 1 proc. I removed
>>>>>>>>>>> DMPlexDistribute() following your comment, and what I tried is:
>>>>>>>>>>>
>>>>>>>>>>> - no modification to ex1f90.F90 (as it is)
>>>>>>>>>>> - make "ex1f90"
>>>>>>>>>>> - mpirun -np 2 ./ex1f90
>>>>>>>>>>>
>>>>>>>>>>> It gives me "Bad termination of one of ..." for Rank 1. The code
>>>>>>>>>>> runs okay with "mpirun -np 1 ./ex1f90".
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> You are correct. It evidently never worked in parallel, since
>>>>>>>>>> those checks will only work in serial.
>>>>>>>>>> I have fixed the code, and added a parallel test. I have attached
>>>>>>>>>> the new file, but it is also in this MR:
>>>>>>>>>>
>>>>>>>>>>   https://gitlab.com/petsc/petsc/-/merge_requests/5173
>>>>>>>>>>
>>>>>>>>>>   Thanks,
>>>>>>>>>>
>>>>>>>>>>      Matt
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>> Thanks,
>>>>>>>>>>> Mike
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Mon, Apr 25, 2022 at 9:41 PM Mike Michell <
>>>>>>>>>>>> mi.mike1021 at gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Dear PETSc developer team,
>>>>>>>>>>>>>
>>>>>>>>>>>>> I'm trying to learn DMPlex to build a parallel finite volume
>>>>>>>>>>>>> code in 2D & 3D. More specifically, I want to read a grid from a
>>>>>>>>>>>>> .msh file generated by Gmsh.
>>>>>>>>>>>>> For practice, I modified the /dm/impls/plex/ex1f90.F90 case to
>>>>>>>>>>>>> read & distribute my sample 2D grid, which is attached. I have two
>>>>>>>>>>>>> questions, as below:
>>>>>>>>>>>>>
>>>>>>>>>>>>> (1) First, if I do not use my grid but the default box grid
>>>>>>>>>>>>> built by ex1f90.F90, and if I try "DMPlexDistribute" over MPI
>>>>>>>>>>>>> processes, the output file (sol.vtk) has only some portion of the
>>>>>>>>>>>>> entire mesh. How can I print the entire thing into a single file? Is
>>>>>>>>>>>>> there any example for parallel output? (Related files attached in
>>>>>>>>>>>>> "/Question_1/") Paraview gives me some error messages about data size
>>>>>>>>>>>>> mismatches.
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> For the last release, we made parallel distribution the default.
>>>>>>>>>>>> Thus, you do not need to call DMPlexDistribute() explicitly here.
>>>>>>>>>>>> Taking it out, I can run your example.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> (2) If I create the DMPlex object through
>>>>>>>>>>>>> "DMPlexCreateGmshFromFile", the "DMPlexCreateSection" part crashes.
>>>>>>>>>>>>> I do not understand why my example code does not work, because the
>>>>>>>>>>>>> only change was switching from "DMCreate" to
>>>>>>>>>>>>> "DMPlexCreateGmshFromFile" and providing the "new.msh" file. Without
>>>>>>>>>>>>> the PetscSection object, the code works fine. Any comments about
>>>>>>>>>>>>> this? (Related files attached in "/Question_2/")
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> If I remove DMPlexDistribute() from this code, it is clear that
>>>>>>>>>>>> the problem is with the "marker" label. We do not create this by default
>>>>>>>>>>>> from GMsh since we assume people have defined their own labels. You can pass
>>>>>>>>>>>>
>>>>>>>>>>>>   -dm_plex_gmsh_use_marker
>>>>>>>>>>>>
>>>>>>>>>>>> to your code. When I do this, your example runs for me.
>>>>>>>>>>>>
>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>
>>>>>>>>>>>>     Matt
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>> Mike
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

