[petsc-users] Parallel DMPlex

Matthew Knepley knepley at gmail.com
Tue Oct 10 19:33:18 CDT 2023


On Tue, Oct 10, 2023 at 7:01 PM erdemguer <erdemguer at proton.me> wrote:

>
> Hi,
> Sorry for my late response. I tried your suggestions and I think I've made
> progress, but I still have issues. Let me explain my latest mesh routine
> (a rough code sketch follows the list):
>
>
>    1. DMPlexCreateBoxMesh
>    2. DMSetFromOptions
>    3. PetscSectionCreate
>    4. PetscSectionSetNumFields
>    5. PetscSectionSetFieldDof
>    6. PetscSectionSetDof
>    7. PetscSectionSetUp
>    8. DMSetLocalSection
>    9. DMSetAdjacency
>    10. DMPlexDistribute
>
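> In code, that routine is roughly the following (a minimal sketch; the dof
> counts are placeholders, and note the PetscSectionSetChart call, which the
> list above omits but which is needed before setting any dofs):
>
>   DM           dm, dmDist;
>   PetscSection s;
>   PetscInt     cStart, cEnd, c;
>
>   /* 1: argument list abbreviated; it varies slightly across PETSc versions */
>   PetscCall(DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE, NULL,
>                                 NULL, NULL, NULL, PETSC_TRUE, &dm));
>   PetscCall(DMSetFromOptions(dm));                            /* 2 */
>   PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &s));        /* 3 */
>   PetscCall(PetscSectionSetNumFields(s, 1));                  /* 4 */
>   PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));   /* cells */
>   PetscCall(PetscSectionSetChart(s, cStart, cEnd));
>   for (c = cStart; c < cEnd; ++c) {
>     PetscCall(PetscSectionSetFieldDof(s, c, 0, 1));           /* 5 */
>     PetscCall(PetscSectionSetDof(s, c, 1));                   /* 6 */
>   }
>   PetscCall(PetscSectionSetUp(s));                            /* 7 */
>   PetscCall(DMSetLocalSection(dm, s));                        /* 8 */
>   PetscCall(PetscSectionDestroy(&s));
>   PetscCall(DMSetAdjacency(dm, 0, PETSC_TRUE, PETSC_FALSE));  /* 9: FV-style */
>   PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));          /* 10: overlap = 1 */
>   if (dmDist) {
>     PetscCall(DMDestroy(&dm));
>     dm = dmDist;
>   }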
>
> It's still not working, but it's promising: if I call DMPlexGetDepthStratum
> for cells, I can see that after distribution each processor has more cells.
>

Please send the output of DMView() for each incarnation of the mesh.
What I do is put

  DMViewFromOptions(dm, NULL, "-dm1_view");

with a different string after each call.
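
For example (illustrative only; the option names are arbitrary, and dmDist
stands for the mesh returned by DMPlexDistribute):

  DMViewFromOptions(dm, NULL, "-dm1_view");      /* after DMPlexCreateBoxMesh */
  DMViewFromOptions(dmDist, NULL, "-dm2_view");  /* after DMPlexDistribute */

and then run with -dm1_view -dm2_view to print each incarnation.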


> But I couldn't figure out how to determine where the ghost/processor
> boundary cells start.
>

Please send the actual code because the above is not specific enough. For
example, you will not have
"ghost cells" unless you partition with overlap. This is because by default
cells are the partitioned quantity,
so each process gets a unique set.
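
For illustration, a minimal sketch of one way to see this: distribute with a
nonzero overlap, then look at the point SF, where the overlap cells appear as
leaves because another process owns them:

  DM              dmDist;
  PetscSF         pointSF;
  PetscInt        nleaves, cStart, cEnd;
  const PetscInt *ilocal;

  PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));  /* overlap = 1 */
  PetscCall(DMGetPointSF(dmDist, &pointSF));
  PetscCall(PetscSFGetGraph(pointSF, NULL, &nleaves, &ilocal, NULL));
  PetscCall(DMPlexGetHeightStratum(dmDist, 0, &cStart, &cEnd));
  /* A cell c in [cStart, cEnd) that appears among the leaves (ilocal, or
     0..nleaves-1 when ilocal is NULL) is owned by another rank. */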

  Thanks,

      Matt


> In older mails I saw there is a function DMPlexGetHybridBounds, but I
> think that function is deprecated. I tried to use
> DMPlexGetCellTypeStratum as in ts/tutorials/ex11_sa.c, but I'm getting -1
> as cEndInterior both before and after distribution. I tried it for the
> DM_POLYTOPE_FV_GHOST and DM_POLYTOPE_INTERIOR_GHOST polytope types. I also
> tried calling DMPlexComputeCellTypes before DMPlexGetCellTypeStratum, but
> nothing changed. I think I can calculate the ghost cell indices using
> cStart/cEnd before & after distribution, but I suspect there is a better
> way that I'm currently missing.
>
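> For reference, what I tried looks roughly like this (a sketch):
>
>   PetscInt ghostStart = -1, ghostEnd = -1;
>
>   PetscCall(DMPlexComputeCellTypes(dm));
>   PetscCall(DMPlexGetCellTypeStratum(dm, DM_POLYTOPE_FV_GHOST,
>                                      &ghostStart, &ghostEnd));
>   /* ghostStart/ghostEnd stay at -1 both before and after distribution */
>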
> Thanks again,
> Guer.
>
> ------- Original Message -------
> On Thursday, September 28th, 2023 at 10:42 PM, Matthew Knepley <
> knepley at gmail.com> wrote:
>
> On Thu, Sep 28, 2023 at 3:38 PM erdemguer via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Hi,
>>
>> I am currently using DMPlex in my code. It runs serially at the moment,
>> but I'm interested in adding parallel options. Here is my workflow:
>>
>>    1. Create a DMPlex mesh from GMSH.
>>    2. Reorder it with DMPlexPermute.
>>    3. Create the necessary pre-processing arrays related to the mesh/problem.
>>    4. Create field(s) with multiple dofs.
>>    5. Create residual vectors.
>>    6. Define a function to calculate the residual for each cell, and use SNES.
>>
>> As you can see, I'm not using FV or FE structures (as most examples do); a
>> rough code sketch of this workflow follows below.
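>>
>> In code, the serial part is roughly (a sketch; the file name and the
>> ordering type are placeholders):
>>
>>   DM dm, pdm;
>>   IS perm;
>>
>>   PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "mesh",
>>                                  PETSC_TRUE, &dm));
>>   PetscCall(DMPlexGetOrdering(dm, MATORDERINGRCM, NULL, &perm));
>>   PetscCall(DMPlexPermute(dm, perm, &pdm));
>>   PetscCall(ISDestroy(&perm));
>>   PetscCall(DMDestroy(&dm));
>>   dm = pdm;
>>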
>> Now, I'm trying to implement this in parallel using a similar approach.
>> However, I'm struggling to understand how to create corresponding vectors
>> and how to obtain index sets for each processor. Is there a tutorial or
>> paper that covers this topic?
>>
>
> The intention was that there is enough information in the manual to do
> this.
>
> Using PetscFE/PetscFV is not required. However, I strongly encourage you
> to use PetscSection. Without this, it would be incredibly hard to do what
> you want. Once the DM has a Section, it can do things like automatically
> create vectors and matrices for you. It can redistribute them, subset them,
> etc. The Section describes how dofs are assigned to pieces of the mesh
> (mesh points). This is in the manual, and there are a few examples that do
> it by hand.
>
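> For instance, once the Section is set, this kind of thing works (a minimal
> sketch):
>
>   Vec gv, lv;
>   Mat J;
>
>   /* assuming dm already has its local Section attached via DMSetLocalSection */
>   PetscCall(DMCreateGlobalVector(dm, &gv)); /* layout taken from the Section */
>   PetscCall(DMCreateLocalVector(dm, &lv));  /* includes unowned (ghost) points */
>   PetscCall(DMCreateMatrix(dm, &J));        /* sparsity from Section + adjacency */
>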
> So I suggest changing your code to use PetscSection, and then letting us
> know if things still do not work.
>
> Thanks,
>
> Matt
>
>> Thank you.
>> Guer.
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/