[petsc-users] Advices on creating DMPlex from custom input format

onur.notonur onur.notonur at proton.me
Tue Oct 31 04:59:27 CDT 2023


Dear Matt and Jed,

Thank you so much for your insights.

Jed, as far as I know, the format is custom internal structure. I will double-check this. If it is used outside, I'm more than willing to contribute the reader.

Best,

Onur

Sent with [Proton Mail](https://proton.me/) secure email.

------- Original Message -------
On Monday, October 30th, 2023 at 8:16 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Oct 30, 2023 at 5:37 AM onur.notonur via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>> Hi,
>>
>> I hope this message finds you all in good health and high spirits.
>>
>> I wanted to discuss an approach to reading and processing problem input files in a solver that uses PETSc DMPlex. Our team has a range of solvers; only this one is built on PETSc, but they all share a common problem input format. This format includes essential data such as node coordinates, element connectivity, element-based boundary conditions, and problem-specific metadata. I currently create an array of boundary points on each rank and use it in our computations, but this is hardcoded; I need to start reading these input files, and I am not sure about the right approach.
>>
>> Here's what I have in mind:
>>
>> - Begin by reading the node coordinates and connectivity on a single core.
>> - Utilize the DMPlexCreateFromCellListPetsc() function to construct the DMPlex.
>> - Distribute the mesh across processors.
>> - Proceed to read and process the boundary conditions on each processor. If the global index of a boundary element belongs to that processor, we process it; otherwise, we skip it.
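>>
>> As a rough, untested sketch (assuming 3D tetrahedra and made-up names for the arrays produced by our reader; on ranks other than 0 the counts and arrays would be empty), the first three steps would look something like:
>>
>>     DM      dm, dmDist;
>>     PetscSF pointSF;
>>
>>     /* numCells, numVertices, cellConnectivity, vertexCoords come from our reader on rank 0 */
>>     PetscCall(DMPlexCreateFromCellListPetsc(PETSC_COMM_WORLD, 3 /* dim */, numCells, numVertices,
>>                                             4 /* vertices per tet */, PETSC_TRUE /* interpolate */,
>>                                             cellConnectivity, 3 /* coord dim */, vertexCoords, &dm));
>>     PetscCall(DMPlexDistribute(dm, 0 /* overlap */, &pointSF, &dmDist));
>>     if (dmDist) {
>>       PetscCall(DMDestroy(&dm));
>>       dm = dmDist;
>>     }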
>>
>> Additionally, I may need to reorder the mesh. In that case, I think I can use the point permutation IS obtained from DMPlexGetOrdering() while processing the boundary conditions.
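>>
>> For the reordering, I imagine something like the following (again just a sketch, I have not tried it):
>>
>>     IS perm;
>>     DM dmPerm;
>>
>>     PetscCall(DMPlexGetOrdering(dm, MATORDERINGRCM, NULL, &perm));
>>     PetscCall(DMPlexPermute(dm, perm, &dmPerm));
>>     /* keep 'perm' around to translate boundary point numbers into the permuted mesh,
>>        then ISDestroy(&perm) when done */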
>>
>> I also have another approach in mind, but I am not sure whether it is possible: read/construct the DMPlex on a single core, including the boundary conditions; store the BC-related data in a Vec or another appropriate data structure; then distribute that BC data structure along with the DMPlex.
>
> This is by far the easier approach. If you do not have meshes that are too big to load in serial, I would do
> this. Here is what you do:
>
> - Read in the mesh onto 1 process
> - Mark the boundary conditions, probably with a DMLabel
> - Make a Section over the mesh indicating what data you have for BC
> - Create a Vec from this Section and fill it with boundary values (DMCreateGlobalVector)
> - Distribute the mesh, and keep the point SF (DMPlexDistribute)
> - Create a BC SF from the points SF (PetscSFCreateSectionSF)
> - Distribute the BC values using the BC SF (PetscSFBcastBegin/End)
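>
> In rough code (an untested sketch: the BC arrays from your reader are made-up names, it uses one scalar value per marked point, it sizes the Vecs directly from the sections rather than via DMCreateGlobalVector, and it assumes a recent PETSc with PetscCall(); older versions would use CHKERRQ()):
>
>     /* assumed from the custom reader: PetscInt numBC, bcPoints[], bcIds[]; PetscScalar bcValues[] */
>     PetscSection  sec, secDist;
>     Vec           bcVec, bcVecDist;
>     PetscSF       pointSF, bcSF;
>     PetscInt     *remoteOffsets, pStart, pEnd, n, nDist;
>     PetscScalar  *vals;
>     DM            dmDist;
>
>     /* Mark boundary points with a label (labels are distributed along with the mesh) */
>     PetscCall(DMCreateLabel(dm, "bc"));
>     for (PetscInt i = 0; i < numBC; ++i) PetscCall(DMSetLabelValue(dm, "bc", bcPoints[i], bcIds[i]));
>
>     /* Section: one BC dof on each marked point */
>     PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
>     PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &sec));
>     PetscCall(PetscSectionSetChart(sec, pStart, pEnd));
>     for (PetscInt i = 0; i < numBC; ++i) PetscCall(PetscSectionSetDof(sec, bcPoints[i], 1));
>     PetscCall(PetscSectionSetUp(sec));
>
>     /* Vec sized by the section, filled with the BC values */
>     PetscCall(PetscSectionGetStorageSize(sec, &n));
>     PetscCall(VecCreateSeq(PETSC_COMM_SELF, n, &bcVec));
>     PetscCall(VecGetArray(bcVec, &vals));
>     for (PetscInt i = 0; i < numBC; ++i) {
>       PetscInt off;
>       PetscCall(PetscSectionGetOffset(sec, bcPoints[i], &off));
>       vals[off] = bcValues[i];
>     }
>     PetscCall(VecRestoreArray(bcVec, &vals));
>
>     /* Distribute the mesh and keep the point SF */
>     PetscCall(DMPlexDistribute(dm, 0, &pointSF, &dmDist));
>
>     /* Section SF from the point SF, then broadcast the BC values */
>     PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &secDist));
>     PetscCall(PetscSFDistributeSection(pointSF, sec, &remoteOffsets, secDist));
>     PetscCall(PetscSFCreateSectionSF(pointSF, sec, remoteOffsets, secDist, &bcSF));
>     PetscCall(PetscSectionGetStorageSize(secDist, &nDist));
>     PetscCall(VecCreateSeq(PETSC_COMM_SELF, nDist, &bcVecDist));
>     {
>       const PetscScalar *root;
>       PetscScalar       *leaf;
>       PetscCall(VecGetArrayRead(bcVec, &root));
>       PetscCall(VecGetArray(bcVecDist, &leaf));
>       PetscCall(PetscSFBcastBegin(bcSF, MPIU_SCALAR, root, leaf, MPI_REPLACE));
>       PetscCall(PetscSFBcastEnd(bcSF, MPIU_SCALAR, root, leaf, MPI_REPLACE));
>       PetscCall(VecRestoreArray(bcVecDist, &leaf));
>       PetscCall(VecRestoreArrayRead(bcVec, &root));
>     }
>     PetscCall(PetscFree(remoteOffsets));
>
> I believe DMPlexDistributeField() bundles these last few steps, but the explicit version shows what is going on. The offsets in secDist then tell you which values belong to which local points on each rank.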
>
> Thanks,
>
> Matt
>
>> I would greatly appreciate your thoughts and any suggestions you might have regarding this approach. Looking forward to hearing your insights.
>>
>> Best regards,
>>
>> Onur
>
> --
>
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/