[petsc-users] Re: local/global DMPlex Vec output
Semplice Matteo
matteo.semplice at uninsubria.it
Thu Oct 24 15:02:10 CDT 2024
Hi,
I tried again today and have (re?)discovered the example https://petsc.org/release/src/dm/impls/plex/tutorials/ex14.c.html, but I cannot understand whether in my case I should call PetscSFCreateSectionSF (https://petsc.org/release/manualpages/PetscSF/PetscSFCreateSectionSF/) and, if so, how I should then activate the returned SF.
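For reference, this is the call sequence I would guess from ex14.c and the manual pages (just an untested sketch; in particular, DMSetSectionSF is only my guess at how to "activate" the new SF):

PetscSF      pointSF, sectionSF;
PetscSection locSec, globSec;
PetscInt    *remoteOffsets;

PetscCall(DMGetPointSF(dm, &pointSF));
PetscCall(DMGetLocalSection(dm, &locSec));
PetscCall(DMGetGlobalSection(dm, &globSec));
/* offsets of the owning (root) dofs for every leaf point */
PetscCall(PetscSFCreateRemoteOffsets(pointSF, globSec, locSec, &remoteOffsets));
/* SF relating global dofs (roots) to local dofs (leaves) instead of points */
PetscCall(PetscSFCreateSectionSF(pointSF, globSec, remoteOffsets, locSec, &sectionSF));
PetscCall(PetscFree(remoteOffsets));
PetscCall(DMSetSectionSF(dm, sectionSF));   /* is this the right way to activate it? */
PetscCall(PetscSFDestroy(&sectionSF));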
Matteo
________________________________
From: Semplice Matteo
Sent: Tuesday, 22 October 2024 00:24
To: Matthew Knepley <knepley at gmail.com>
Cc: PETSc <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] local/global DMPlex Vec output
Dear Matt,
I guess you're right: thresholding by rank==0 and rank==1 in ParaView reveals that it is indeed the overlap cells that appear twice in the output.
The attached file is not exactly minimal but hopefully short enough. If I run it in serial, all is ok, but with
mpirun -np 2 ./saveDemo
it creates a 10x10 grid (100 cells), but the resulting "output.vtu" contains 120 cells in total. However, the pointSF of the DMPlex seems correct.
Thanks
Matteo
On 21/10/24 19:15, Matthew Knepley wrote:
On Mon, Oct 21, 2024 at 12:22 PM Matteo Semplice via petsc-users <petsc-users at mcs.anl.gov> wrote:
Dear petsc-users,
I am having issues with output of parallel data attached to a DMPlex (or maybe more fundamental ones about DMPlex...).
So I currently
1. create a DMPlex (DMPlexCreateGmshFromFile or DMPlexCreateBoxMesh)
2. partition it
3. create a section for my data layout with DMPlexCreateSection(ctx.dmMesh, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &sUavg)
4. DMSetLocalSection(ctx.dmMesh, sUavg)
5. create solGlob and solLoc vectors with DMCreateGlobalVector and DMCreateLocalVector
6. solve ....
7. VecView(ctx.solGlob, vtkViewer) on a .vtu file
but when I load the data in ParaView I get more cells than expected, as if the cells in the halo were written twice to the output. (I can create a MWE if the above is not clear; a condensed sketch of the steps is below.)
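In code, the sequence is essentially the following (a condensed sketch rather than my actual program: I build the mesh from options here to keep it short, and the dim=2, one cell-centred field layout is just an example):

DM           dm, dmDist;
PetscSection sUavg;
Vec          solGlob, solLoc;
PetscViewer  vtkViewer;
PetscInt     numComp[1] = {1};       /* one field, one component          */
PetscInt     numDof[3]  = {0, 0, 1}; /* 2D: dofs per vertex/edge/cell     */

/* (1) create the DMPlex; here from options instead of
       DMPlexCreateGmshFromFile / DMPlexCreateBoxMesh */
PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
PetscCall(DMSetType(dm, DMPLEX));
PetscCall(DMSetFromOptions(dm));
/* (2) partition, with one layer of overlap cells */
PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));
if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }
/* (3)-(4) data layout */
PetscCall(DMPlexCreateSection(dm, NULL, numComp, numDof, 0, NULL, NULL, NULL, NULL, &sUavg));
PetscCall(DMSetLocalSection(dm, sUavg));
/* (5) vectors */
PetscCall(DMCreateGlobalVector(dm, &solGlob));
PetscCall(DMCreateLocalVector(dm, &solLoc));
/* (6) solve ... */
/* (7) output */
PetscCall(PetscViewerVTKOpen(PETSC_COMM_WORLD, "output.vtu", FILE_MODE_WRITE, &vtkViewer));
PetscCall(VecView(solGlob, vtkViewer));
PetscCall(PetscViewerDestroy(&vtkViewer));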
I think we need an MWE here, because from the explanation above, it should work.
However, I can try to guess the problem. When you partition the mesh, I am guessing that you have cells in the overlap. These cells
must be in the point SF in order for the global section to give them a unique owner. Perhaps something has gone wrong here.
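You can check this by viewing the point SF and the global section: overlap cells owned by another rank should show up as leaves in the SF and get negative (remote) offsets in the global section. Something along these lines (untested) should show it:

PetscSF      pointSF;
PetscSection globSec;

PetscCall(DMGetPointSF(dm, &pointSF));
PetscCall(PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD));
PetscCall(DMGetGlobalSection(dm, &globSec));
PetscCall(PetscSectionView(globSec, PETSC_VIEWER_STDOUT_WORLD));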
Thanks,
Matt
I guess that the culprit is point (4), but if I replace it with DMSetGlobalSection then I cannot create the local vector at point (5).
How should I handle this properly? In my code I need both local and global vectors, at least to perform GlobalToLocal and to save the global data; the pattern I am using is sketched below.
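Concretely, this is what I am trying to do (a sketch; ctx.solLoc and the InsertMode are from my code and may well be the wrong choice):

/* set only the local section and let PETSc derive the global one */
PetscCall(DMSetLocalSection(ctx.dmMesh, sUavg));
PetscCall(DMCreateGlobalVector(ctx.dmMesh, &ctx.solGlob));
PetscCall(DMCreateLocalVector(ctx.dmMesh, &ctx.solLoc));
/* fill the halo of the local vector from the global one */
PetscCall(DMGlobalToLocalBegin(ctx.dmMesh, ctx.solGlob, INSERT_VALUES, ctx.solLoc));
PetscCall(DMGlobalToLocalEnd(ctx.dmMesh, ctx.solGlob, INSERT_VALUES, ctx.solLoc));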
(On a side note, I also tried HDF5, but then it complains about the DM not having a DS...; really, any working solution that allows the data to be explored with ParaView is fine.)
Thanks for any advice!
Matteo Semplice
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
--
---
Associate Professor in Numerical Analysis
Department of Science and High Technology
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como