[petsc-users] local/global DMPlex Vec output

Matteo Semplice matteo.semplice at uninsubria.it
Thu Jan 30 11:41:53 CST 2025


Dear Matt

On 25/10/24 01:02, Matthew Knepley wrote:
> On Thu, Oct 24, 2024 at 6:04 PM Matteo Semplice 
> <matteo.semplice at uninsubria.it> wrote:
>
>     Hi. The HDF5 solution looks good to me, but I now get this error
>
> Okay, I can make a workaround for this. Here is what is happening.
>
> When you output solutions, you really want the essential boundary 
> conditions included in the
> output, and the only way I have to do that is for you to tell me about 
> the discretization, so I
> require the DS.
>
> What I can do is ignore this step if there is no DS. Let me do that 
> and mail you the branch.

Sorry for the long delay; I am now taking this up again.

In my code I compute all fields in all cells and I have no boundary 
conditions. I tried simply calling DMCreateDS on the mesh, but the error 
does not change. I guess I need to do some minimal setup on the DS to 
get it working.

If you have some time, could you create the branch you mentioned in your 
message, or tell me the minimal steps I need to set up a "fake" DS? 
Whichever is easier for you is fine with me.

In case you need more info: in my Section I have 3 fields attached to 
cells. I also have another DM with a field attached to vertices, but the 
latter is less important and I don't really need it in the output for 
production runs.
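
For what it's worth, here is my untested guess at the minimal setup you 
might mean (writing "dm" for my actual mesh DM; whether it should be one 
3-component piecewise-constant field or three separate fields, and 
whether it interferes with the Section I already set, I am not sure):

  PetscFV  fv;
  PetscInt dim;

  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(PetscFVCreate(PetscObjectComm((PetscObject)dm), &fv));
  PetscCall(PetscFVSetNumComponents(fv, 3));        /* the 3 cell fields from my Section? */
  PetscCall(PetscFVSetSpatialDimension(fv, dim));
  PetscCall(DMAddField(dm, NULL, (PetscObject)fv)); /* the DM keeps its own reference */
  PetscCall(PetscFVDestroy(&fv));
  PetscCall(DMCreateDS(dm));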

Thanks

     Matteo

>
>   Thanks!
>
>      Matt
>
>     $ ../src/saveDemo
>     Creating mesh with (10,10) faces
>     [0]PETSC ERROR: --------------------- Error Message
>     --------------------------------------------------------------
>     [0]PETSC ERROR: Object is in wrong state
>     [0]PETSC ERROR: Need to call DMCreateDS() before calling DMGetDS()
>     [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>     [0]PETSC ERROR: Petsc Release Version 3.21.5, unknown
>     [0]PETSC ERROR: ../src/saveDemo on a  named dentdherens by matteo
>     Fri Oct 25 00:00:44 2024
>     [0]PETSC ERROR: Configure options --COPTFLAGS="-O3 -march=native -mtune=native -mavx2" --CXXOPTFLAGS="-O3 -march=native -mtune=native -mavx2" --FOPTFLAGS="-O3 -march=native -mtune=native -mavx2" --PETSC_ARCH=opt --with-strict-petscerrorcode --download-hdf5 --prefix=/home/matteo/software/petsc/3.21-opt/ --with-debugging=0 --with-gmsh --with-metis --with-parmetis --with-triangle PETSC_DIR=/home/matteo/software/petsc --force
>     [0]PETSC ERROR: #1 DMGetDS() at
>     /home/matteo/software/petsc/src/dm/interface/dm.c:5525
>     [0]PETSC ERROR: #2 DMPlexInsertBoundaryValues_Plex() at
>     /home/matteo/software/petsc/src/dm/impls/plex/plexfem.c:1136
>     [0]PETSC ERROR: #3 DMPlexInsertBoundaryValues() at
>     /home/matteo/software/petsc/src/dm/impls/plex/plexfem.c:1274
>     [0]PETSC ERROR: #4 VecView_Plex_HDF5_Internal() at
>     /home/matteo/software/petsc/src/dm/impls/plex/plexhdf5.c:477
>     [0]PETSC ERROR: #5 VecView_Plex() at
>     /home/matteo/software/petsc/src/dm/impls/plex/plex.c:656
>     [0]PETSC ERROR: #6 VecView() at
>     /home/matteo/software/petsc/src/vec/vec/interface/vector.c:806
>     [0]PETSC ERROR: #7 main() at saveDemo.cpp:123
>     [0]PETSC ERROR: No PETSc Option Table entries
>     [0]PETSC ERROR: ----------------End of Error Message -------send
>     entire error message to petsc-maint at mcs.anl.gov----------
>     --------------------------------------------------------------------------
>
>     MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
>     with errorcode 73.
>
>     NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>     You may or may not see output from other processes, depending on
>     exactly when Open MPI kills them.
>     --------------------------------------------------------------------------
>
>     I attach the modified sample code that produced the above error.
>
>     Thanks
>
>         Matteo
>
>     On 24/10/24 22:20, Matthew Knepley wrote:
>>     I just looked at the code. The VTK code is very old, and does not
>>     check for cell overlap.
>>
>>     We have been recommending that people use either HDF5 or CGNS,
>>     both of which work in this case
>>     I believe. I can fix VTK if that is what you want, but it might
>>     take me a little while as it is very busy at
>>     work right now. However, if you output HDF5, then you can run
>>
>>       ./lib/petsc/bin/petsc_gen_xdmf.py mesh.h5
>>
>>     and it will generate an XDMF file so you can load it into
>>     ParaView. Or you can output CGNS which I think
>>     ParaView understands.
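>>
>>     For example, writing HDF5 is roughly this sketch (assuming a DM
>>     "dm" and a global vector "sol"; untested):
>>
>>       PetscViewer viewer;
>>       PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer));
>>       PetscCall(DMView(dm, viewer));   /* write the mesh, needed by petsc_gen_xdmf.py */
>>       PetscCall(VecView(sol, viewer)); /* write the solution attached to the mesh */
>>       PetscCall(PetscViewerDestroy(&viewer));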
>>
>>       Thanks,
>>
>>          Matt
>>
>>     On Thu, Oct 24, 2024 at 4:02 PM Semplice Matteo
>>     <matteo.semplice at uninsubria.it> wrote:
>>
>>         Hi,
>>               I tried again today and have (re?)discovered this
>>         example
>>         https://petsc.org/release/src/dm/impls/plex/tutorials/ex14.c.html,
>>         but I cannot understand whether, in my case, I should call
>>         PetscSFCreateSectionSF
>>         (https://petsc.org/release/manualpages/PetscSF/PetscSFCreateSectionSF/)
>>         and, if so, how I should then activate the returned SF.
>>         Matteo
>>         ------------------------------------------------------------------------
>>         *From:* Semplice Matteo
>>         *Sent:* Tuesday, 22 October 2024, 00:24
>>         *To:* Matthew Knepley <knepley at gmail.com>
>>         *Cc:* PETSc <petsc-users at mcs.anl.gov>
>>         *Subject:* Re: [petsc-users] local/global DMPlex Vec output
>>
>>         Dear Matt,
>>
>>             I guess you're right: thresholding by rank==0 and rank==1
>>         in ParaView reveals that it is indeed the overlap cells that
>>         appear twice in the output.
>>
>>         The attached file is not exactly minimal but hopefully short
>>         enough. If I run it in serial, all is OK, but with
>>
>>             mpirun -np 2 ./saveDemo
>>
>>         it creates a 10x10 grid (100 cells), yet I get an "output.vtu"
>>         with a total of 120 cells, presumably because each rank also
>>         writes its 10 overlap cells. However, the pointSF of the DMPlex
>>         seems correct.
>>
>>         Thanks
>>
>>             Matteo
>>
>>         On 21/10/24 19:15, Matthew Knepley wrote:
>>>         On Mon, Oct 21, 2024 at 12:22 PM Matteo Semplice via
>>>         petsc-users <petsc-users at mcs.anl.gov> wrote:
>>>
>>>             Dear petsc-users,
>>>
>>>                 I am having issues with the output of parallel data
>>>             attached to a DMPlex (or maybe more fundamental issues
>>>             with DMPlex...).
>>>
>>>             So I currently:
>>>
>>>              1. create a DMPlex (DMPlexCreateGmshFromFile or DMPlexCreateBoxMesh)
>>>              2. partition it
>>>              3. create a section for my data layout with DMPlexCreateSection(ctx.dmMesh, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &sUavg)
>>>              4. DMSetLocalSection(ctx.dmMesh, sUavg)
>>>              5. create solLoc and solGlob vectors with DMCreateGlobalVector and DMCreateLocalVector
>>>              6. solve ...
>>>              7. VecView(ctx.solGlob, vtkViewer) on a .vtu file
>>>
>>>             but when I load the data in ParaView I get more cells than
>>>             expected; it is as if the cells in the halo are written
>>>             twice to the output. (I could create an MWE if the above is
>>>             not clear; a rough code sketch follows.)
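>>>
>>>             In code, steps 2-7 are roughly this untested sketch (error
>>>             checking trimmed; writing "dm" for ctx.dmMesh; numComp,
>>>             numDof and numBC are the layout arrays/sizes from my code):
>>>
>>>                 DM           dmDist = NULL;
>>>                 PetscSection sUavg;
>>>                 Vec          solGlob, solLoc;
>>>                 PetscViewer  vtkViewer;
>>>
>>>                 /* 2. partition, keeping one layer of overlap cells */
>>>                 PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));
>>>                 if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }
>>>
>>>                 /* 3.-4. data layout */
>>>                 PetscCall(DMPlexCreateSection(dm, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &sUavg));
>>>                 PetscCall(DMSetLocalSection(dm, sUavg));
>>>                 PetscCall(PetscSectionDestroy(&sUavg));
>>>
>>>                 /* 5. global and local vectors */
>>>                 PetscCall(DMCreateGlobalVector(dm, &solGlob));
>>>                 PetscCall(DMCreateLocalVector(dm, &solLoc));
>>>
>>>                 /* 6. solve ... */
>>>
>>>                 /* 7. write the global vector to a .vtu file */
>>>                 PetscCall(PetscViewerVTKOpen(PETSC_COMM_WORLD, "output.vtu", FILE_MODE_WRITE, &vtkViewer));
>>>                 PetscCall(VecView(solGlob, vtkViewer));
>>>                 PetscCall(PetscViewerDestroy(&vtkViewer));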
>>>
>>>         I think we need an MWE here, because from the explanation
>>>         above, it should work.
>>>
>>>         However, I can try to guess the problem. When you partition
>>>         the mesh, I am guessing that you have cells in the overlap.
>>>         These cells
>>>         must be in the point SF in order for the global section to
>>>         give them a unique owner. Perhaps something has gone wrong here.
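>>>
>>>         A quick way to check is to print the point SF, e.g. (sketch):
>>>
>>>           PetscSF pointSF;
>>>           PetscCall(DMGetPointSF(dm, &pointSF));
>>>           PetscCall(PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD));
>>>
>>>         Each overlap cell should show up as a leaf pointing to its
>>>         owner rank.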
>>>
>>>           Thanks,
>>>
>>>              Matt
>>>
>>>             I guess that the culprit is point (4), but if I replace
>>>             it with DMSetGlobalSection then I cannot create the
>>>             local vector at point (5).
>>>
>>>             How should I handle this properly? In my code I need to
>>>             create both local and global vectors, to perform at
>>>             least GlobalToLocal and to save the global data.
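>>>
>>>             By GlobalToLocal I mean the usual pair, roughly:
>>>
>>>               PetscCall(DMGlobalToLocalBegin(dm, solGlob, INSERT_VALUES, solLoc));
>>>               PetscCall(DMGlobalToLocalEnd(dm, solGlob, INSERT_VALUES, solLoc));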
>>>
>>>             (On a side note, I also tried HDF5, but then it complains
>>>             about the DM not having a DS...; really, any working
>>>             solution that allows the data to be explored with ParaView
>>>             is fine.)
>>>
>>>             Thanks for any advice!
>>>
>>>             Matteo Semplice
>>>
>>>
>>>
>>>         -- 
>>>         What most experimenters take for granted before they begin
>>>         their experiments is infinitely more interesting than any
>>>         results to which their experiments lead.
>>>         -- Norbert Wiener
>>>
>>>         https://www.cse.buffalo.edu/~knepley/
>>
>>         -- 
>>         ---
>>         Associate Professor in Numerical Analysis
>>         Dipartimento di Scienza e Alta Tecnologia
>>         Università degli Studi dell'Insubria
>>         Via Valleggio, 11 - Como
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>     -- 
>     ---
>     Associate Professor in Numerical Analysis
>     Dipartimento di Scienza e Alta Tecnologia
>     Università degli Studi dell'Insubria
>     Via Valleggio, 11 - Como
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

-- 
---
Associate Professor in Numerical Analysis
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como

