[petsc-users] Getting a vector from a DM to output VTK

Nicholas Arnold-Medabalimi narnoldm at umich.edu
Mon Dec 26 13:37:26 CST 2022


Hi

Thanks so much for all your help. I've gotten all the core functionality
working in C++ and am now working on the Fortran integration. Before I do
that, I've been running memory checks with Valgrind, since I've been seeing
intermittent memory-corruption errors at certain mesh sizes. I'm getting an
invalid-write error associated with a DMCreateGlobalVector call. I presume
something small in my section assignment is causing it, but I would
appreciate any insight. The code below is more or less taken directly from
Plex tutorial example 7.

PetscErrorCode addSectionToDM(DM &dm, Vec &state)
{
    /* PetscInt, not int: with a 64-bit-index build of PETSc, passing int*
       where PetscInt* is expected corrupts memory. */
    PetscInt p0, p1, c0, c1;
    DMPlexGetHeightStratum(dm, 0, &c0, &c1); /* cells */

    DMPlexGetChart(dm, &p0, &p1);
    PetscSection section_full;
    PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
    PetscSectionSetNumFields(section_full, 2);
    PetscSectionSetChart(section_full, p0, p1);
    PetscSectionSetFieldName(section_full, 0, "Pressure");
    PetscSectionSetFieldName(section_full, 1, "Temperature");

    /* Two dofs per cell: one for each field */
    for (PetscInt i = c0; i < c1; i++)
    {
        PetscSectionSetDof(section_full, i, 2);
        PetscSectionSetFieldDof(section_full, i, 0, 1);
        PetscSectionSetFieldDof(section_full, i, 1, 1);
    }
    PetscSectionSetUp(section_full);
    DMSetNumFields(dm, 2);
    DMSetLocalSection(dm, section_full);
    PetscSectionDestroy(&section_full);
    DMCreateGlobalVector(dm, &state);

    return 0;
}

Running this under Valgrind results in:
==12603== Invalid write of size 8
==12603==    at 0x10CD8B: main (redistribute_filter.cpp:254)
==12603==  Address 0xb9fe800 is 4,816 bytes inside a block of size 4,820 alloc'd
==12603==    at 0x483E0F0: memalign (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==12603==    by 0x483E212: posix_memalign (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==12603==    by 0x4C4DAB0: PetscMallocAlign (mal.c:54)
==12603==    by 0x4C5262F: PetscTrMallocDefault (mtr.c:186)
==12603==    by 0x4C501F7: PetscMallocA (mal.c:420)
==12603==    by 0x527E8A9: VecCreate_MPI_Private (pbvec.c:485)
==12603==    by 0x527F04F: VecCreate_MPI (pbvec.c:523)
==12603==    by 0x53E7097: VecSetType (vecreg.c:89)
==12603==    by 0x527FBC8: VecCreate_Standard (pbvec.c:547)
==12603==    by 0x53E7097: VecSetType (vecreg.c:89)
==12603==    by 0x6CD77C0: DMCreateGlobalVector_Section_Private (dmi.c:58)
==12603==    by 0x61D52DB: DMCreateGlobalVector_Plex (plexcreate.c:4130)
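
For reference, the flagged write is at redistribute_filter.cpp:254 in main,
not inside addSectionToDM itself. Line 254 isn't shown here, but a
bounds-checked version of the kind of fill loop that typically follows this
setup looks like the sketch below (the loop body is an assumption, not the
actual code at that line):

    /* Sketch: fill the two cell dofs through the local section, then
       scatter to the global vector. Out-of-range offsets are skipped
       instead of silently corrupting memory. */
    Vec          local;
    PetscSection s;
    PetscScalar *array;
    PetscInt     n, off, c0, c1;

    DMGetLocalSection(dm, &s);
    DMPlexGetHeightStratum(dm, 0, &c0, &c1);
    DMGetLocalVector(dm, &local);
    VecGetLocalSize(local, &n);
    VecGetArray(local, &array);
    for (PetscInt c = c0; c < c1; c++)
    {
        PetscSectionGetOffset(s, c, &off);
        if (off < 0 || off + 1 >= n) continue; /* bounds check */
        array[off]     = 101325.0; /* Pressure */
        array[off + 1] = 300.0;    /* Temperature */
    }
    VecRestoreArray(local, &array);
    DMLocalToGlobal(dm, local, INSERT_VALUES, state);
    DMRestoreLocalVector(dm, &local);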


Sincerely
Nicholas



On Mon, Dec 26, 2022 at 10:52 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Dec 26, 2022 at 10:49 AM Nicholas Arnold-Medabalimi <
> narnoldm at umich.edu> wrote:
>
>> Oh, it's not a worry. I'm debugging this first in C++, and once it's
>> working I don't actually need to view what's happening in Fortran when I
>> move over. In my C debugging code, after I create the distribution vector
>> and distribute the field based on your input, I'm adding
>>
>> VecSetOperation(state_dist, VECOP_VIEW, (void(*)(void))VecView_Plex);
>>
>> But it won't compile because VecView_Plex is undefined. Thanks
>>
>
> You need
>
> #include <petsc/private/dmpleximpl.h>
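>
> With that include in place, the override from your snippet compiles; a
> minimal sketch of the pair of calls (viewer is assumed to be the VTK
> viewer from your earlier snippet):
>
>     VecSetOperation(state_dist, VECOP_VIEW, (void (*)(void))VecView_Plex);
>     VecView(state_dist, viewer); /* now dispatches through the Plex viewer */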
>
>   Thanks
>
>      Matt
>
>
>> Sincerely
>> Nicholas
>>
>>
>>
>> On Mon, Dec 26, 2022 at 10:45 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Mon, Dec 26, 2022 at 10:40 AM Nicholas Arnold-Medabalimi <
>>> narnoldm at umich.edu> wrote:
>>>
>>>> Hi Matt
>>>>
>>>> 1) I'm not sure I follow how to call this. If I insert the
>>>> VecSetOperation call, I'm not exactly sure what VecView_Plex is or where
>>>> it is defined.
>>>>
>>>
>>> Shoot, this cannot be done in Fortran. I will rewrite this step to get
>>> it working for you. It should have been done anyway. I cannot do it
>>> today since I have to grade finals, but I should have it this week.
>>>
>>>   Thanks,
>>>
>>>      Matt
>>>
>>>
>>>> 2) Otherwise, I've solved this problem with the insight you provided
>>>> into the local section. Things look good in the ASCII output, so if we
>>>> can resolve (1), then I think the loop is fully closed and I can just
>>>> worry about the Fortran translation.
>>>>
>>>> Thanks again for all your help.
>>>>
>>>> Sincerely
>>>> Nicholas
>>>>
>>>>
>>>>
>>>> On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley <knepley at gmail.com>
>>>> wrote:
>>>>
>>>>> On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
>>>>> narnoldm at umich.edu> wrote:
>>>>>
>>>>>> Hi Matt
>>>>>>
>>>>>> I was able to get this all squared away. It turns out I was
>>>>>> initializing the viewer incorrectly—my mistake. However, there is a
>>>>>> follow-up question. A while back, we discussed distributing a vector field
>>>>>> from an initial DM to a new distributed DM. The way you said to do this was
>>>>>>
>>>>>>     // Distribute the submesh with overlap of 1
>>>>>>     DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
>>>>>>     //Create a vector and section for the distribution
>>>>>>     VecCreate(PETSC_COMM_WORLD, &state_dist);
>>>>>>     VecSetDM(state_dist, sub_da_dist);
>>>>>>     PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
>>>>>>     DMSetLocalSection(sub_da_dist, distSection);
>>>>>>     DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
>>>>>>                           state_filtered, distSection, state_dist);
>>>>>>
>>>>>> I've forgone Fortran to debug all of this in C; I'll then integrate the
>>>>>> function calls into the Fortran code.
>>>>>>
>>>>>> There are two questions here.
>>>>>>
>>>>>> 1) How do I get a vector associated with a DM via VecSetDM to output
>>>>>> properly as VTK? At present, calling VecView on state_dist does not
>>>>>> output anything.
>>>>>>
>>>>>
>>>>> This is a problem. The different pieces of interface were added at
>>>>> different times. We should really move that manipulation of the function
>>>>> table into VecSetDM(). Here is the code:
>>>>>
>>>>>
>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135
>>>>>
>>>>> You can make the overload call yourself for now, until we decide on
>>>>> the best fix.
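>>>>>
>>>>> Paraphrasing from memory (check the link for the authoritative
>>>>> version), the linked code attaches the Plex view/load routines right
>>>>> after creating the vector, roughly:
>>>>>
>>>>>     VecSetOperation(*vec, VECOP_VIEW, (void (*)(void))VecView_Plex);
>>>>>     VecSetOperation(*vec, VECOP_LOAD, (void (*)(void))VecLoad_Plex);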
>>>>>
>>>>>
>>>>>> 2) The visualization is nice, but when I print the distributed field
>>>>>> to stdout, something isn't distributing correctly: the vector still
>>>>>> contains uninitialized values. This is apparent when I output the
>>>>>> original vector next to the distributed one. Examining the inside of
>>>>>> DMPlexDistributeField, I suspect I'm making a mistake with the sections
>>>>>> I'm passing. filteredSection in this case is the global section, but if
>>>>>> I try the local section I get an error, so I'm not sure.
>>>>>>
>>>>>
>>>>> These should definitely be local sections. Global sections are always
>>>>> built after the fact, and building the global section needs the SF that
>>>>> indicates which points are shared, not the distribution SF that moves
>>>>> points. I need to go back and put in checks that all the arguments are
>>>>> the right type. Thanks for bringing that up. Let's track down the error
>>>>> for local sections.
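>>>>>
>>>>> A minimal sketch of that call sequence with local sections on both
>>>>> sides (names reused from your snippet; pulling the original section
>>>>> from DMGetLocalSection is an assumption about your setup):
>>>>>
>>>>>     PetscSection localSection, distSection;
>>>>>     DMGetLocalSection(sub_da, &localSection); /* local, not global */
>>>>>     PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
>>>>>     VecCreate(PETSC_COMM_WORLD, &state_dist);
>>>>>     VecSetDM(state_dist, sub_da_dist);
>>>>>     DMPlexDistributeField(sub_da_dist, distributionSF, localSection,
>>>>>                           state_filtered, distSection, state_dist);
>>>>>     DMSetLocalSection(sub_da_dist, distSection);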
>>>>>
>>>>>   Matt
>>>>>
>>>>>
>>>>>> *Original Vector (state_filtered)*
>>>>>> Vec Object: Vec_0x84000004_1 2 MPI processes
>>>>>>   type: mpi
>>>>>> Process [0]
>>>>>> 101325.
>>>>>> 300.
>>>>>> 101326.
>>>>>> 301.
>>>>>> 101341.
>>>>>> 316.
>>>>>> Process [1]
>>>>>> 101325.
>>>>>> 300.
>>>>>> 101326.
>>>>>> 301.
>>>>>> 101345.
>>>>>> 320.
>>>>>> 101497.
>>>>>> 472.
>>>>>> 101516.
>>>>>> 491.
>>>>>> *Re-Distributed Vector (state_dist)*
>>>>>> Vec Object: 2 MPI processes
>>>>>>   type: mpi
>>>>>> Process [0]
>>>>>> 101325.
>>>>>> 300.
>>>>>> 101326.
>>>>>> 301.
>>>>>> 101341.
>>>>>> 316.
>>>>>> 7.90505e-323
>>>>>> 1.97626e-323
>>>>>> 4.30765e-312
>>>>>> 6.91179e-310
>>>>>> Process [1]
>>>>>> 101497.
>>>>>> 472.
>>>>>> 101516.
>>>>>> 491.
>>>>>> 1.99665e-314
>>>>>> 8.14714e-321
>>>>>>
>>>>>>
>>>>>> Any insight on distributing this field and correcting the error would
>>>>>> be appreciated.
>>>>>>
>>>>>> Sincerely, and Happy Holidays
>>>>>> Nicholas
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley <knepley at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> On Thu, Dec 22, 2022 at 12:41 AM Nicholas Arnold-Medabalimi <
>>>>>>> narnoldm at umich.edu> wrote:
>>>>>>>
>>>>>>>> Hi PETSc Users
>>>>>>>>
>>>>>>>> I've been having trouble consistently getting a vector generated
>>>>>>>> from a DM to output to VTK correctly. I've used ex1.c (which works
>>>>>>>> properly) to try to figure it out, but I'm still having some issues.
>>>>>>>> I must be missing something small that isn't correctly associating
>>>>>>>> the section with the DM.
>>>>>>>>
>>>>>>>>     PetscInt p0, p1, c0, c1;
>>>>>>>>     DMPlexGetChart(dm, &p0, &p1);
>>>>>>>>     DMPlexGetHeightStratum(dm, 0, &c0, &c1); /* cells */
>>>>>>>>     PetscSection section_full;
>>>>>>>>     PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
>>>>>>>>     PetscSectionSetNumFields(section_full, 1);
>>>>>>>>     PetscSectionSetChart(section_full, p0, p1);
>>>>>>>>     PetscSectionSetFieldName(section_full, 0, "state");
>>>>>>>>
>>>>>>>>     for (PetscInt i = c0; i < c1; i++)
>>>>>>>>     {
>>>>>>>>         PetscSectionSetDof(section_full, i, 1);
>>>>>>>>         PetscSectionSetFieldDof(section_full, i, 0, 1);
>>>>>>>>     }
>>>>>>>>     PetscSectionSetUp(section_full);
>>>>>>>>     DMSetNumFields(dm, 1);
>>>>>>>>     DMSetLocalSection(dm, section_full);
>>>>>>>>     DMCreateGlobalVector(dm, &state_full);
>>>>>>>>
>>>>>>>>     PetscInt o0, o1;
>>>>>>>>     VecGetOwnershipRange(state_full, &o0, &o1);
>>>>>>>>     PetscScalar *state_full_array;
>>>>>>>>     VecGetArray(state_full, &state_full_array);
>>>>>>>>
>>>>>>>>     /* offsets come from the section, indexed by cell point */
>>>>>>>>     for (PetscInt i = c0; i < c1; i++)
>>>>>>>>     {
>>>>>>>>         PetscInt offset;
>>>>>>>>         PetscSectionGetOffset(section_full, i, &offset);
>>>>>>>>         state_full_array[offset] = 101325 + (i - c0);
>>>>>>>>     }
>>>>>>>>
>>>>>>>>     VecRestoreArray(state_full, &state_full_array);
>>>>>>>>
>>>>>>>>     PetscViewer viewer;
>>>>>>>>     PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
>>>>>>>>     PetscViewerSetType(viewer, PETSCVIEWERVTK);
>>>>>>>>     PetscViewerFileSetMode(viewer, FILE_MODE_WRITE);
>>>>>>>>     PetscViewerFileSetName(viewer, "mesh.vtu");
>>>>>>>>     VecView(state_full, viewer);
>>>>>>>>
>>>>>>>> If I run this, mesh.vtu isn't generated at all. If I instead do a
>>>>>>>> DMView passing the DM, it outputs just the mesh, correctly.
>>>>>>>>
>>>>>>>> Any assistance would be greatly appreciated.
>>>>>>>>
>>>>>>>
>>>>>>> DMCreateGlobalVector() dispatches to DMCreateGlobalVector_Plex(),
>>>>>>> which resets the view method to VecView_Plex(), which should dispatch to
>>>>>>> VecView_Plex_Local_VTK(). You can verify this in the debugger, or send us
>>>>>>> code we can run to verify it.
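>>>>>>>
>>>>>>> As an aside, the PetscViewerVTKOpen convenience wrapper collapses the
>>>>>>> four viewer calls in your snippet into one, and the VTK viewer only
>>>>>>> writes the file when it is flushed or destroyed:
>>>>>>>
>>>>>>>     PetscViewer viewer;
>>>>>>>     PetscViewerVTKOpen(PETSC_COMM_WORLD, "mesh.vtu", FILE_MODE_WRITE, &viewer);
>>>>>>>     VecView(state_full, viewer);
>>>>>>>     PetscViewerDestroy(&viewer); /* mesh.vtu is written here */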
>>>>>>>
>>>>>>>   Thanks,
>>>>>>>
>>>>>>>     Matt
>>>>>>>
>>>>>>>
>>>>>>>> Sincerely
>>>>>>>> Nicholas
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Nicholas Arnold-Medabalimi
>>>>>>>>
>>>>>>>> Ph.D. Candidate
>>>>>>>> Computational Aeroscience Lab
>>>>>>>> University of Michigan
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> What most experimenters take for granted before they begin their
>>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>>> experiments lead.
>>>>>>> -- Norbert Wiener
>>>>>>>
>>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Nicholas Arnold-Medabalimi
>>>>>>
>>>>>> Ph.D. Candidate
>>>>>> Computational Aeroscience Lab
>>>>>> University of Michigan
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>
>>>>
>>>>
>>>> --
>>>> Nicholas Arnold-Medabalimi
>>>>
>>>> Ph.D. Candidate
>>>> Computational Aeroscience Lab
>>>> University of Michigan
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>>
>> --
>> Nicholas Arnold-Medabalimi
>>
>> Ph.D. Candidate
>> Computational Aeroscience Lab
>> University of Michigan
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
Nicholas Arnold-Medabalimi

Ph.D. Candidate
Computational Aeroscience Lab
University of Michigan