[petsc-users] Is it possible to keep track of original elements # after a call to DMPlexDistribute ?

Eric Chamberland Eric.Chamberland at giref.ulaval.ca
Tue Nov 2 21:27:26 CDT 2021


Hi Matthew,

yes, it all works now!

I modified the example to do a NaturalToGlobal followed by a 
GlobalToNatural and to test whether we get the expected result...

The test would just need to "force" the computed partition to a given 
result; then we could check the result of the NaturalToGlobal 
independently of the partitioner...
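
In case it is useful, the round-trip check is essentially the following 
(a minimal sketch, not the exact ex44.c: it assumes the DM "lDM" was 
created with DMSetUseNatural() and a cell-based section set before 
DMPlexDistribute(), and that "lNatVec" already holds our element numbers 
in the natural ordering; error checking omitted):

===========

  Vec       lGlobalVec, lNatVecRoundTrip;
  PetscBool lIsSame;

  DMGetGlobalVector(lDM, &lGlobalVec);
  VecDuplicate(lNatVec, &lNatVecRoundTrip);

  // natural -> global
  DMPlexNaturalToGlobalBegin(lDM, lNatVec, lGlobalVec);
  DMPlexNaturalToGlobalEnd(lDM, lNatVec, lGlobalVec);

  // global -> natural: should give back exactly lNatVec
  DMPlexGlobalToNaturalBegin(lDM, lGlobalVec, lNatVecRoundTrip);
  DMPlexGlobalToNaturalEnd(lDM, lGlobalVec, lNatVecRoundTrip);

  VecEqual(lNatVec, lNatVecRoundTrip, &lIsSame);
  PetscPrintf(PETSC_COMM_WORLD, "Round trip %s\n", lIsSame ? "OK" : "FAILED");

  VecDestroy(&lNatVecRoundTrip);
  DMRestoreGlobalVector(lDM, &lGlobalVec);

===========

Forcing the partition (e.g. with -petscpartitioner_type simple, or a 
shell partitioner set up with PetscPartitionerShellSetPartition()) would 
then make the expected content of the global vector deterministic.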

Thanks a lot!

Eric


On 2021-11-02 5:16 a.m., Matthew Knepley wrote:
> On Mon, Nov 1, 2021 at 5:24 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>
>     Hi Matthew,
>
>     ok, good news then! :)
>
>     Meanwhile, I tested it the "other way" with
>     DMPlexGlobalToNaturalBegin/End and it does not give a sensible
>     result either.
>
>     Perhaps it does something like "GlobalToSequential" too?
>
> I am not sure how it worked before, but I think I have it fixed. I get 
> the output you expect for the test. I force-pushed the branch
> so you will have to check it out again.
>
>   Thanks!
>
>      Matt
>
>     Thanks!
>
>     Eric
>
>
>     On 2021-11-01 3:25 p.m., Matthew Knepley wrote:
>>     On Sun, Oct 31, 2021 at 10:07 AM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>
>>         Hi Matthew,
>>
>>         we do not know if DMPlexNaturalToGlobalBegin/End is buggy or
>>         if it is our understanding of what it should do that is wrong...
>>
>>         Could you just check whether what we try to do from lines 313 to
>>         356 is right or wrong?
>>
>>         The expected result is that the global vector "lGlobalVec"
>>         contains the element numbers from the initial partition that
>>         have been put into "lNatVec".
>>
>>         Thanks a lot for any insights!!
>>
>>     Okay, I have found the problem. I confess to not actually looking
>>     very closely at the implementation before. GlobalToNatural was
>>     assuming that the "natural" state was also sequential, and looking
>>     at it, all the existing tests use a seq-to-parallel distribution.
>>     I need to fix it for a parallel natural state. Hopefully it will
>>     not take me long.
>>
>>       Thanks,
>>
>>          Matt
>>
>>         Eric
>>
>>
>>         On 2021-10-27 2:32 p.m., Eric Chamberland wrote:
>>>
>>>         Hi Matthew,
>>>
>>>         we continued the example.  Now it must be our misuse of
>>>         PETSc that produced the wrong result.
>>>
>>>         As stated in the code:
>>>
>>>           // The call to DMPlexNaturalToGlobalBegin/End does not produce our expected result...
>>>           // In lGlobalVec, we expect to have:
>>>           /*
>>>            * Process [0]
>>>            * 2.
>>>            * 4.
>>>            * 8.
>>>            * 3.
>>>            * 9.
>>>            * Process [1]
>>>            * 1.
>>>            * 5.
>>>            * 7.
>>>            * 0.
>>>            * 6.
>>>            *
>>>            * but we obtained:
>>>            *
>>>            * Process [0]
>>>            * 2.
>>>            * 4.
>>>            * 8.
>>>            * 0.
>>>            * 0.
>>>            * Process [1]
>>>            * 0.
>>>            * 0.
>>>            * 0.
>>>            * 0.
>>>            * 0.
>>>            */
>>>
>>>         (see attached ex44.c)
>>>
>>>         Thanks,
>>>
>>>         Eric
>>>
>>>         On 2021-10-27 1:25 p.m., Eric Chamberland wrote:
>>>>
>>>>         Great!
>>>>
>>>>         Thanks Matthew, it is working for me up to that point!
>>>>
>>>>         We are continuing with ex44.c and will forward it to you at the
>>>>         next blocking point...
>>>>
>>>>         Eric
>>>>
>>>>         On 2021-10-27 11:14 a.m., Matthew Knepley wrote:
>>>>>         On Wed, Oct 27, 2021 at 8:29 AM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>
>>>>>             Hi Matthew,
>>>>>
>>>>>             the smallest mesh which crashes the code is a 2x5 mesh:
>>>>>
>>>>>             See the modified ex44.c
>>>>>
>>>>>             With smaller meshes (2x2, 2x4, etc.), it passes...  But
>>>>>             it fails later when I try to use
>>>>>             DMPlexNaturalToGlobalBegin; let's keep that other
>>>>>             problem for later...
>>>>>
>>>>>             Thanks a lot for helping dig into this! :)
>>>>>
>>>>>         I have made a small fix in this branch
>>>>>
>>>>>         https://gitlab.com/petsc/petsc/-/commits/knepley/fix-plex-g2n
>>>>>
>>>>>         It seems to run for me. Can you check it?
>>>>>
>>>>>           Thanks,
>>>>>
>>>>>              Matt
>>>>>
>>>>>             Eric
>>>>>
>>>>>             (sorry if you received this a 2nd time, I am having
>>>>>             trouble with my mail)
>>>>>
>>>>>             On 2021-10-26 4:35 p.m., Matthew Knepley wrote:
>>>>>>             On Tue, Oct 26, 2021 at 1:35 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>>
>>>>>>                 Here is a screenshot of the partition I
>>>>>>                 hard-coded (top) and the vertex/element numbers (bottom):
>>>>>>
>>>>>>                 I have not yet modified the ex44.c example to
>>>>>>                 properly assign the coordinates...
>>>>>>
>>>>>>                 (but I would not have done it the way it is in the
>>>>>>                 last version, because the sCoords array is the
>>>>>>                 global array with global vertex numbers)
>>>>>>
>>>>>>                 I will have time to do this tomorrow...
>>>>>>
>>>>>>                 Maybe I can first try to reproduce all this with
>>>>>>                 a smaller mesh?
>>>>>>
>>>>>>
>>>>>>             That might make it easier to find a problem.
>>>>>>
>>>>>>               Thanks!
>>>>>>
>>>>>>                  Matt
>>>>>>
>>>>>>                 Eric
>>>>>>
>>>>>>                 On 2021-10-26 9:46 a.m., Matthew Knepley wrote:
>>>>>>>                 Okay, I ran it. Something seems off with the
>>>>>>>                 mesh. First, I cannot simply explain the
>>>>>>>                 partition. The number of shared vertices and edges
>>>>>>>                 does not seem to come from a straight cut.
>>>>>>>                 Second, the mesh looks scrambled on output.
>>>>>>>
>>>>>>>                   Thanks,
>>>>>>>
>>>>>>>                     Matt
>>>>>>>
>>>>>>>                 On Sun, Oct 24, 2021 at 11:49 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>>>
>>>>>>>                     Hi Matthew,
>>>>>>>
>>>>>>>                     ok, I started back from your ex44.c example
>>>>>>>                     and added the global array of coordinates. I
>>>>>>>                     just have to code the creation of the local
>>>>>>>                     coordinates now.
>>>>>>>
>>>>>>>                     Eric
>>>>>>>
>>>>>>>                     On 2021-10-20 6:55 p.m., Matthew Knepley wrote:
>>>>>>>>                     On Wed, Oct 20, 2021 at 3:06 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>>>>
>>>>>>>>                         Hi Matthew,
>>>>>>>>
>>>>>>>>                         we tried to reproduce the error in a
>>>>>>>>                         simple example.
>>>>>>>>
>>>>>>>>                         The context is the following: we
>>>>>>>>                         hard-coded the mesh and initial partition
>>>>>>>>                         into the code (see sConnectivity and
>>>>>>>>                         sInitialPartition) for 2 ranks, and we try
>>>>>>>>                         to create a section in order to use the
>>>>>>>>                         DMPlexNaturalToGlobalBegin function to
>>>>>>>>                         retrieve our initial element numbers.
>>>>>>>>
>>>>>>>>                         Now the call to DMPlexDistribute gives
>>>>>>>>                         different errors depending on what type
>>>>>>>>                         of component we ask the field to be
>>>>>>>>                         created on.  For our objective, we would
>>>>>>>>                         like a global field to be created on
>>>>>>>>                         elements only (like a P0 interpolation).
>>>>>>>>
>>>>>>>>                         We now have the following error generated:
>>>>>>>>
>>>>>>>>                         [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>>>>>                         [0]PETSC ERROR: Petsc has generated inconsistent data
>>>>>>>>                         [0]PETSC ERROR: Inconsistency in indices, 18 should be 17
>>>>>>>>                         [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>>>>>                         [0]PETSC ERROR: Petsc Release Version 3.15.0, Mar 30, 2021
>>>>>>>>                         [0]PETSC ERROR: ./bug on a  named rohan by ericc Wed Oct 20 14:52:36 2021
>>>>>>>>                         [0]PETSC ERROR: Configure options --prefix=/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7 --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.1.0_gcc7 --with-cxx-dialect=C++14 --with-make-np=12 --with-shared-libraries=1 --with-debugging=yes --with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes --download-mumps=yes --download-superlu=yes --download-hpddm=yes --download-slepc=yes --download-superlu_dist=yes --download-parmetis=yes --download-ptscotch=yes --download-metis=yes --download-strumpack=yes --download-suitesparse=yes --download-hypre=yes --with-blaslapack-dir=/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64 --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/.. --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/.. --with-scalapack=1 --with-scalapack-include=/opt/intel/oneapi/mkl/2021.1.1/env/../include --with-scalapack-lib="-L/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
>>>>>>>>                         [0]PETSC ERROR: #1 PetscSFCreateSectionSF() at /tmp/ompi-opt/petsc-3.15.0-debug/src/vec/is/sf/utils/sfutils.c:409
>>>>>>>>                         [0]PETSC ERROR: #2 DMPlexCreateGlobalToNaturalSF() at /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexnatural.c:184
>>>>>>>>                         [0]PETSC ERROR: #3 DMPlexDistribute() at /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexdistribute.c:1733
>>>>>>>>                         [0]PETSC ERROR: #4 main() at bug_section.cc:159
>>>>>>>>                         [0]PETSC ERROR: No PETSc Option Table entries
>>>>>>>>                         [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>>>>>>>
>>>>>>>>                         We hope the attached code is
>>>>>>>>                         self-explanatory; note that to keep it
>>>>>>>>                         short, we have not included the final
>>>>>>>>                         part of it, just the buggy part we are
>>>>>>>>                         encountering right now...
>>>>>>>>
>>>>>>>>                         Thanks for your insights,
>>>>>>>>
>>>>>>>>                     Thanks for making the example. I tweaked it
>>>>>>>>                     slightly. I put in a test case that just
>>>>>>>>                     makes a parallel 7 x 10 quad mesh. This works
>>>>>>>>                     fine. Thus I think it must be something
>>>>>>>>                     connected with the original mesh. It is
>>>>>>>>                     hard to get a handle on it without the
>>>>>>>>                     coordinates.
>>>>>>>>                     Do you think you could put the coordinate
>>>>>>>>                     array in? I have added the code to load
>>>>>>>>                     them (see attached file).
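>>>>>>>>
>>>>>>>>                     (For reference, such a parallel box mesh can
>>>>>>>>                     be built with something like the sketch
>>>>>>>>                     below; names and arguments are illustrative,
>>>>>>>>                     error checking omitted.)
>>>>>>>>
>>>>>>>>                       PetscInt faces[2] = {7, 10};
>>>>>>>>                       DM       dm;
>>>>>>>>                       DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_FALSE /* quads */, faces,
>>>>>>>>                                           NULL, NULL, NULL, PETSC_TRUE, &dm);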
>>>>>>>>
>>>>>>>>                       Thanks,
>>>>>>>>
>>>>>>>>                      Matt
>>>>>>>>
>>>>>>>>                         Eric
>>>>>>>>
>>>>>>>>                         On 2021-10-06 9:23 p.m., Matthew
>>>>>>>>                         Knepley wrote:
>>>>>>>>>                         On Wed, Oct 6, 2021 at 5:43 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>>>>>
>>>>>>>>>                             Hi Matthew,
>>>>>>>>>
>>>>>>>>>                             we tried to use that. Now, we
>>>>>>>>>                             discovered that:
>>>>>>>>>
>>>>>>>>>                             1- even if we "ask" for sfNatural
>>>>>>>>>                             creation with DMSetUseNatural, it
>>>>>>>>>                             is not created because
>>>>>>>>>                             DMPlexCreateGlobalToNaturalSF
>>>>>>>>>                             looks for a "section": this is not
>>>>>>>>>                             documented in DMSetUseNatural, so we
>>>>>>>>>                             are asking ourselves: "is this a
>>>>>>>>>                             permanent feature or a temporary
>>>>>>>>>                             situation?"
>>>>>>>>>
>>>>>>>>>                         I think explaining this will help
>>>>>>>>>                         clear up a lot.
>>>>>>>>>
>>>>>>>>>                         What the Natural2Global map does is
>>>>>>>>>                         permute a solution vector into the
>>>>>>>>>                         ordering that it would have had prior
>>>>>>>>>                         to mesh distribution.
>>>>>>>>>                         Now, in order to do this permutation,
>>>>>>>>>                         I need to know the original (global)
>>>>>>>>>                         data layout. If it is not specified
>>>>>>>>>                         _before_ distribution, we
>>>>>>>>>                         cannot build the permutation. The
>>>>>>>>>                         section describes the data layout, so
>>>>>>>>>                         I need it before distribution.
>>>>>>>>>
>>>>>>>>>                         I cannot think of another way that you
>>>>>>>>>                         would implement this, but if you want
>>>>>>>>>                         something else, let me know.
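>>>>>>>>>
>>>>>>>>>                         For example, a minimal pre-distribution
>>>>>>>>>                         layout with one value per cell could be
>>>>>>>>>                         described like this (only a sketch,
>>>>>>>>>                         with illustrative names and no error
>>>>>>>>>                         checking):
>>>>>>>>>
>>>>>>>>>                           PetscSection s;
>>>>>>>>>                           PetscSF      migrationSF;
>>>>>>>>>                           DM           dmDist;
>>>>>>>>>                           PetscInt     cStart, cEnd, c;
>>>>>>>>>
>>>>>>>>>                           DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);  /* cells */
>>>>>>>>>                           PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);
>>>>>>>>>                           PetscSectionSetChart(s, cStart, cEnd);
>>>>>>>>>                           for (c = cStart; c < cEnd; ++c) PetscSectionSetDof(s, c, 1);
>>>>>>>>>                           PetscSectionSetUp(s);
>>>>>>>>>                           DMSetLocalSection(dm, s);   /* layout is known before distribution */
>>>>>>>>>                           PetscSectionDestroy(&s);
>>>>>>>>>
>>>>>>>>>                           DMSetUseNatural(dm, PETSC_TRUE);
>>>>>>>>>                           DMPlexDistribute(dm, 0, &migrationSF, &dmDist);  /* natural SF is built here */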
>>>>>>>>>
>>>>>>>>>                             2- We then tried to create a
>>>>>>>>>                             "section" in different manners: we
>>>>>>>>>                             took the code from the example
>>>>>>>>>                             petsc/src/dm/impls/plex/tests/ex15.c.
>>>>>>>>>                             However, we ended up with a segfault:
>>>>>>>>>
>>>>>>>>>                             corrupted size vs. prev_size
>>>>>>>>>                             [rohan:07297] *** Process received signal ***
>>>>>>>>>                             [rohan:07297] Signal: Aborted (6)
>>>>>>>>>                             [rohan:07297] Signal code: (-6)
>>>>>>>>>                             [rohan:07297] [ 0] /lib64/libpthread.so.0(+0x13f80)[0x7f6f13be3f80]
>>>>>>>>>                             [rohan:07297] [ 1] /lib64/libc.so.6(gsignal+0x10b)[0x7f6f109b718b]
>>>>>>>>>                             [rohan:07297] [ 2] /lib64/libc.so.6(abort+0x175)[0x7f6f109b8585]
>>>>>>>>>                             [rohan:07297] [ 3] /lib64/libc.so.6(+0x7e2f7)[0x7f6f109fb2f7]
>>>>>>>>>                             [rohan:07297] [ 4] /lib64/libc.so.6(+0x857ea)[0x7f6f10a027ea]
>>>>>>>>>                             [rohan:07297] [ 5] /lib64/libc.so.6(+0x86036)[0x7f6f10a03036]
>>>>>>>>>                             [rohan:07297] [ 6] /lib64/libc.so.6(+0x861a3)[0x7f6f10a031a3]
>>>>>>>>>                             [rohan:07297] [ 7] /lib64/libc.so.6(+0x88740)[0x7f6f10a05740]
>>>>>>>>>                             [rohan:07297] [ 8] /lib64/libc.so.6(__libc_malloc+0x1b8)[0x7f6f10a070c8]
>>>>>>>>>                             [rohan:07297] [ 9] /lib64/libc.so.6(__backtrace_symbols+0x134)[0x7f6f10a8b064]
>>>>>>>>>                             [rohan:07297] [10] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z12reqBacktraceRNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE+0x4e)[0x4538ce]
>>>>>>>>>                             [rohan:07297] [11] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z15attacheDebuggerv+0x120)[0x4523c0]
>>>>>>>>>                             [rohan:07297] [12] /home/mefpp_ericc/GIREF/lib/libgiref_dev_Util.so(traitementSignal+0x612)[0x7f6f28f503a2]
>>>>>>>>>                             [rohan:07297] [13] /lib64/libc.so.6(+0x3a210)[0x7f6f109b7210]
>>>>>>>>>                             [rohan:07297] [14] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscTrMallocDefault+0x6fd)[0x7f6f22f1b8ed]
>>>>>>>>>                             [rohan:07297] [15] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscMallocA+0x5cd)[0x7f6f22f19c2d]
>>>>>>>>>                             [rohan:07297] [16] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscSFCreateSectionSF+0xb48)[0x7f6f23268e18]
>>>>>>>>>                             [rohan:07297] [17] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexCreateGlobalToNaturalSF+0x13b2)[0x7f6f241a5602]
>>>>>>>>>                             [rohan:07297] [18] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexDistribute+0x39b1)[0x7f6f23fdca21]
>>>>>>>>>
>>>>>>>>>                         I am not sure what happened here, but
>>>>>>>>>                         if you could send a sample code, I
>>>>>>>>>                         will figure it out.
>>>>>>>>>
>>>>>>>>>                             If we do not create a section, the
>>>>>>>>>                             call to DMPlexDistribute is
>>>>>>>>>                             successful, but
>>>>>>>>>                             DMPlexGetGlobalToNaturalSF returns
>>>>>>>>>                             a null SF pointer...
>>>>>>>>>
>>>>>>>>>                         Yes, it just ignores it in this case
>>>>>>>>>                         because it does not have a global layout.
>>>>>>>>>
>>>>>>>>>                             Here are the operations we are
>>>>>>>>>                             calling (this is almost the code
>>>>>>>>>                             we are using; I just removed
>>>>>>>>>                             verifications and the creation of the
>>>>>>>>>                             connectivity, which use our
>>>>>>>>>                             parallel structures and code):
>>>>>>>>>
>>>>>>>>>                             ===========
>>>>>>>>>
>>>>>>>>>                               PetscInt* lCells      = 0;
>>>>>>>>>                               PetscInt  lNumCorners = 0;
>>>>>>>>>                               PetscInt  lDimMail    = 0;
>>>>>>>>>                               PetscInt  lnumCells   = 0;
>>>>>>>>>
>>>>>>>>>                               // At this point we create the cells for the PETSc expected input of
>>>>>>>>>                               // DMPlexBuildFromCellListParallel and set lNumCorners, lDimMail
>>>>>>>>>                               // and lnumCells to correct values.
>>>>>>>>>                               ...
>>>>>>>>>
>>>>>>>>>                               DM lDMBete = 0;
>>>>>>>>>                               DMPlexCreate(lMPIComm, &lDMBete);
>>>>>>>>>
>>>>>>>>>                               DMSetDimension(lDMBete, lDimMail);
>>>>>>>>>
>>>>>>>>>                               DMPlexBuildFromCellListParallel(lDMBete,
>>>>>>>>>                                                               lnumCells,
>>>>>>>>>                                                               PETSC_DECIDE,
>>>>>>>>>                                                               pLectureElementsLocaux.reqNbTotalSommets(),
>>>>>>>>>                                                               lNumCorners,
>>>>>>>>>                                                               lCells,
>>>>>>>>>                                                               PETSC_NULL);
>>>>>>>>>
>>>>>>>>>                               DM lDMBeteInterp = 0;
>>>>>>>>>                               DMPlexInterpolate(lDMBete, &lDMBeteInterp);
>>>>>>>>>                               DMDestroy(&lDMBete);
>>>>>>>>>                               lDMBete = lDMBeteInterp;
>>>>>>>>>
>>>>>>>>>                               DMSetUseNatural(lDMBete, PETSC_TRUE);
>>>>>>>>>
>>>>>>>>>                               PetscSF lSFMigrationSansOvl = 0;
>>>>>>>>>                               PetscSF lSFMigrationOvl     = 0;
>>>>>>>>>                               DM      lDMDistribueSansOvl = 0;
>>>>>>>>>                               DM      lDMAvecOverlap      = 0;
>>>>>>>>>
>>>>>>>>>                               PetscPartitioner lPart;
>>>>>>>>>                               DMPlexGetPartitioner(lDMBete, &lPart);
>>>>>>>>>                               PetscPartitionerSetFromOptions(lPart);
>>>>>>>>>
>>>>>>>>>                               PetscSection section;
>>>>>>>>>                               PetscInt numFields   = 1;
>>>>>>>>>                               PetscInt numBC       = 0;
>>>>>>>>>                               PetscInt numComp[1]  = {1};
>>>>>>>>>                               PetscInt numDof[4]   = {1, 0, 0, 0};
>>>>>>>>>                               PetscInt bcFields[1] = {0};
>>>>>>>>>                               IS       bcPoints[1] = {NULL};
>>>>>>>>>
>>>>>>>>>                               DMSetNumFields(lDMBete, numFields);
>>>>>>>>>
>>>>>>>>>                               DMPlexCreateSection(lDMBete, NULL, numComp, numDof, numBC,
>>>>>>>>>                                                   bcFields, bcPoints, NULL, NULL, &section);
>>>>>>>>>                               DMSetLocalSection(lDMBete, section);
>>>>>>>>>
>>>>>>>>>                               DMPlexDistribute(lDMBete, 0, &lSFMigrationSansOvl,
>>>>>>>>>                                                &lDMDistribueSansOvl); // segfault!
>>>>>>>>>
>>>>>>>>>                             ===========
>>>>>>>>>
>>>>>>>>>                             So we have some other questions/remarks:
>>>>>>>>>
>>>>>>>>>                             3- Maybe PETSc expects something
>>>>>>>>>                             specific that is missing/not
>>>>>>>>>                             verified: for example, we didn't
>>>>>>>>>                             give any coordinates since we just
>>>>>>>>>                             want to partition and compute the
>>>>>>>>>                             overlap for the mesh... and then
>>>>>>>>>                             recover our element numbers in a
>>>>>>>>>                             "simple way".
>>>>>>>>>
>>>>>>>>>                             4- We are telling ourselves it is
>>>>>>>>>                             somewhat a "big price to pay" to
>>>>>>>>>                             have to build an unused section just to
>>>>>>>>>                             have the global-to-natural
>>>>>>>>>                             ordering set.  Could this
>>>>>>>>>                             requirement be avoided?
>>>>>>>>>
>>>>>>>>>                         I don't think so. There would have to
>>>>>>>>>                         be _some_ way of describing your data
>>>>>>>>>                         layout in terms of mesh points, and I
>>>>>>>>>                         do not see how you could use less
>>>>>>>>>                         memory doing that.
>>>>>>>>>
>>>>>>>>>                             5- Are there any improvements
>>>>>>>>>                             towards our usage in the 3.16 release?
>>>>>>>>>
>>>>>>>>>                         Let me try and run the code above.
>>>>>>>>>
>>>>>>>>>                           Thanks,
>>>>>>>>>
>>>>>>>>>                          Matt
>>>>>>>>>
>>>>>>>>>                             Thanks,
>>>>>>>>>
>>>>>>>>>                             Eric
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>                             On 2021-09-29 7:39 p.m., Matthew
>>>>>>>>>                             Knepley wrote:
>>>>>>>>>>                             On Wed, Sep 29, 2021 at 5:18 PM Eric Chamberland <Eric.Chamberland at giref.ulaval.ca> wrote:
>>>>>>>>>>
>>>>>>>>>>                                 Hi,
>>>>>>>>>>
>>>>>>>>>>                                 I come back with _almost_ the
>>>>>>>>>>                                 original question:
>>>>>>>>>>
>>>>>>>>>>                                 I would like to attach an
>>>>>>>>>>                                 integer (*our*
>>>>>>>>>>                                 original element
>>>>>>>>>>                                 number, not the PETSc one) to
>>>>>>>>>>                                 each element of the DMPlex I
>>>>>>>>>>                                 create with
>>>>>>>>>>                                 DMPlexBuildFromCellListParallel.
>>>>>>>>>>
>>>>>>>>>>                                 I would like this integer to
>>>>>>>>>>                                 be distributed by, or the
>>>>>>>>>>                                 same way as,
>>>>>>>>>>                                 DMPlexDistribute distributes
>>>>>>>>>>                                 the mesh.
>>>>>>>>>>
>>>>>>>>>>                                 Is it possible to do this?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>                             I think we already have support
>>>>>>>>>>                             for what you want. If you call
>>>>>>>>>>
>>>>>>>>>>                             https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural.html
>>>>>>>>>>
>>>>>>>>>>                             before DMPlexDistribute(), it
>>>>>>>>>>                             will compute a PetscSF encoding
>>>>>>>>>>                             the global to natural map. You
>>>>>>>>>>                             can get it with
>>>>>>>>>>
>>>>>>>>>>                             https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGetGlobalToNaturalSF.html
>>>>>>>>>>
>>>>>>>>>>                             and use it with
>>>>>>>>>>
>>>>>>>>>>                             https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html
>>>>>>>>>>
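>>>>>>>>>>                             In code form, that workflow is
>>>>>>>>>>                             roughly (a sketch with illustrative
>>>>>>>>>>                             names, declarations omitted; as
>>>>>>>>>>                             discussed above in this thread, a
>>>>>>>>>>                             section describing the data layout
>>>>>>>>>>                             must also be set before
>>>>>>>>>>                             distribution so the SF can be
>>>>>>>>>>                             built):
>>>>>>>>>>
>>>>>>>>>>                               DMSetUseNatural(dm, PETSC_TRUE);  /* before distributing */
>>>>>>>>>>                               DMPlexDistribute(dm, 0, &migrationSF, &dmDist);
>>>>>>>>>>                               DMPlexGetGlobalToNaturalSF(dmDist, &naturalSF);
>>>>>>>>>>
>>>>>>>>>>                               /* permute a global Vec on dmDist back to the original (natural) ordering */
>>>>>>>>>>                               DMPlexGlobalToNaturalBegin(dmDist, globalVec, naturalVec);
>>>>>>>>>>                               DMPlexGlobalToNaturalEnd(dmDist, globalVec, naturalVec);
>>>>>>>>>>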
>>>>>>>>>>                             Is this sufficient?
>>>>>>>>>>
>>>>>>>>>>                               Thanks,
>>>>>>>>>>
>>>>>>>>>>                                  Matt
>>>>>>>>>>
>>>>>>>>>>                                 Thanks,
>>>>>>>>>>
>>>>>>>>>>                                 Eric
>>>>>>>>>>
>>>>>>>>>>                                 On 2021-07-14 1:18 p.m., Eric
>>>>>>>>>>                                 Chamberland wrote:
>>>>>>>>>>                                 > Hi,
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > I want to use DMPlexDistribute from PETSc to compute overlap
>>>>>>>>>>                                 > and to play with the different partitioners supported.
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > However, after calling DMPlexDistribute, I noticed the elements are
>>>>>>>>>>                                 > renumbered and the original numbers are lost.
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > What would be the best way to keep track of the element renumbering?
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > a) Adding an optional parameter to let the user retrieve a vector or
>>>>>>>>>>                                 > an "IS" giving the old numbers?
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > b) Adding a DMLabel (seems like the wrong solution)
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > c) Other idea?
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > Of course, I don't want to lose performance because of this
>>>>>>>>>>                                 > "mapping"...
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > Thanks,
>>>>>>>>>>                                 >
>>>>>>>>>>                                 > Eric
>>>>>>>>>>                                 >
>>>>>>>>>>                                 -- 
>>>>>>>>>>                                 Eric Chamberland, ing., M. Ing
>>>>>>>>>>                                 Professionnel de recherche
>>>>>>>>>>                                 GIREF/Université Laval
>>>>>>>>>>                                 (418) 656-2131 poste 41 22 42
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>                             -- 
>>>>>>>>>>                             What most experimenters take for
>>>>>>>>>>                             granted before they begin their
>>>>>>>>>>                             experiments is infinitely more
>>>>>>>>>>                             interesting than any results to
>>>>>>>>>>                             which their experiments lead.
>>>>>>>>>>                             -- Norbert Wiener
>>>>>>>>>>
>>>>>>>>>>                             https://www.cse.buffalo.edu/~knepley/
>>>>>>>>>
>>>>>>>>>                             -- 
>>>>>>>>>                             Eric Chamberland, ing., M. Ing
>>>>>>>>>                             Professionnel de recherche
>>>>>>>>>                             GIREF/Université Laval
>>>>>>>>>                             (418) 656-2131 poste 41 22 42
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>                         -- 
>>>>>>>>>                         What most experimenters take for
>>>>>>>>>                         granted before they begin their
>>>>>>>>>                         experiments is infinitely more
>>>>>>>>>                         interesting than any results to which
>>>>>>>>>                         their experiments lead.
>>>>>>>>>                         -- Norbert Wiener
>>>>>>>>>
>>>>>>>>>                         https://www.cse.buffalo.edu/~knepley/
>>>>>>>>
>>>>>>>>                         -- 
>>>>>>>>                         Eric Chamberland, ing., M. Ing
>>>>>>>>                         Professionnel de recherche
>>>>>>>>                         GIREF/Université Laval
>>>>>>>>                         (418) 656-2131 poste 41 22 42
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>                     -- 
>>>>>>>>                     What most experimenters take for granted
>>>>>>>>                     before they begin their experiments is
>>>>>>>>                     infinitely more interesting than any
>>>>>>>>                     results to which their experiments lead.
>>>>>>>>                     -- Norbert Wiener
>>>>>>>>
>>>>>>>>                     https://www.cse.buffalo.edu/~knepley/
>>>>>>>
>>>>>>>                     -- 
>>>>>>>                     Eric Chamberland, ing., M. Ing
>>>>>>>                     Professionnel de recherche
>>>>>>>                     GIREF/Université Laval
>>>>>>>                     (418) 656-2131 poste 41 22 42
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>                 -- 
>>>>>>>                 What most experimenters take for granted before
>>>>>>>                 they begin their experiments is infinitely more
>>>>>>>                 interesting than any results to which their
>>>>>>>                 experiments lead.
>>>>>>>                 -- Norbert Wiener
>>>>>>>
>>>>>>>                 https://www.cse.buffalo.edu/~knepley/
>>>>>>
>>>>>>                 -- 
>>>>>>                 Eric Chamberland, ing., M. Ing
>>>>>>                 Professionnel de recherche
>>>>>>                 GIREF/Université Laval
>>>>>>                 (418) 656-2131 poste 41 22 42
>>>>>>
>>>>>>
>>>>>>
>>>>>>             -- 
>>>>>>             What most experimenters take for granted before they
>>>>>>             begin their experiments is infinitely more
>>>>>>             interesting than any results to which their
>>>>>>             experiments lead.
>>>>>>             -- Norbert Wiener
>>>>>>
>>>>>>             https://www.cse.buffalo.edu/~knepley/
>>>>>
>>>>>             -- 
>>>>>             Eric Chamberland, ing., M. Ing
>>>>>             Professionnel de recherche
>>>>>             GIREF/Université Laval
>>>>>             (418) 656-2131 poste 41 22 42
>>>>>
>>>>>
>>>>>
>>>>>         -- 
>>>>>         What most experimenters take for granted before they begin
>>>>>         their experiments is infinitely more interesting than any
>>>>>         results to which their experiments lead.
>>>>>         -- Norbert Wiener
>>>>>
>>>>>         https://www.cse.buffalo.edu/~knepley/
>>>>         -- 
>>>>         Eric Chamberland, ing., M. Ing
>>>>         Professionnel de recherche
>>>>         GIREF/Université Laval
>>>>         (418) 656-2131 poste 41 22 42
>>>         -- 
>>>         Eric Chamberland, ing., M. Ing
>>>         Professionnel de recherche
>>>         GIREF/Université Laval
>>>         (418) 656-2131 poste 41 22 42
>>
>>         -- 
>>         Eric Chamberland, ing., M. Ing
>>         Professionnel de recherche
>>         GIREF/Université Laval
>>         (418) 656-2131 poste 41 22 42
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>     -- 
>     Eric Chamberland, ing., M. Ing
>     Professionnel de recherche
>     GIREF/Université Laval
>     (418) 656-2131 poste 41 22 42
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ 

-- 
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42

-------------- next part --------------
A non-text attachment was scrubbed...
Name: hbnbhlbilhmjdpfg.png
Type: image/png
Size: 42972 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20211102/47d3402d/attachment-0002.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: eejjfmbjimlkboec.png
Type: image/png
Size: 87901 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20211102/47d3402d/attachment-0003.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ex44.c
Type: text/x-csrc
Size: 13682 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20211102/47d3402d/attachment-0001.bin>

