<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
  </head>
  <body>
    <p>Hi Matthew,</p>
    <p>We tried to reproduce the error in a simple example.</p>
    <p>The context is the following: we hard-coded the mesh and initial
      partition into the code (see sConnectivity and sInitialPartition)
      for 2 ranks, and we try to create a section in order to use the
      DMPlexNaturalToGlobalBegin function to retrieve our initial
      element numbers.</p>
    <p>Now the call to DMPlexDistribute gives different errors depending
      on the type of component we ask the field to be created with.  For
      our purposes, we would like a global field created on elements
      only (like a P0 interpolation).<br>
    </p>
    <p>We now have the following error generated:</p>
    <p>[0]PETSC ERROR: --------------------- Error Message
      --------------------------------------------------------------<br>
      [0]PETSC ERROR: Petsc has generated inconsistent data<br>
      [0]PETSC ERROR: Inconsistency in indices, 18 should be 17<br>
      [0]PETSC ERROR: See
      <a class="moz-txt-link-freetext" href="https://www.mcs.anl.gov/petsc/documentation/faq.html">https://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
      shooting.<br>
      [0]PETSC ERROR: Petsc Release Version 3.15.0, Mar 30, 2021 <br>
      [0]PETSC ERROR: ./bug on a  named rohan by ericc Wed Oct 20
      14:52:36 2021<br>
      [0]PETSC ERROR: Configure options
      --prefix=/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7
      --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.1.0_gcc7
      --with-cxx-dialect=C++14 --with-make-np=12
      --with-shared-libraries=1 --with-debugging=yes --with-memalign=64
      --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes
      --download-mumps=yes --download-superlu=yes --download-hpddm=yes
      --download-slepc=yes --download-superlu_dist=yes
      --download-parmetis=yes --download-ptscotch=yes
      --download-metis=yes --download-strumpack=yes
      --download-suitesparse=yes --download-hypre=yes
      --with-blaslapack-dir=/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64
      --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/..
      --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/..
      --with-scalapack=1
      --with-scalapack-include=/opt/intel/oneapi/mkl/2021.1.1/env/../include
--with-scalapack-lib="-L/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64
      -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"<br>
      [0]PETSC ERROR: #1 PetscSFCreateSectionSF() at
      /tmp/ompi-opt/petsc-3.15.0-debug/src/vec/is/sf/utils/sfutils.c:409<br>
      [0]PETSC ERROR: #2 DMPlexCreateGlobalToNaturalSF() at
      /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexnatural.c:184<br>
      [0]PETSC ERROR: #3 DMPlexDistribute() at
      /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexdistribute.c:1733<br>
      [0]PETSC ERROR: #4 main() at bug_section.cc:159<br>
      [0]PETSC ERROR: No PETSc Option Table entries<br>
      [0]PETSC ERROR: ----------------End of Error Message -------send
      entire error message to <a class="moz-txt-link-abbreviated" href="mailto:petsc-maint@mcs.anl.gov">petsc-maint@mcs.anl.gov</a>----------<br>
      <br>
    </p>
    <p>We hope the attached code is self-explanatory; note that to keep
      it short, we have not included its final part, just the buggy
      part we are encountering right now...</p>
    <p>Thanks for your insights,</p>
    <p>Eric<br>
    </p>
    <div class="moz-cite-prefix">On 2021-10-06 9:23 p.m., Matthew
      Knepley wrote:<br>
    </div>
    <blockquote type="cite"
cite="mid:CAMYG4G=kJeKo=Yqk2fjPk=30j0+Eq-96K=PRSJqrx7K2ScEufg@mail.gmail.com">
      <meta http-equiv="content-type" content="text/html; charset=UTF-8">
      <div dir="ltr">
        <div dir="ltr">On Wed, Oct 6, 2021 at 5:43 PM Eric Chamberland
          <<a href="mailto:Eric.Chamberland@giref.ulaval.ca"
            moz-do-not-send="true">Eric.Chamberland@giref.ulaval.ca</a>>
          wrote:<br>
        </div>
        <div class="gmail_quote">
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p>Hi Matthew,</p>
              <p>We tried to use that.  Now, we discovered that:</p>
              <p>1- even if we "ask" for sfNatural creation with
                DMSetUseNatural, it is not created because
                DMPlexCreateGlobalToNaturalSF looks for a "section":
                this is not documented in DMSetUseNatural, so we are
                asking ourselves: "is this a permanent feature or a
                temporary situation?"<br>
              </p>
            </div>
          </blockquote>
          <div>I think explaining this will help clear up a lot.</div>
          <div><br>
          </div>
          <div>What the Natural2Global map does is permute a solution
            vector into the ordering that it would have had prior to
            mesh distribution.</div>
          <div>Now, in order to do this permutation, I need to know the
            original (global) data layout. If it is not specified
            _before_ distribution, we</div>
          <div>cannot build the permutation.  The section describes the
            data layout, so I need it before distribution.</div>
          <div><br>
          </div>
          <div>I cannot think of another way that you would implement
            this, but if you want something else, let me know.</div>
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p>2- We then tried to create a "section" in different
                manners: we took the code from the example
                petsc/src/dm/impls/plex/tests/ex15.c.  However, we ended
                up with a segfault:</p>
              <p>corrupted size vs. prev_size<br>
                [rohan:07297] *** Process received signal ***<br>
                [rohan:07297] Signal: Aborted (6)<br>
                [rohan:07297] Signal code:  (-6)<br>
                [rohan:07297] [ 0]
                /lib64/libpthread.so.0(+0x13f80)[0x7f6f13be3f80]<br>
                [rohan:07297] [ 1]
                /lib64/libc.so.6(gsignal+0x10b)[0x7f6f109b718b]<br>
                [rohan:07297] [ 2]
                /lib64/libc.so.6(abort+0x175)[0x7f6f109b8585]<br>
                [rohan:07297] [ 3]
                /lib64/libc.so.6(+0x7e2f7)[0x7f6f109fb2f7]<br>
                [rohan:07297] [ 4]
                /lib64/libc.so.6(+0x857ea)[0x7f6f10a027ea]<br>
                [rohan:07297] [ 5]
                /lib64/libc.so.6(+0x86036)[0x7f6f10a03036]<br>
                [rohan:07297] [ 6]
                /lib64/libc.so.6(+0x861a3)[0x7f6f10a031a3]<br>
                [rohan:07297] [ 7]
                /lib64/libc.so.6(+0x88740)[0x7f6f10a05740]<br>
                [rohan:07297] [ 8]
                /lib64/libc.so.6(__libc_malloc+0x1b8)[0x7f6f10a070c8]<br>
                [rohan:07297] [ 9]
                /lib64/libc.so.6(__backtrace_symbols+0x134)[0x7f6f10a8b064]<br>
                [rohan:07297] [10]
/home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z12reqBacktraceRNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE+0x4e)[0x4538ce]<br>
                [rohan:07297] [11]
/home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z15attacheDebuggerv+0x120)[0x4523c0]<br>
                [rohan:07297] [12]
/home/mefpp_ericc/GIREF/lib/libgiref_dev_Util.so(traitementSignal+0x612)[0x7f6f28f503a2]<br>
                [rohan:07297] [13]
                /lib64/libc.so.6(+0x3a210)[0x7f6f109b7210]</p>
              <p>[rohan:07297] [14]
/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscTrMallocDefault+0x6fd)[0x7f6f22f1b8ed]<br>
                [rohan:07297] [15]
/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscMallocA+0x5cd)[0x7f6f22f19c2d]<br>
                [rohan:07297] [16]
/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscSFCreateSectionSF+0xb48)[0x7f6f23268e18]<br>
                [rohan:07297] [17]
/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexCreateGlobalToNaturalSF+0x13b2)[0x7f6f241a5602]<br>
                [rohan:07297] [18]
/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexDistribute+0x39b1)[0x7f6f23fdca21]<br>
              </p>
            </div>
          </blockquote>
          <div>I am not sure what happened here, but if you could send a
            sample code, I will figure it out. </div>
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p>If we do not create a section, the call to
                DMPlexDistribute is successful, but
                DMPlexGetGlobalToNaturalSF returns a null SF pointer...<br>
              </p>
            </div>
          </blockquote>
          <div>Yes, it just ignores it in this case because it does not
            have a global layout. </div>
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p> </p>
              <p>Here are the operations we are calling (this is almost
                the code we are using; I just removed the verifications
                and the creation of the connectivity, which use our
                parallel structures and code):<br>
              <p>===========</p>
              <p>  PetscInt* lCells      = 0;<br>
                  PetscInt  lNumCorners = 0;<br>
                  PetscInt  lDimMail    = 0;<br>
                  PetscInt  lnumCells   = 0;<br>
                <br>
                  //At this point we create the cells in the input
                format PETSc expects for DMPlexBuildFromCellListParallel
                and set lNumCorners, lDimMail and lnumCells to the
                correct values.<br>
                  ...<br>
                  <br>
                  DM       lDMBete = 0;<br>
                  DMPlexCreate(lMPIComm,&lDMBete);<br>
                <br>
                  DMSetDimension(lDMBete, lDimMail);<br>
                <br>
                  DMPlexBuildFromCellListParallel(lDMBete,<br>
                                                  lnumCells,<br>
                                                  PETSC_DECIDE,<br>
                                                  pLectureElementsLocaux.reqNbTotalSommets(),<br>
                                                  lNumCorners,<br>
                                                  lCells,<br>
                                                  PETSC_NULL);<br>
                <br>
                  DM lDMBeteInterp = 0;<br>
                  DMPlexInterpolate(lDMBete, &lDMBeteInterp);<br>
                  DMDestroy(&lDMBete);<br>
                  lDMBete = lDMBeteInterp;<br>
                <br>
                  DMSetUseNatural(lDMBete,PETSC_TRUE);<br>
                <br>
                  PetscSF lSFMigrationSansOvl = 0;<br>
                  PetscSF lSFMigrationOvl = 0;<br>
                  DM lDMDistribueSansOvl = 0;<br>
                  DM lDMAvecOverlap = 0;<br>
                <br>
                  PetscPartitioner lPart;<br>
                  DMPlexGetPartitioner(lDMBete, &lPart);<br>
                  PetscPartitionerSetFromOptions(lPart);<br>
                <br>
                  PetscSection   section;<br>
                  PetscInt       numFields   = 1;<br>
                  PetscInt       numBC       = 0;<br>
                  PetscInt       numComp[1]  = {1};<br>
                  PetscInt       numDof[4]   = {1, 0, 0, 0};<br>
                  PetscInt       bcFields[1] = {0};<br>
                  IS             bcPoints[1] = {NULL};<br>
                <br>
                  DMSetNumFields(lDMBete, numFields);<br>
                <br>
                  DMPlexCreateSection(lDMBete, NULL, numComp, numDof,
                numBC, bcFields, bcPoints, NULL, NULL, &section);<br>
                  DMSetLocalSection(lDMBete, section);<br>
                <br>
                  DMPlexDistribute(lDMBete, 0, &lSFMigrationSansOvl,
                &lDMDistribueSansOvl); // segfault!<br>
              </p>
              <p>===========<br>
              </p>
              <p>So we have other question/remarks:</p>
              <p>3- Maybe PETSc expects something specific that is
                missing/not verified: for example, we didn't give any
                coordinates, since we just want to partition and compute
                the overlap for the mesh... and then recover our element
                numbers in a "simple way"</p>
              <p>4- We are telling ourselves it is somewhat a "big price
                to pay" to have to build an otherwise unused section
                just to have the global-to-natural ordering set.  Could
                this requirement be avoided?</p>
            </div>
          </blockquote>
          <div>I don't think so. There would have to be _some_ way of
            describing your data layout in terms of mesh points, and I
            do not see how you could use less memory doing that. </div>
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p>5- Are there any improvements for our use case in the
                3.16 release?</p>
            </div>
          </blockquote>
          <div>Let me try and run the code above.</div>
          <div><br>
          </div>
          <div>  Thanks,</div>
          <div><br>
          </div>
          <div>     Matt </div>
          <blockquote class="gmail_quote" style="margin:0px 0px 0px
            0.8ex;border-left:1px solid
            rgb(204,204,204);padding-left:1ex">
            <div>
              <p>Thanks,</p>
              <p>Eric</p>
              <p><br>
              </p>
              <div>On 2021-09-29 7:39 p.m., Matthew Knepley wrote:<br>
              </div>
              <blockquote type="cite">
                <div dir="ltr">
                  <div dir="ltr">On Wed, Sep 29, 2021 at 5:18 PM Eric
                    Chamberland <<a
                      href="mailto:Eric.Chamberland@giref.ulaval.ca"
                      target="_blank" moz-do-not-send="true">Eric.Chamberland@giref.ulaval.ca</a>>
                    wrote:<br>
                  </div>
                  <div class="gmail_quote">
                    <blockquote class="gmail_quote" style="margin:0px
                      0px 0px 0.8ex;border-left:1px solid
                      rgb(204,204,204);padding-left:1ex">Hi,<br>
                      <br>
                      I come back with _almost_ the original question:<br>
                      <br>
                      I would like to attach an integer (*our*
                      original element <br>
                      number, not the PETSc one) to each element of the
                      DMPlex I create with <br>
                      DMPlexBuildFromCellListParallel.<br>
                      <br>
                      I would like this integer to be distributed by,
                      or in the same way as, <br>
                      DMPlexDistribute distributes the mesh.<br>
                      <br>
                      Is it possible to do this?<br>
                    </blockquote>
                    <div><br>
                    </div>
                    <div>I think we already have support for what you
                      want. If you call</div>
                    <div><br>
                    </div>
                    <div>  <a
                        href="https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural.html"
                        target="_blank" moz-do-not-send="true">https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural.html</a></div>
                    <div><br>
                    </div>
                    <div>before DMPlexDistribute(), it will compute a
                      PetscSF encoding the global to natural map. You</div>
                    <div>can get it with</div>
                    <div><br>
                    </div>
                    <div>  <a
href="https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGetGlobalToNaturalSF.html"
                        target="_blank" moz-do-not-send="true">https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGetGlobalToNaturalSF.html</a></div>
                    <div><br>
                    </div>
                    <div>and use it with</div>
                    <div><br>
                    </div>
                    <div>  <a
href="https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html"
                        target="_blank" moz-do-not-send="true">https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html</a></div>
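                    <div>Put together, the intended call sequence would
                      look roughly like this (a sketch only, untested;
                      the names dm, section, globalVec, and naturalVec
                      are placeholders for objects assumed to exist
                      already, and PETSc error checking is omitted):</div>

```c
/* Sketch of the suggested workflow; all variable names are placeholders
 * and error checking is omitted for brevity. */
DMSetUseNatural(dm, PETSC_TRUE);   /* request the global-to-natural map */
DMSetLocalSection(dm, section);    /* layout must be known before distribution */

PetscSF migrationSF;
DM      dmDist;
DMPlexDistribute(dm, 0, &migrationSF, &dmDist);

PetscSF sfNatural;
DMPlexGetGlobalToNaturalSF(dmDist, &sfNatural);  /* NULL if no section was set */

/* View a global vector in the original (natural) ordering: */
DMPlexGlobalToNaturalBegin(dmDist, globalVec, naturalVec);
DMPlexGlobalToNaturalEnd(dmDist, globalVec, naturalVec);
```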
                    <div><br>
                    </div>
                    <div>Is this sufficient?</div>
                    <div><br>
                    </div>
                    <div>  Thanks,</div>
                    <div><br>
                    </div>
                    <div>     Matt</div>
                    <div> </div>
                    <blockquote class="gmail_quote" style="margin:0px
                      0px 0px 0.8ex;border-left:1px solid
                      rgb(204,204,204);padding-left:1ex"> Thanks,<br>
                      <br>
                      Eric<br>
                      <br>
                      On 2021-07-14 1:18 p.m., Eric Chamberland wrote:<br>
                      > Hi,<br>
                      ><br>
                      > I want to use DMPlexDistribute from PETSc for
                      computing overlaps <br>
                      > and playing with the different partitioners
                      supported.<br>
                      ><br>
                      > However, after calling DMPlexDistribute, I
                      noticed the elements are <br>
                      > renumbered and the original numbers are
                      lost.<br>
                      ><br>
                      > What would be the best way to keep track of
                      the element renumbering?<br>
                      ><br>
                      > a) Adding an optional parameter to let the
                      user retrieve a vector or <br>
                      > "IS" giving the old number?<br>
                      ><br>
                      > b) Adding a DMLabel (seems like a poor
                      solution)<br>
                      ><br>
                      > c) Other idea?<br>
                      ><br>
                      > Of course, I don't want to lose performance
                      with the need for this <br>
                      > "mapping"...<br>
                      ><br>
                      > Thanks,<br>
                      ><br>
                      > Eric<br>
                      ><br>
                      -- <br>
                      Eric Chamberland, ing., M. Ing<br>
                      Professionnel de recherche<br>
                      GIREF/Université Laval<br>
                      (418) 656-2131 poste 41 22 42<br>
                      <br>
                    </blockquote>
                  </div>
                  <br clear="all">
                  <div><br>
                  </div>
                  -- <br>
                  <div dir="ltr">
                    <div dir="ltr">
                      <div>
                        <div dir="ltr">
                          <div>
                            <div dir="ltr">
                              <div>What most experimenters take for
                                granted before they begin their
                                experiments is infinitely more
                                interesting than any results to which
                                their experiments lead.<br>
                                -- Norbert Wiener</div>
                              <div><br>
                              </div>
                              <div><a
                                  href="http://www.cse.buffalo.edu/~knepley/"
                                  target="_blank" moz-do-not-send="true">https://www.cse.buffalo.edu/~knepley/</a><br>
                              </div>
                            </div>
                          </div>
                        </div>
                      </div>
                    </div>
                  </div>
                </div>
              </blockquote>
              <pre cols="72">-- 
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42</pre>
            </div>
          </blockquote>
        </div>
        <br clear="all">
        <div><br>
        </div>
        -- <br>
        <div dir="ltr" class="gmail_signature">
          <div dir="ltr">
            <div>
              <div dir="ltr">
                <div>
                  <div dir="ltr">
                    <div>What most experimenters take for granted before
                      they begin their experiments is infinitely more
                      interesting than any results to which their
                      experiments lead.<br>
                      -- Norbert Wiener</div>
                    <div><br>
                    </div>
                    <div><a href="http://www.cse.buffalo.edu/~knepley/"
                        target="_blank" moz-do-not-send="true">https://www.cse.buffalo.edu/~knepley/</a><br>
                    </div>
                  </div>
                </div>
              </div>
            </div>
          </div>
        </div>
      </div>
    </blockquote>
    <pre class="moz-signature" cols="72">-- 
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42</pre>
  </body>
</html>