[petsc-users] TR: [petsc-dev] DMPlexCreateGlobalToNaturalSF and partitioners
Alexis Marboeuf
alexis.marboeuf at hotmail.fr
Thu Jan 6 16:51:38 CST 2022
Hi Matt,
First, I wish a Happy New Year 2022 to you and to the whole PETSc developer team.
While I wait until I can push my branch to the remote repository (I will post on petsc-dev just after), the error in DMPlexCreateGlobalToNaturalSF with the parmetis or ptscotch partitioners can be reproduced with src/dm/impls/plex/tests/ex44.c on the branch knepley/fix-plex-g2n. I simply defined 2 fields instead of 1 in ex44.c:
const PetscInt Nf = 2;
const PetscInt numComp[2] = {1, dim};
const PetscInt numDof[6] = {1, 0, 0, 0, 0, dim};
on lines 284 to 286. I then get the error:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Input array needs to be sorted
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.1-235-g50d0f7a46d GIT Date: 2021-12-19 19:19:22 -0500
[0]PETSC ERROR: ../ex44 on a arch-darwin-c-debug named marboeua-1.math.mcmaster.ca by alexismarboeuf Thu Jan 6 17:23:54 2022
[0]PETSC ERROR: Configure options --download-chaco=1 --download-exodusii=1 --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 --download-ml=1 --download-netcdf=1 --download-pnetcdf=1 --download-sieve=1 --download-sowing=1 --download-yaml=1 --download-zlib=1 --download-metis=1 --download-parmetis=1 --download-ptscotch=1 --with-boost-dir=/opt/homebrew/Cellar/boost/1.76.0 --with-boost=1 --with-c2html=0 --with-debugging=1 --with-fortran-datatypes=1 --with-mpi-dir=/opt/homebrew/Cellar/mpich/3.4.3 --with-ranlib=ranlib --with-x11=1
[0]PETSC ERROR: #1 PetscSortedRemoveDupsInt() at /Users/alexismarboeuf/Documents/petsc/src/sys/utils/sorti.c:308
[0]PETSC ERROR: #2 PetscSFCreateEmbeddedLeafSF() at /Users/alexismarboeuf/Documents/petsc/src/vec/is/sf/interface/sf.c:1405
[0]PETSC ERROR: #3 DMPlexCreateGlobalToNaturalSF() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/plexnatural.c:174
[0]PETSC ERROR: #4 DMPlexDistribute() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/plexdistribute.c:1747
[0]PETSC ERROR: #5 main() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/tests/ex44.c:299
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR: Input array needs to be sorted
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.16.1-235-g50d0f7a46d GIT Date: 2021-12-19 19:19:22 -0500
[1]PETSC ERROR: ../ex44 on a arch-darwin-c-debug named marboeua-1.math.mcmaster.ca by alexismarboeuf Thu Jan 6 17:23:54 2022
[1]PETSC ERROR: Configure options --download-chaco=1 --download-exodusii=1 --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 --download-ml=1 --download-netcdf=1 --download-pnetcdf=1 --download-sieve=1 --download-sowing=1 --download-yaml=1 --download-zlib=1 --download-metis=1 --download-parmetis=1 --download-ptscotch=1 --with-boost-dir=/opt/homebrew/Cellar/boost/1.76.0 --with-boost=1 --with-c2html=0 --with-debugging=1 --with-fortran-datatypes=1 --with-mpi-dir=/opt/homebrew/Cellar/mpich/3.4.3 --with-ranlib=ranlib --with-x11=1
[1]PETSC ERROR: #1 PetscSortedRemoveDupsInt() at /Users/alexismarboeuf/Documents/petsc/src/sys/utils/sorti.c:308
[1]PETSC ERROR: #2 PetscSFCreateEmbeddedLeafSF() at /Users/alexismarboeuf/Documents/petsc/src/vec/is/sf/interface/sf.c:1405
[1]PETSC ERROR: #3 DMPlexCreateGlobalToNaturalSF() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/plexnatural.c:174
[1]PETSC ERROR: #4 DMPlexDistribute() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/plexdistribute.c:1747
[1]PETSC ERROR: #5 main() at /Users/alexismarboeuf/Documents/petsc/src/dm/impls/plex/tests/ex44.c:299
[1]PETSC ERROR: PETSc Option Table entries:
[1]PETSC ERROR: -field
[1]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -field
[0]PETSC ERROR: -petscpartitioner_type parmetis
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
-petscpartitioner_type parmetis
[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
Abort(62) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
Abort(62) on node 1 (rank 1 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 62) - process 1
when running
mpiexec -n 2 ../ex44 -field -petscpartitioner_type parmetis
in $PETSC_DIR/$PETSC_ARCH/tests/dm/impls/plex/tests/runex44_0.
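For reference, here is a minimal sketch of how those three arrays are typically consumed when building the local section, in the style of ex44.c; the exact surrounding code in the test may differ, and the helper name below is only illustrative.

#include <petscdmplex.h>

/* Hedged sketch: set up a two-field local section on a 2D DMPlex, in the
   style of ex44.c. Field 0: 1 scalar dof per vertex; field 1: dim dofs per
   cell. Assumes `dm` has already been created; the helper name is illustrative. */
static PetscErrorCode SetupTwoFieldSection(DM dm)
{
  PetscSection   section;
  const PetscInt dim        = 2;
  const PetscInt Nf         = 2;                    /* two fields instead of one */
  const PetscInt numComp[2] = {1, dim};             /* components per field */
  const PetscInt numDof[6]  = {1, 0, 0, 0, 0, dim}; /* dofs per field per point dimension */
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMSetNumFields(dm, Nf);CHKERRQ(ierr);
  ierr = DMPlexCreateSection(dm, NULL, numComp, numDof, 0, NULL, NULL, NULL, NULL, &section);CHKERRQ(ierr);
  ierr = DMSetLocalSection(dm, section);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&section);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}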
Thanks a lot for your time and your help.
-------------------------------------------------------------------
Alexis Marboeuf
Postdoctoral fellow, Department of Mathematics & Statistics
Hamilton Hall room 409B, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
EMail: marboeua at mcmaster.ca
Tel. +1 (905) 525 9140 ext. 27031
-------------------------------------------------------------------
________________________________
From: Alexis Marboeuf <marboeua at mcmaster.ca>
Sent: Monday, December 20, 2021, 3:34 PM
To: Matthew Knepley <knepley at gmail.com>
Cc: petsc-dev at mcs.anl.gov <petsc-dev at mcs.anl.gov>
Subject: RE: [petsc-dev] DMPlexCreateGlobalToNaturalSF and partitioners
Hi Matt,
I created a branch marboeuf/plex-naturaldm starting from knepley/fix-plex-g2n with my minor modifications. It is still incomplete: the Sub/SuperDM code in src/dm/impls/plex/plex.c is not yet updated, and my new example src/dm/impls/plex/tests/ex47.c makes only one call to DMPlexDistribute. But it already reproduces the error I mentioned with the partitioners. I am configuring PETSc with
./configure --download-chaco=1 --download-exodusii=1 --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-netcdf=1 --download-parmetis=1 --download-pnetcdf=1 --download-ptscotch=1 --download-sieve=1 --download-sowing=1 --download-yaml=1 --download-zlib=1 --with-boost-dir=/opt/homebrew/Cellar/boost/1.76.0 --with-boost=1 --with-c2html=0 --with-debugging=1 --with-fortran-datatypes=1 --with-mpi-dir=/opt/homebrew/Cellar/mpich/3.4.2 --with-ranlib=ranlib --with-x11=1
on my MacBook Pro with the Apple M1 Pro chip under the arm64 architecture. I am running
mpiexec -n 2 ../ex47 -petscpartitioner_type parmetis
in $PETSC_DIR/$PETSC_ARCH/tests/dm/impls/plex/tests/runex47_0, which raises the error. The example works fine with -petscpartitioner_type simple. ex44 also works fine, even with the parmetis or ptscotch partitioners. The reason is that I defined 2 fields in the section of ex47, which is, for now, the only difference with the first test of ex44.
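For what it is worth, the distribution path in ex47 looks roughly like the sketch below (not the actual code): calling DMSetUseNatural before DMPlexDistribute is what makes the distribution build the global-to-natural SF seen in the stack trace.

/* Hedged sketch, not the actual ex47.c: distribute a serial DMPlex whose local
   section is already set, asking for the global-to-natural SF to be built.
   It is this path (DMPlexDistribute -> DMPlexCreateGlobalToNaturalSF) that fails. */
static PetscErrorCode DistributeWithNatural(DM dm, DM *dmDist)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMSetUseNatural(dm, PETSC_TRUE);CHKERRQ(ierr);       /* request natural ordering support */
  ierr = DMPlexDistribute(dm, 0, NULL, dmDist);CHKERRQ(ierr); /* overlap 0, discard the migration SF */
  PetscFunctionReturn(0);
}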
I am not allowed to push my branch to the remote repository under my username AlexisMarb, although I have set up an SSH key. As soon as I am able to push it, I will do so, so that you can check it out and test.
Thanks a lot for your time and help.
-------------------------------------------------------------------
Alexis Marboeuf
Postdoctoral fellow, Department of Mathematics & Statistics
Hamilton Hall room 409B, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
EMail: marboeua at mcmaster.ca
Tel. +1 (905) 525 9140 ext. 27031
-------------------------------------------------------------------
________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Friday, December 17, 2021, 5:44 PM
To: Alexis Marboeuf <marboeua at mcmaster.ca>
Cc: petsc-dev at mcs.anl.gov <petsc-dev at mcs.anl.gov>
Subject: Re: [petsc-dev] DMPlexCreateGlobalToNaturalSF and partitioners
On Fri, Dec 17, 2021 at 3:04 PM Alexis Marboeuf <marboeua at mcmaster.ca> wrote:
Dear PETSc Team,
Following merge request !4547, which fixes the Global To Natural map, I am implementing modifications to introduce a "natural" DM (i.e. the DM used for I/O): see the discussion in https://gitlab.com/petsc/petsc/-/merge_requests/4547. I am writing an example for this, very similar to src/dm/impls/plex/tests/ex44.c added by that merge request, running on 2 processors. The idea is to: (i) create a DM (the same distributed 2x5 mesh as in ex44.c); (ii) call DMPlexDistribute multiple times with different partitioners and set one of the DMs as the "natural" DM; and (iii) check whether the "natural" ordering and distribution can be reconstructed correctly from the last DM and the Global To Natural map. However, I am having trouble with at least the parmetis and ptscotch partitioners, which raise an error inside DMPlexCreateGlobalToNaturalSF; the error message is quoted below.
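For illustration, step (iii) could look roughly like the following sketch, under the assumption that the distributed DM was created with DMSetUseNatural so its global-to-natural SF exists; variable names are illustrative and this is not the actual ex47.c.

/* Hedged sketch of step (iii): map a global vector on the distributed DM back
   to the natural ordering so it can be compared with the original distribution. */
Vec            gvec, nvec;
PetscErrorCode ierr;

ierr = DMCreateGlobalVector(dmDist, &gvec);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(dmDist, &nvec);CHKERRQ(ierr); /* will hold values in natural order */
/* ... fill gvec with field values ... */
ierr = DMPlexGlobalToNaturalBegin(dmDist, gvec, nvec);CHKERRQ(ierr);
ierr = DMPlexGlobalToNaturalEnd(dmDist, gvec, nvec);CHKERRQ(ierr);
ierr = VecView(nvec, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
ierr = VecDestroy(&gvec);CHKERRQ(ierr);
ierr = VecDestroy(&nvec);CHKERRQ(ierr);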
The easiest thing to do here is to start a branch from the !4547 branch that I can checkout, and then tell me how to run what you are running.
Thanks,
Matt
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR: Input array needs to be sorted
Invalid argument
[0]PETSC ERROR: Input array needs to be sorted
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.16.1-435-g007f11b901 GIT Date: 2021-12-01 14:31:21 +0000
Petsc Development GIT revision: v3.16.1-435-g007f11b901 GIT Date: 2021-12-01 14:31:21 +0000
[0]PETSC ERROR: ../ex47 on a arch-darwin-c-debug named marboeua-1.math.mcmaster.ca by alexismarboeuf Thu Dec 16 22:39:20 2021
[1]PETSC ERROR: ../ex47 on a arch-darwin-c-debug named marboeua-1.math.mcmaster.ca by alexismarboeuf Thu Dec 16 22:39:20 2021
[1]PETSC ERROR: [0]PETSC ERROR: Configure options --force --download-fblaslapack=1 --download-exodusii=1 --download-hdf5=1 --download-chaco=1 --download-metis=1 --download-parmetis=1 -download-ptscotch=1 --download-sowing=1 --download-hypre=1 --download-ml=1 --download-netcdf=1 --download-yaml=1 --download-zlib=1 --download-pnetcdf=1 --download-sieve=1 --with-boost=1 --with-boost-dir=/opt/homebrew/Cellar/boost/1.76.0 with-clanguage=C++ --with-c2html=0 --with-fortran-datatypes=1 --with-mpi-dir=/opt/homebrew/Cellar/mpich/3.4.2 --with-debugging=1 --with-ranlib=ranlib --with-x11=1
[0]PETSC ERROR: Configure options --force --download-fblaslapack=1 --download-exodusii=1 --download-hdf5=1 --download-chaco=1 --download-metis=1 --download-parmetis=1 -download-ptscotch=1 --download-sowing=1 --download-hypre=1 --download-ml=1 --download-netcdf=1 --download-yaml=1 --download-zlib=1 --download-pnetcdf=1 --download-sieve=1 --with-boost=1 --with-boost-dir=/opt/homebrew/Cellar/boost/1.76.0 with-clanguage=C++ --with-c2html=0 --with-fortran-datatypes=1 --with-mpi-dir=/opt/homebrew/Cellar/mpich/3.4.2 --with-debugging=1 --with-ranlib=ranlib --with-x11=1
[1]PETSC ERROR: #1 PetscSortedRemoveDupsInt() at /Users/alexismarboeuf/Documents/petsc2/src/sys/utils/sorti.c:308
[0]PETSC ERROR: #2 PetscSFCreateEmbeddedLeafSF() at /Users/alexismarboeuf/Documents/petsc2/src/vec/is/sf/interface/sf.c:1409
[0]PETSC ERROR: #1 PetscSortedRemoveDupsInt() at /Users/alexismarboeuf/Documents/petsc2/src/sys/utils/sorti.c:308
[1]PETSC ERROR: #2 PetscSFCreateEmbeddedLeafSF() at /Users/alexismarboeuf/Documents/petsc2/src/vec/is/sf/interface/sf.c:1409
[1]PETSC ERROR: #3 DMPlexCreateGlobalToNaturalSF() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/plexnatural.c:173
[1]PETSC ERROR: #4 DMPlexDistribute() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/plexdistribute.c:1755
[1]PETSC ERROR: #5 main() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/tests/ex47.c:289
[1]PETSC ERROR: PETSc Option Table entries:
[1]PETSC ERROR: -petscpartitioner_type ptscotch
[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
Abort(62) on node 1 (rank 1 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 62) - process 1
#3 DMPlexCreateGlobalToNaturalSF() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/plexnatural.c:173
[0]PETSC ERROR: #4 DMPlexDistribute() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/plexdistribute.c:1755
[0]PETSC ERROR: #5 main() at /Users/alexismarboeuf/Documents/petsc2/src/dm/impls/plex/tests/ex47.c:289
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -petscpartitioner_type ptscotch
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
Abort(62) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
Thanks all for your help.
-------------------------------------------------------------------
Alexis Marboeuf
Postdoctoral fellow, Department of Mathematics & Statistics
Hamilton Hall room 409B, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
EMail: marboeua at mcmaster.ca
Tel. +1 (905) 525 9140 ext. 27031
-------------------------------------------------------------------
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/