[petsc-users] DMForestTransferVec with -petscspace_order 0
Tobin Isaac
tisaac at cc.gatech.edu
Wed Apr 4 08:18:42 CDT 2018
Hi Yann,
On Tue, Apr 03, 2018 at 05:29:39PM +0200, Yann Jobic wrote:
> Hi,
>
> Thanks for the fast answer ! And sorry for my late one...
> As a test, I'm using ex2.c in the test/forest directory.
>
> I'm using two git repositories (master), one from this weekend and one from today. I
> see different behaviors.
>
> For the git from the weekend: v3.8.4-1-g756c7f9
> In 2D, everything is OK (petscspace_order = 0, 1, 2): mpiexec -n 1 ./ex2
> -petscspace_poly_tensor -petscspace_order 2 -dim 2
> In 3D, only -petscspace_order 2 works. For the other ones, I put the full
> message in a log file.
This looks like a commit from the petsc/maint branch. I have not tried to
fix this (a new maintenance release is imminent).
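
For context, what ex2.c exercises here is the basic forest-to-forest
transfer: project a field on one forest, template an adapted forest from
it, and move the field across with DMForestTransferVec. Here is a rough
sketch of that pattern (my reconstruction, not a literal excerpt from
ex2.c; it assumes dm is a p4est/p8est DM that already carries the PetscFE
field configured by the -petscspace_* options, and initFunc is a
placeholder pointwise function):

#include <petscdm.h>
#include <petscdmforest.h>

static PetscErrorCode TransferSketch(DM dm, PetscErrorCode (*initFunc)(PetscInt, PetscReal, const PetscReal[], PetscInt, PetscScalar[], void *))
{
  PetscErrorCode (*funcs[1])(PetscInt, PetscReal, const PetscReal[], PetscInt, PetscScalar[], void *) = {initFunc};
  DM             adapt;
  Vec            u, uAdapt;
  PetscReal      time = 0.0;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Field on the original forest */
  ierr = DMCreateGlobalVector(dm, &u);CHKERRQ(ierr);
  ierr = DMProjectFunction(dm, time, funcs, NULL, INSERT_ALL_VALUES, u);CHKERRQ(ierr);
  /* Forest templated on dm (an adaptivity pattern would normally be set on it here) */
  ierr = DMForestTemplate(dm, PetscObjectComm((PetscObject)dm), &adapt);CHKERRQ(ierr);
  ierr = DMSetUp(adapt);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(adapt, &uAdapt);CHKERRQ(ierr);
  /* On v3.8.4-1-g756c7f9 the 3D order-0 run segfaults inside this call, in
     DMPlexComputeInjectorReferenceTree (see the first appended trace) */
  ierr = DMForestTransferVec(dm, u, adapt, uAdapt, PETSC_TRUE, time);CHKERRQ(ierr);
  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = VecDestroy(&uAdapt);CHKERRQ(ierr);
  ierr = DMDestroy(&adapt);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Since your 2D runs get through all of this on that snapshot, the problem
there looks specific to the 3D reference-tree injector.
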
>
> For the git from today: v3.8.4-2420-g8f4cb0b
> In 2D, petscspace_order is OK for 2 and 1, but does not work for 0.
> In 3D, same as in 2D: it works for petscspace_order 2 and 1, but not for 0.
> (see log file)
> Many thanks for the help!
I have fixed this in the branch `tisaac/fix-height-ds` on the repo.
It does not require any interface changes, so as soon as it is
approved, it will be integrated into the master branch and into the
to-be-released maintenance version v3.8.9.
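
For reference, on that snapshot the order-0 run never reaches the
transfer: per the second appended stack trace, the initial
DMProjectFunction call walks down to PetscFEGetHeightSubspace, which ends
up handing PetscFESetDualSpace a NULL dual space for the P0 field. A
standalone sketch of that failing query (my illustration, not code from
PETSc or ex2.c; it assumes the communicator-based PetscFECreateDefault
signature of this snapshot):

#include <petscfe.h>

int main(int argc, char **argv)
{
  PetscFE        fe, subfe;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* A P0 tensor-product field, as configured by
     -petscspace_poly_tensor -petscspace_order 0 */
  ierr = PetscFECreateDefault(PETSC_COMM_WORLD, 3, 1, PETSC_FALSE, NULL, PETSC_DEFAULT, &fe);CHKERRQ(ierr);
  /* Ask for the trace space on faces (height 1); per the trace this is the
     call that triggered "Null Object: Parameter # 2" in PetscFESetDualSpace */
  ierr = PetscFEGetHeightSubspace(fe, 1, &subfe);CHKERRQ(ierr);
  ierr = PetscFEDestroy(&fe);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

That height-subspace path is what the branch above addresses.
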
Cheers,
Toby
>
> Regards,
>
> Yann
>
>
> On 03/04/2018 at 03:33, Tobin Isaac wrote:
> > Hi Yann,
> >
> > Thanks for pointing this out to us. Matt and I are the two most
> > actively developing in this area. We have been working on separate
> > threads and this looks like an issue where we need to sync up. I
> > think there is a simple fix, but it would be helpful to know which
> > version of petsc you're working from: I'm seeing different
> > behavior. Could you please send along more complete output?
> >
> > Cheers,
> > Toby
> >
> > On Mon, Apr 02, 2018 at 04:42:29PM +0200, yann JOBIC wrote:
> > > Hi,
> > >
> > > I'm using DMForestTransferVec, as in "dm/impls/forest/examples/tests/ex2.c".
> > >
> > > I would like to use it with a space approximation order of zero
> > > (-petscspace_order 0). However, in this case it's not working (valgrind
> > > output of ex2.c from the forest tests):
> > >
> > > ==8604== Conditional jump or move depends on uninitialised value(s)
> > > ==8604== at 0x47D74F: DMPlexVecGetClosure (plex.c:4035)
> > > ==8604== by 0x557612: DMPlexLocatePoint_General_3D_Internal
> > > (plexgeometry.c:153)
> > > ==8604== by 0x559B85: DMPlexLocatePoint_Internal (plexgeometry.c:383)
> > > ==8604== by 0x611EED: DMPlexComputeInjectorReferenceTree
> > > (plextree.c:3247)
> > > ==8604== by 0x6148DB: DMPlexReferenceTreeGetInjector (plextree.c:3454)
> > > ==8604== by 0x61EAD8: DMPlexTransferVecTree_Inject (plextree.c:4319)
> > > ==8604== by 0x620CC1: DMPlexTransferVecTree (plextree.c:4527)
> > > ==8604== by 0x7F23D8: DMForestTransferVec_p8est (pforest.c:4239)
> > > ==8604== by 0x429B48: DMForestTransferVec (forest.c:985)
> > > ==8604== by 0x40BB8A: main (transf_test.c:136)
> > >
> > > With the error message:
> > >
> > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> > > probably memory access out of range
> > >
> > > It's working fine in 2D (p4est). The problem arises in 3D (p8est).
> > >
> > > Is this expected behavior? Am I doing something wrong?
> > >
> > > Thanks in advance,
> > >
> > > Yann
> > >
>
> --
> ___________________________
>
> Yann JOBIC
> HPC engineer
> IUSTI-CNRS UMR 7343 - Polytech Marseille
> Technopôle de Château Gombert
> 5 rue Enrico Fermi
> 13453 Marseille cedex 13
> Tel : (33) 4 91 10 69 43
> Fax : (33) 4 91 10 69 69
>
>
>
> yann@crabe:~/projet/AMR/moulinette$ mpiexec -n 1 ./ex2 -petscspace_poly_tensor -petscspace_order 0 -dim 3
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] DMPlexVecGetClosure line 4020 /home/yann/src/petsc-git/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: [0] DMPlexLocatePoint_General_3D_Internal line 150 /home/yann/src/petsc-git/src/dm/impls/plex/plexgeometry.c
> [0]PETSC ERROR: [0] DMPlexLocatePoint_Internal line 361 /home/yann/src/petsc-git/src/dm/impls/plex/plexgeometry.c
> [0]PETSC ERROR: [0] DMPlexComputeInjectorReferenceTree line 3008 /home/yann/src/petsc-git/src/dm/impls/plex/plextree.c
> [0]PETSC ERROR: [0] DMPlexReferenceTreeGetInjector line 3449 /home/yann/src/petsc-git/src/dm/impls/plex/plextree.c
> [0]PETSC ERROR: [0] DMPlexTransferVecTree_Inject line 4309 /home/yann/src/petsc-git/src/dm/impls/plex/plextree.c
> [0]PETSC ERROR: [0] DMPlexTransferVecTree line 4486 /home/yann/src/petsc-git/src/dm/impls/plex/plextree.c
> [0]PETSC ERROR: [0] DMForestTransferVec_p8est line 4213 /home/yann/src/petsc-git/src/dm/impls/forest/p4est/pforest.c
> [0]PETSC ERROR: [0] DMForestTransferVec line 978 /home/yann/src/petsc-git/src/dm/impls/forest/forest.c
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.8.4, unknown
> [0]PETSC ERROR: ./ex2 on a depot-git named crabe by yann Tue Apr 3 17:18:02 2018
> [0]PETSC ERROR: Configure options --prefix=/local/lib/petsc/depot-git/gcc/openmpi_gcc_all --with-single-library=0 --with-debugging=1 --download-scalapack=1 --download-metis=1 --download-parmetis=1 --download-ptscotch=1 --download-mumps=1 --download-hypre=1 --download-superlu=1 --download-superlu_dist=1 --download-fblaslapack=1 --download-metis=1 --download-cmake=1 --download-ml=1 --download-p4est=1 --download-netcdf=1 --download-pragmatic=1 --download-eigen=1 --download-parms=1 --download-triangle=1 --download-hdf5=1 --with-zlib=1 --download-szlib=1 --download-chaco=1 --download-spai=1 --download-suitesparse=1 --with-shared-libraries=0 PETSC_ARCH=depot-git
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> -------------------------------------------------------
> Primary job terminated normally, but 1 process returned
> a non-zero exit code.. Per user-direction, the job has been aborted.
> -------------------------------------------------------
> --------------------------------------------------------------------------
> mpiexec detected that one or more processes exited with non-zero status, thus causing
> the job to be terminated. The first process to do so was:
>
> Process name: [[36533,1],0]
> Exit code: 59
> --------------------------------------------------------------------------
> jobic@stargate:~/projet/AMR/moulinette$ mpirun -np 1 ./ex2 -petscspace_poly_tensor -dim 3 -petscspace_order 0
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 2
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.4-2420-g8f4cb0b GIT Date: 2018-04-02 16:00:52 -0500
> [0]PETSC ERROR: ./ex2 on a named stargate by jobic Tue Apr 3 17:22:04 2018
> [0]PETSC ERROR: Configure options --download-chaco=1 --download-cmake=1 --download-eigen=1 --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 --download-p4est=1 --download-parmetis=1 --download-parms=1 --download-pragmatic=1 --download-ptscotch=1 --download-scalapack=1 --download-spai=1 --download-suitesparse=1 --download-superlu=1 --download-superlu_dist=1 --download-szlib=1 --download-triangle=1 --prefix=/local/lib/petsc/git/gcc/openmpi_gcc_all --with-debugging=1 --with-shared-libraries=0 --with-single-library=0 --with-zlib=1 PETSC_ARCH=openmpi_gcc_all
> [0]PETSC ERROR: #1 PetscFESetDualSpace() line 3615 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtfe.c
> [0]PETSC ERROR: #2 PetscFEGetHeightSubspace() line 6630 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtfe.c
> [0]PETSC ERROR: #3 PetscDSGetHeightSubspace() line 2880 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtds.c
> [0]PETSC ERROR: #4 DMProjectLocal_Generic_Plex() line 325 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/plex/plexproject.c
> [0]PETSC ERROR: #5 DMProjectFunctionLocal_Plex() line 428 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/plex/plexproject.c
> [0]PETSC ERROR: #6 DMProjectFunctionLocal() line 6265 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #7 DMProjectFunctionLocal_p8est() line 4385 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/forest/p4est/pforest.c
> [0]PETSC ERROR: #8 DMProjectFunctionLocal() line 6265 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #9 DMProjectFunction() line 6250 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #10 main() line 161 in /home/jobic/projet/AMR/moulinette/ex2.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -dim 3
> [0]PETSC ERROR: -petscspace_order 0
> [0]PETSC ERROR: -petscspace_poly_tensor
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 85.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
>
>
>
> With valgrind :
> jobic@stargate:~/projet/AMR/moulinette$ valgrind mpirun -np 1 ./ex2 -petscspace_poly_tensor -dim 3 -petscspace_order 0
> ==9730== Memcheck, a memory error detector
> ==9730== Copyright (C) 2002-2015, and GNU GPL'd, by Julian Seward et al.
> ==9730== Using Valgrind-3.12.0 and LibVEX; rerun with -h for copyright info
> ==9730== Command: mpirun -np 1 ./ex2 -petscspace_poly_tensor -dim 3 -petscspace_order 0
> ==9730==
> ==10212== Warning: invalid file descriptor 1024 in syscall close()
> ==10212== Warning: invalid file descriptor 1025 in syscall close()
> ==10212== Warning: invalid file descriptor 1026 in syscall close()
> ==10212== Warning: invalid file descriptor 1027 in syscall close()
> ==10212== Use --log-fd=<number> to select an alternative log fd.
> ==10212== Warning: invalid file descriptor 1028 in syscall close()
> ==10212== Warning: invalid file descriptor 1029 in syscall close()
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 2
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.4-2420-g8f4cb0b GIT Date: 2018-04-02 16:00:52 -0500
> [0]PETSC ERROR: ./ex2 on a named stargate by jobic Tue Apr 3 17:27:58 2018
> [0]PETSC ERROR: Configure options --download-chaco=1 --download-cmake=1 --download-eigen=1 --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 --download-p4est=1 --download-parmetis=1 --download-parms=1 --download-pragmatic=1 --download-ptscotch=1 --download-scalapack=1 --download-spai=1 --download-suitesparse=1 --download-superlu=1 --download-superlu_dist=1 --download-szlib=1 --download-triangle=1 --prefix=/local/lib/petsc/git/gcc/openmpi_gcc_all --with-debugging=1 --with-shared-libraries=0 --with-single-library=0 --with-zlib=1 PETSC_ARCH=openmpi_gcc_all
> [0]PETSC ERROR: #1 PetscFESetDualSpace() line 3615 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtfe.c
> [0]PETSC ERROR: #2 PetscFEGetHeightSubspace() line 6630 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtfe.c
> [0]PETSC ERROR: #3 PetscDSGetHeightSubspace() line 2880 in /home/devel/src_linux/git.petsc/petsc/src/dm/dt/interface/dtds.c
> [0]PETSC ERROR: #4 DMProjectLocal_Generic_Plex() line 325 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/plex/plexproject.c
> [0]PETSC ERROR: #5 DMProjectFunctionLocal_Plex() line 428 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/plex/plexproject.c
> [0]PETSC ERROR: #6 DMProjectFunctionLocal() line 6265 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #7 DMProjectFunctionLocal_p8est() line 4385 in /home/devel/src_linux/git.petsc/petsc/src/dm/impls/forest/p4est/pforest.c
> [0]PETSC ERROR: #8 DMProjectFunctionLocal() line 6265 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #9 DMProjectFunction() line 6250 in /home/devel/src_linux/git.petsc/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #10 main() line 161 in /home/jobic/projet/AMR/moulinette/ex2.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -dim 3
> [0]PETSC ERROR: -petscspace_order 0
> [0]PETSC ERROR: -petscspace_poly_tensor
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 85.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> ==9730==
> ==9730== HEAP SUMMARY:
> ==9730== in use at exit: 55,755 bytes in 116 blocks
> ==9730== total heap usage: 19,779 allocs, 19,663 frees, 6,633,193 bytes allocated
> ==9730==
> ==9730== LEAK SUMMARY:
> ==9730== definitely lost: 2,675 bytes in 11 blocks
> ==9730== indirectly lost: 48,765 bytes in 75 blocks
> ==9730== possibly lost: 200 bytes in 1 blocks
> ==9730== still reachable: 4,115 bytes in 29 blocks
> ==9730== suppressed: 0 bytes in 0 blocks
> ==9730== Rerun with --leak-check=full to see details of leaked memory
> ==9730==
> ==9730== For counts of detected and suppressed errors, rerun with: -v
> ==9730== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)