[petsc-dev] HDF5 error from GAMG

Barry Smith bsmith at mcs.anl.gov
Thu Sep 21 13:48:29 CDT 2017


  Mark,

   It doesn't look like this is a PETSc example (in a PETSc branch). If you can send me full instructions for building and running a code that shows this behavior, please send all the info so I can run it and debug it.

   Barry

Otherwise it is just speculation city.

> On Sep 18, 2017, at 7:44 PM, Mark Adams <mfadams at lbl.gov> wrote:
> 
> I get this strange error when I use GAMG in parallel. I don't see it in serial, and I don't see it with the default solver.
> 
> The problem seems to be that I use -vec_view in my code, and the guts of Vec pick it up and cause havoc. In the past I worked around this by adding a prefix to my vec_view option (e.g., -x2_vec_view) and that seemed to work, but this time I would like to actually fix it rather than sweep it under the carpet again, if possible.
> 
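> A minimal sketch of that prefix workaround (the names and sizes below are illustrative, not my actual code): give the application Vec its own options prefix so the bare -vec_view never matches it.
> 
>     #include <petscvec.h>
>     Vec            x;
>     PetscErrorCode ierr;
>     PetscInt       n = 961;                   /* global size, just for illustration */
>     ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
>     /* this Vec now answers only to -x2_vec_view, not the bare -vec_view */
>     ierr = VecSetOptionsPrefix(x,"x2_");CHKERRQ(ierr);
>     ierr = VecSetSizes(x,PETSC_DECIDE,n);CHKERRQ(ierr);
>     ierr = VecSetFromOptions(x);CHKERRQ(ierr);
>     /* ... fill x ... */
>     ierr = VecAssemblyBegin(x);CHKERRQ(ierr);
>     ierr = VecAssemblyEnd(x);CHKERRQ(ierr);   /* views only if -x2_vec_view is given */
> 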
> It looks like VecAssemblyEnd, called from the guts of GAMG, ends up calling PetscObjectViewFromOptions. I don't understand why this only happens in parallel.
> 
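> If I am reading the stack below correctly, the tail of VecAssemblyEnd does roughly this (a paraphrase, not the exact PETSc source):
> 
>     PetscErrorCode VecAssemblyEnd(Vec vec)
>     {
>       PetscErrorCode ierr;
>       ...
>       /* every assembled Vec consults the options database for a matching
>          -[prefix]vec_view; GAMG's internal work vectors carry no prefix,
>          so the bare -vec_view hdf5:sol.h5::append matches and each rank
>          tries to reopen sol.h5 for append in the middle of the solve */
>       ierr = PetscObjectViewFromOptions((PetscObject)vec,NULL,"-vec_view");CHKERRQ(ierr);
>       PetscFunctionReturn(0);
>     }
> 
> So any Vec assembled anywhere, including the ones GAMG builds internally, will try to honor the unprefixed -vec_view.
> 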
> Any ideas?
> Thanks,
> 
> 20:27 master= ~/Codes/picell/src$ make run NP=4
> /Users/markadams/Codes/petsc/arch-macosx-gnu-g/bin/mpiexec -n 4 ./picell.arch-macosx-gnu-g -petscspace_poly_tensor -petscspace_order 1 -mstep 2 -debug 3 -snes_monitor -ksp_monitor -snes_converged_reason -dm_refine 4 -dt .1 -use_bsp 10 -num_particles_proc -1 -num_phi_cells 8 -dm_view hdf5:sol.h5 -vec_view hdf5:sol.h5::append -dm_plex_periodic_cut
> [0] **** Warning ****, no global point location. multiple processors (4) not supported
> [0] npe=4; 2 x 2 x 1 flux tube grid; mpi_send size (chunksize) has 64 particles. ions only, BSP communication, phi section = 6.28319
>                 createMesh ntheta_total=4.
> [0] 961 equations on 4 processors, 256 local cells, (element 0 used for flux tube list)
> DM Object: Parallel Mesh 4 MPI processes
>   type: plex
> Parallel Mesh in 2 dimensions:
>   0-cells: 289 289 289 289
>   1-cells: 544 544 544 544
>   2-cells: 256 256 256 256
> Labels:
>   boundary: 1 strata with value/size (1 (65))
>   Face Sets: 2 strata with value/size (1 (31), 4 (31))
>   marker: 1 strata with value/size (1 (65))
>   depth: 3 strata with value/size (0 (289), 1 (544), 2 (256))
>                 [0]shiftParticles: BSP have 1 for proc 0
>                         [0]shiftParticles: BSP have 1 particles for proc 0 (1 chunks)
>                 [0]shiftParticles: BSP sent 1
>                 [2]shiftParticles: BSP sent 0
>                 [3]shiftParticles: BSP sent 0
>                 [1]shiftParticles: BSP sent 0
> 0) 1 local particles, 1/1 global, 0. % total particles moved in 0 messages total (to 1 processors local), 4. load imbalance factor
>   0 SNES Function norm 6.176014522980e-01 
>     0 KSP Residual norm 2.033453504856e+00 
>     1 KSP Residual norm 1.247246018913e-01 
>     2 KSP Residual norm 7.169223999741e-03 
>     3 KSP Residual norm 3.494127469709e-04 
>     4 KSP Residual norm 1.539576656127e-05 
>   1 SNES Function norm 7.013048909984e-06 
>     0 KSP Residual norm 1.539576656641e-05 
>     1 KSP Residual norm 1.233278209501e-06 
>     2 KSP Residual norm 6.474128793205e-08 
>     3 KSP Residual norm 3.824829808334e-09 
>     4 KSP Residual norm 1.813659764651e-10 
>     5 KSP Residual norm 3.684110934970e-12 
>   2 SNES Function norm 2.386983413511e-12 
> Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 2
>                 [0]shiftParticles: BSP have 0 for proc 0
>                 [0]shiftParticles: BSP have 1 for proc 2
>                         [0]shiftParticles: BSP have 1 particles for proc 2 (1 chunks)
>                 [1]shiftParticles: BSP sent 0
>                 [2]shiftParticles: BSP sent 0
>                 [3]shiftParticles: BSP sent 0
>                 [0]shiftParticles: BSP sent 1
> 1) 1 local particles, 1/1 global, 100. % total particles moved in 0 messages total (to 2 processors local), 4. load imbalance factor
>   0 SNES Function norm 6.828943707479e-01 
> HDF5-DIAG: Error detected in HDF5 (1.8.18) HDF5-DIAG: Error detected in HDF5 (1.8.18) MPI-process 3:
>   #000: H5F.c line 604 in H5Fopen(): unable to open file
>     major: File accessibilty
>     minor: Unable to open file
>   #001: H5Fint.c line 1087 in H5F_open(): unable to read superblock
>     major: File accessibilty
>     minor: Read failed
>   #002: H5Fsuper.c line 294 in H5F_super_read(): unable to load superblock
>     major: Object cache
>     minor: Unable to protect metadata
>   #003: H5AC.c line 1262 in H5AC_protect(): H5C_protect() failed.
>     major: Object cache
>     minor: Unable to protect metadata
>   #004: H5C.c line 3574 in H5C_protect(): can't load entry
>     major: Object cache
>     minor: Unable to load metadata into cache
>   #005: H5C.c line 7954 in H5C_load_entry(): unable to load entry
>     major: Object cache
> HDF5-DIAG: Error detected in HDF5 (1.8.18) MPI-process 1:
>   #000: H5F.c line 604 in H5Fopen(): unable to open file
>     major: File accessibilty
>     minor: Unable to open file
>   #001: H5Fint.c line 1087 in H5F_open(): unable to read superblock
>     major: File accessibilty
>     minor: Read failed
>   #002: H5Fsuper.c line 294 in H5F_super_read(): unable to load superblock
>     major: Object cache
>     minor: Unable to protect metadata
>   #003: H5AC.c line 1262 in H5AC_protect(): H5C_protect() failed.
>     major: Object cache
>     minor: Unable to protect metadata
>   #004: H5C.c line 3574 in H5C_protect(): can't load entry
>     major: Object cache
>     minor: Unable to load metadata into cache
>   #005: H5C.c line 7954 in H5C_load_entry(): unable to load entry
>     major: Object cache
>     minor: Unable to load metadata into cache
>   #006: H5Fsuper_cache.c line 476 in H5F_sblock_load(): truncated file: eof = 365072, sblock->base_addr = 0, stored_eoa = 367568
>     major: File accessibilty
>     minor: File has been truncated
>     minor: Unable to load metadata into cache
>   #006: H5Fsuper_cache.c line 476 in H5F_sblock_load(): truncated file: eof = 365072, sblock->base_addr = 0, stored_eoa = 367568
>     major: File accessibilty
>     minor: File has been truncated
> 
> [snip]
> 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Error in external library
> [0]PETSC ERROR: Error in HDF5 call H5Fopen() Status -1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.6-5541-gd07ad3ccba  GIT Date: 2017-09-17 13:49:32 -0400
> [0]PETSC ERROR: ./picell.arch-macosx-gnu-g on a arch-macosx-gnu-g named MarksMac-5.local by markadams Mon Sep 18 20:27:26 2017
> [0]PETSC ERROR: Configure options --with-cc=clang --with-cc++=clang++ COPTFLAGS="-g -mavx2" CXXOPTFLAGS="-g -mavx2" FOPTFLAGS="-g -mavx2" --download-mpich=1 --download-hypre=1 --download-metis=1 --download-parmetis=1 --download-c2html=1 --download-ctetgen --download-p4est=1 --download-superlu_dist --download-superlu --download-triangle=1 --download-hdf5=1 --download-zlib --with-x=0 --with-debugging=1 PETSC_ARCH=arch-macosx-gnu-g --download-chaco --with-viewfromoptions=1
> [0]PETSC ERROR: #1 PetscViewerFileSetName_HDF5() line 242 in /Users/markadams/Codes/petsc/src/sys/classes/viewer/impls/hdf5/hdf5v.c
> [0]PETSC ERROR: #2 PetscViewerFileSetName() line 650 in /Users/markadams/Codes/petsc/src/sys/classes/viewer/impls/ascii/filev.c
> [0]PETSC ERROR: #3 PetscOptionsGetViewer() line 339 in /Users/markadams/Codes/petsc/src/sys/classes/viewer/interface/viewreg.c
> [0]PETSC ERROR: #4 PetscObjectViewFromOptions() line 2773 in /Users/markadams/Codes/petsc/src/sys/objects/options.c
> [0]PETSC ERROR: #5 VecAssemblyEnd() line 182 in /Users/markadams/Codes/petsc/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #6 PCGAMGGetDataWithGhosts() line 376 in /Users/markadams/Codes/petsc/src/ksp/pc/impls/gamg/util.c
> [0]PETSC ERROR: #7 PCGAMGProlongator_AGG() line 1071 in /Users/markadams/Codes/petsc/src/ksp/pc/impls/gamg/agg.c
> [0]PETSC ERROR: #8 PCSetUp_GAMG() line 519 in /Users/markadams/Codes/petsc/src/ksp/pc/impls/gamg/gamg.c
> [0]PETSC ERROR: #9 PCSetUp() line 924 in /Users/markadams/Codes/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #10 KSPSetUp() line 378 in /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #11 KSPSolve() line 609 in /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #12 SNESSolve_NEWTONLS() line 224 in /Users/markadams/Codes/petsc/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #13 SNESSolve() line 4106 in /Users/markadams/Codes/petsc/src/snes/interface/snes.c
> [0]PETSC ERROR: #14 DMPICellSolve() line 18 in /Users/markadams/Codes/petsc/src/snes/utils/dmpicellsnes.c
> [0]PETSC ERROR: #15 go() line 1120 in /Users/markadams/Codes/picell/src/main.c
> [0]PETSC ERROR: #16 main() line 1304 in /Users/markadams/Codes/picell/src/main.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -debug 3
> [0]PETSC ERROR: -dm_plex_periodic_cut
> [0]PETSC ERROR: -dm_refine 4
> [0]PETSC ERROR: -dm_view hdf5:sol.h5
> [0]PETSC ERROR: -dt .1
> [0]PETSC ERROR: -ksp_monitor
> [0]PETSC ERROR: -ksp_type cg
> [0]PETSC ERROR: -mstep 2
> [0]PETSC ERROR: -num_particles_proc -1
> [0]PETSC ERROR: -num_phi_cells 8
> [0]PETSC ERROR: -options_left
> [0]PETSC ERROR: -pc_type gamg
> [0]PETSC ERROR: -petscspace_order 1
> [0]PETSC ERROR: -petscspace_poly_tensor
> [0]PETSC ERROR: -snes_converged_reason
> [0]PETSC ERROR: -snes_monitor
> [0]PETSC ERROR: -use_bsp 10
> [0]PETSC ERROR: -vec_view hdf5:sol.h5::append
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
> 


