[petsc-dev] Possible SF bug

Matthew Knepley knepley at gmail.com
Mon Mar 29 14:16:11 CDT 2021


Junchao,

I have an SF problem, which I think is a caching bug, but it is hard to see
what is happening in the internals, so I have made a small example that
should show you what is wrong. It is attached.

If you run without arguments, you get

master *:~/Downloads/tmp/Salac$ ./forestHDF
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer
[0]PETSC ERROR: Trying to copy to a null pointer
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.14.5-879-g03cacdc99d
 GIT Date: 2021-03-22 01:02:08 +0000
[0]PETSC ERROR: ./forestHDF on a arch-master-debug named
MacBook-Pro.fios-router.home by knepley Mon Mar 29 15:14:16 2021
[0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
--download-bamg --download-chaco --download-ctetgen --download-egads
--download-eigen --download-exodusii --download-fftw --download-hpddm
--download-libpng --download-metis --download-ml --download-mumps
--download-netcdf --download-opencascade --download-p4est
--download-parmetis --download-pnetcdf --download-scalapack
--download-slepc --download-suitesparse --download-superlu_dist
--download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake
--with-ctest-exec=/PETSc3/petsc/apple/bin/ctest
--with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple
--with-shared-libraries --with-slepc --with-zlib --download-tetgen
[0]PETSC ERROR: #1 PetscMemcpy() at
/PETSc3/petsc/petsc-dev/include/petscsys.h:1798
[0]PETSC ERROR: #2 UnpackAndInsert_PetscReal_1_1() at
/PETSc3/petsc/petsc-dev/src/vec/is/sf/impls/basic/sfpack.c:426
[0]PETSC ERROR: #3 ScatterAndInsert_PetscReal_1_1() at
/PETSc3/petsc/petsc-dev/src/vec/is/sf/impls/basic/sfpack.c:426
[0]PETSC ERROR: #4 PetscSFLinkScatterLocal() at
/PETSc3/petsc/petsc-dev/src/vec/is/sf/impls/basic/sfpack.c:1248
[0]PETSC ERROR: #5 PetscSFBcastBegin_Basic() at
/PETSc3/petsc/petsc-dev/src/vec/is/sf/impls/basic/sfbasic.c:193
[0]PETSC ERROR: #6 PetscSFBcastWithMemTypeBegin() at
/PETSc3/petsc/petsc-dev/src/vec/is/sf/interface/sf.c:1493
[0]PETSC ERROR: #7 DMGlobalToLocalBegin() at
/PETSc3/petsc/petsc-dev/src/dm/interface/dm.c:2565
[0]PETSC ERROR: #8 VecView_Plex_HDF5_Internal() at
/PETSc3/petsc/petsc-dev/src/dm/impls/plex/plexhdf5.c:251
[0]PETSC ERROR: #9 VecView_Plex() at
/PETSc3/petsc/petsc-dev/src/dm/impls/plex/plex.c:385
[0]PETSC ERROR: #10 VecView_p4est() at
/PETSc3/petsc/petsc-dev/src/dm/impls/forest/p4est/pforest.c:4922
[0]PETSC ERROR: #11 VecView() at
/PETSc3/petsc/petsc-dev/src/vec/vec/interface/vector.c:613
[0]PETSC ERROR: #12 main() at
/Users/knepley/Downloads/tmp/Salac/forestHDF.c:53
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -malloc_debug
[0]PETSC ERROR: ----------------End of Error Message -------send entire
error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 53001) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=53001

If you run with

  ./forestHDF -write_early

or

  ./forestHDF -no_g2l

then it runs fine. So it appears to me that if you run a G2L at the wrong
time, something is cached incorrectly.
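For reference, a sketch of the sequence I believe triggers it, reconstructed
only from the stack trace above (the DM setup is elided, and the option and
file names here are illustrative, not necessarily those in the attached
forestHDF.c):

```c
/* Hypothetical sketch of the failing sequence; reconstructed from the
   stack trace, not copied from the attached forestHDF.c. */
#include <petsc.h>

int main(int argc, char **argv)
{
  DM             dm;
  Vec            g, l;
  PetscViewer    viewer;
  PetscBool      noG2L = PETSC_FALSE;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = PetscOptionsGetBool(NULL, NULL, "-no_g2l", &noG2L, NULL);CHKERRQ(ierr);

  /* ... create a DMForest (p4est) dm and a global vector g here ... */

  if (!noG2L) {
    /* the early G2L that seems to leave the SF in a bad cached state */
    ierr = DMGetLocalVector(dm, &l);CHKERRQ(ierr);
    ierr = DMGlobalToLocalBegin(dm, g, INSERT_VALUES, l);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(dm, g, INSERT_VALUES, l);CHKERRQ(ierr);
    ierr = DMRestoreLocalVector(dm, &l);CHKERRQ(ierr);
  }
  /* the HDF5 write then dies inside another G2L under VecView() */
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "forest.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = VecView(g, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```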

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: forestHDF.c
Type: application/octet-stream
Size: 2475 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20210329/7e1ae22f/attachment.obj>

