[petsc-users] How to run snes ex12 with petsc-3.4.4
Zhang
zyzhang at nuaa.edu.cn
Tue May 20 07:31:11 CDT 2014
Dear All,
I am trying out the PetscFEM solver with petsc-3.4.4, but whenever I run snes/ex12 I get run-time errors:
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function is given.
[0]PETSC ERROR: [0] DMPlexProjectFunctionLocal line 230 /home/zhenyu/petsc-3.4.4/src/dm/impls/plex/plexfem.c
[0]PETSC ERROR: [0] DMPlexProjectFunction line 338 /home/zhenyu/petsc-3.4.4/src/dm/impls/plex/plexfem.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.4, Mar, 13, 2014
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ex12 on a arch-linux2-c-opt named toshiba by zhenyu Tue May 20 20:26:56 2014
[0]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.4/arch-linux2-c-opt/lib
[0]PETSC ERROR: Configure run at Mon May 19 23:24:37 2014
[0]PETSC ERROR: Configure options --download-cmake=1 --download-fblaslapack=1 --download-f2cblaslapack=1 --download-fftw=1 --download-ptscotch=1 --download-ctetgen=1 --download-petsc4py=1 --download-ml=1 --download-parmetis=1 --download-metis=1 --download-superlu_dist=1 --download-hypre=1 --download-c2html=1 --download-generator=1 --download-fiat=1 --download-scientificpython=1 --download-sowing=1 --download-triangle=1 --download-chaco=1 --download-boost=1 --download-exodusii=1 --download-netcdf=1 --download-netcdf-shared=1 --download-hdf5=1 --download-moab-shared=1 --download-suitesparse=1 --with-mpi-dir=/home/zhenyu/deps/openmpi-1.6.5 --with-pthread=1 --with-valgrind=1
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
(Rank 1 printed an identical trace, interleaved with the above; it is omitted here for readability.)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 3027 on
node toshiba exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[toshiba:03025] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[toshiba:03025] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
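The error output itself points at the two standard next steps: run under valgrind to find the bad memory access, or let PETSc attach a debugger at the point of failure. A minimal sketch of both invocations, assuming the same 2-process launch as above (verify the valgrind path and options against your own system):

```shell
# Route 1: run both ranks under valgrind's memcheck, as the PETSc FAQ
# suggests; -malloc off keeps PETSc's own malloc tracing from drowning
# out valgrind's report:
mpirun -np 2 valgrind --tool=memcheck -q ./ex12 -malloc off

# Route 2: have PETSc attach a debugger on the failing rank,
# as recommended in the [0]PETSC ERROR output:
mpirun -np 2 ./ex12 -on_error_attach_debugger
```

Either route should turn the bare SEGV into a usable stack trace inside DMPlexProjectFunctionLocal.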
Well, to get a clean compile, I made two corrections to ex12.c:
Line 195: options->fem.bcFuncs = (void (**)(const PetscReal[], PetscScalar *)) &options->exactFuncs;
Line 574: void (*initialGuess[numComponents])(const PetscReal x[],PetscScalar* u);
and then generated ex12.h with:
PETSC_DIR=$HOME/petsc-3.4.4
DIM=2
ORDER=1
CASE=ex12
$PETSC_DIR/bin/pythonscripts/PetscGenerateFEMQuadrature.py \
$DIM $ORDER $DIM 1 laplacian \
$DIM $ORDER $DIM 1 boundary \
$PETSC_DIR/src/snes/examples/tutorials/$CASE.h
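With the header generated, my build-and-run sequence looks like the sketch below. The run-time options are my guesses from the example's source, not something I have confirmed works; `./ex12 -help` should list the real set for this build:

```shell
# Build the tutorial example in place using PETSc's own makefile:
cd $PETSC_DIR/src/snes/examples/tutorials
make ex12

# Try a single-process run first with options I *believe* ex12 accepts
# (these names are assumptions -- check them against ./ex12 -help):
./ex12 -dim 2 -bc_type dirichlet -snes_monitor

# Then the parallel run that produced the SEGV above:
mpirun -np 2 ./ex12 -dim 2 -bc_type dirichlet -snes_monitor
```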
Since I have not yet fully mastered the mechanism of PetscFEM,
could anyone show me a proper way to run this demo? Many thanks.
Zhenyu