<div dir="ltr"><div><div><div><div><div>Hello Nicolas, <br></div>I ran snes/examples/tutorials/ex1f -snes_type fas with a recent version (3.7.6) and I confirm the problem. <br></div>The C version works fine, but the Fortran version complains about a Fortran callback problem. <br></div>My output looks quite similar to yours ... <br></div>Best regards,<br></div>Natacha <br><div><div><div><br>mpirun ex1f -snes_type fas<br>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: Corrupt argument: <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>[0]PETSC ERROR: Fortran callback not set on this object<br>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 <br>[0]PETSC ERROR: ex1f on a linux-opt-mumps-ml-hypre named dsp0780444 by H03755 Wed Jun 7 10:42:55 2017<br>[0]PETSC ERROR: Configure options --with-mpi=1 --with-debugging=0 --with-mumps-lib="-L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Mumps-511_consortium_aster/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Parmetis_aster-403_aster/lib -lparmetis -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Scotch_aster-604_aster6/MPI/lib -lptscotch -lptscotcherr -lptscotcherrexit -lptscotchparmetis -lesmumps -lscotch -lscotcherr -lscotcherrexit -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Metis_aster-510_aster1/lib -lmetis" --with-mumps-include=/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Mumps-511_consortium_aster/MPI/include --download-hypre=/home/H03755/Librairies/hypre-2.11.1.tar.gz --download-ml=/home/H03755/Librairies/petsc-pkg-ml-e5040d11aa07.tar.gz --with-openmp=0 
--with-scalapack-lib="-lscalapack-openmpi -lblacs-openmpi -lblacsF77init-openmpi -lblacsCinit-openmpi" --with-blas-lapack-lib="-llapack -lopenblas" --PETSC_ARCH=linux-opt-mumps-ml-hypre LIBS=-lgomp --prefix=/home/H03755/local/petsc/petsc-3.7.6<br>[0]PETSC ERROR: #1 PetscObjectGetFortranCallback() line 263 in /home/H03755/Librairies/petsc-3.7.6/src/sys/objects/inherit.c<br>[0]PETSC ERROR: #2 oursnesjacobian() line 105 in /home/H03755/Librairies/petsc-3.7.6/src/snes/interface/ftn-custom/zsnesf.c<br>[0]PETSC ERROR: #3 SNESComputeJacobian() line 2312 in /home/H03755/Librairies/petsc-3.7.6/src/snes/interface/snes.c<br>[0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /home/H03755/Librairies/petsc-3.7.6/src/snes/impls/ls/ls.c<br>[0]PETSC ERROR: #5 SNESSolve() line 4005 in /home/H03755/Librairies/petsc-3.7.6/src/snes/interface/snes.c<br>[0]PETSC ERROR: #6 SNESFASDownSmooth_Private() line 512 in /home/H03755/Librairies/petsc-3.7.6/src/snes/impls/fas/fas.c<br>[0]PETSC ERROR: #7 SNESFASCycle_Multiplicative() line 816 in /home/H03755/Librairies/petsc-3.7.6/src/snes/impls/fas/fas.c<br>[0]PETSC ERROR: #8 SNESSolve_FAS() line 987 in /home/H03755/Librairies/petsc-3.7.6/src/snes/impls/fas/fas.c<br>[0]PETSC ERROR: #9 SNESSolve() line 4005 in /home/H03755/Librairies/petsc-3.7.6/src/snes/interface/snes.c<br>Number of SNES iterations = 0<br>************************************************************************************************************************<br>*** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document ***<br>************************************************************************************************************************<br><br>---------------------------------------------- PETSc Performance Summary: ----------------------------------------------<br><br>ex1f on a linux-opt-mumps-ml-hypre named dsp0780444 with 1 processor, by H03755 Wed Jun 7 10:42:55 2017<br>Using Petsc Release Version 3.7.6, Apr, 24, 2017 <br><br> Max Max/Min Avg Total <br>Time (sec): 7.548e-03 1.00000 7.548e-03<br>Objects: 2.300e+01 1.00000 2.300e+01<br>Flops: 3.000e+00 1.00000 3.000e+00 3.000e+00<br>Flops/sec: 3.975e+02 1.00000 3.975e+02 3.975e+02<br>MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00<br>MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00<br>MPI Reductions: 0.000e+00 0.00000<br><br>Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)<br> e.g., VecAXPY() for real vectors of length N --> 2N flops<br> and VecAXPY() for complex vectors of length N --> 8N flops<br><br>Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions --<br> Avg %Total Avg %Total counts %Total Avg %Total counts %Total <br> 0: Main Stage: 7.5421e-03 99.9% 3.0000e+00 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% <br><br>------------------------------------------------------------------------------------------------------------------------<br>See the 'Profiling' chapter of the users' manual for details on interpreting output.<br>Phase summary info:<br> Count: number of times phase was executed<br> Time and Flops: Max - maximum over all processors<br> Ratio - ratio of maximum to minimum over all processors<br> Mess: number of messages sent<br> Avg. len: average message length (bytes)<br> Reduct: number of global reductions<br> Global: entire computation<br> Stage: stages of a computation. 
Set stages with PetscLogStagePush() and PetscLogStagePop().<br> %T - percent time in this phase %F - percent flops in this phase<br> %M - percent messages in this phase %L - percent message lengths in this phase<br> %R - percent reductions in this phase<br> Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)<br>------------------------------------------------------------------------------------------------------------------------<br>Event Count Time (sec) Flops --- Global --- --- Stage --- Total<br> Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s<br>------------------------------------------------------------------------------------------------------------------------<br><br>--- Event Stage 0: Main Stage<br><br>SNESFunctionEval 1 1.0 7.8678e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0<br>VecNorm 1 1.0 6.1989e-06 1.0 3.00e+00 1.0 0.0e+00 0.0e+00 0.0e+00 0100 0 0 0 0100 0 0 0 0<br>VecSet 8 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0<br>MatZeroEntries 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0<br>------------------------------------------------------------------------------------------------------------------------<br><br>Memory usage is given in bytes:<br><br>Object Type Creations Destructions Memory Descendants' Mem.<br>Reports information only for process 0.<br><br>--- Event Stage 0: Main Stage<br><br> SNES 2 2 2828 0.<br> SNESLineSearch 2 2 1992 0.<br> DMSNES 1 1 672 0.<br> Vector 6 6 9312 0.<br> Matrix 2 2 6536 0.<br> Distributed Mesh 1 1 4624 0.<br>Star Forest Bipartite Graph 2 2 1616 0.<br> Discrete System 1 1 872 0.<br> Krylov Solver 2 2 2704 0.<br> DMKSP interface 1 1 656 0.<br> Preconditioner 2 2 1832 0.<br> Viewer 1 0 0 0.<br>========================================================================================================================<br>Average time to get PetscTime(): 
0.<br>#PETSc Option Table entries:<br>-ksp_monitor<br>-ksp_view<br>-log_view<br>-snes_type fas<br>#End of PETSc Option Table entries<br>Compiled without FORTRAN kernels<br>Compiled with full precision matrices (default)<br>sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4<br>Configure options: --with-mpi=1 --with-debugging=0 --with-mumps-lib="-L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Mumps-511_consortium_aster/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Parmetis_aster-403_aster/lib -lparmetis -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Scotch_aster-604_aster6/MPI/lib -lptscotch -lptscotcherr -lptscotcherrexit -lptscotchparmetis -lesmumps -lscotch -lscotcherr -lscotcherrexit -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Metis_aster-510_aster1/lib -lmetis" --with-mumps-include=/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Mumps-511_consortium_aster/MPI/include --download-hypre=/home/H03755/Librairies/hypre-2.11.1.tar.gz --download-ml=/home/H03755/Librairies/petsc-pkg-ml-e5040d11aa07.tar.gz --with-openmp=0 --with-scalapack-lib="-lscalapack-openmpi -lblacs-openmpi -lblacsF77init-openmpi -lblacsCinit-openmpi" --with-blas-lapack-lib="-llapack -lopenblas" --PETSC_ARCH=linux-opt-mumps-ml-hypre LIBS=-lgomp --prefix=/home/H03755/local/petsc/petsc-3.7.6<br>-----------------------------------------<br>Libraries compiled on Fri Apr 28 15:23:58 2017 on dsp0780444 <br>Machine characteristics: Linux-3.16.0-4-amd64-x86_64-with-debian-8.7<br>Using PETSc directory: /home/H03755/Librairies/petsc-3.7.6<br>Using PETSc arch: linux-opt-mumps-ml-hypre<br>-----------------------------------------<br><br>Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS}<br>Using Fortran compiler: mpif90 -fPIC -Wall 
-ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} <br>-----------------------------------------<br><br>Using include paths: -I/home/H03755/Librairies/petsc-3.7.6/linux-opt-mumps-ml-hypre/include -I/home/H03755/Librairies/petsc-3.7.6/include -I/home/H03755/Librairies/petsc-3.7.6/include -I/home/H03755/Librairies/petsc-3.7.6/linux-opt-mumps-ml-hypre/include -I/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Mumps-511_consortium_aster/MPI/include -I/home/H03755/local/petsc/petsc-3.7.6/include<br>-----------------------------------------<br><br>Using C linker: mpicc<br>Using Fortran linker: mpif90<br>Using libraries: -Wl,-rpath,/home/H03755/Librairies/petsc-3.7.6/linux-opt-mumps-ml-hypre/lib -L/home/H03755/Librairies/petsc-3.7.6/linux-opt-mumps-ml-hypre/lib -lpetsc -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Mumps-511_consortium_aster/MPI/lib -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Parmetis_aster-403_aster/lib -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Scotch_aster-604_aster6/MPI/lib -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites//Metis_aster-510_aster1/lib -Wl,-rpath,/home/H03755/local/petsc/petsc-3.7.6/lib -L/home/H03755/local/petsc/petsc-3.7.6/lib -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lzmumps -ldmumps -lmumps_common -lpord -lparmetis -lptscotch -lptscotcherr -lptscotcherrexit -lptscotchparmetis -lesmumps -lscotch -lscotcherr -lscotcherrexit -lmetis -lHYPRE -lmpi_cxx -lstdc++ -lm -lscalapack-openmpi -lblacs-openmpi -lblacsF77init-openmpi -lblacsCinit-openmpi -lml -lmpi_cxx -lstdc++ -lm -llapack -lopenblas -lX11 -lssl -lcrypto -lm -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lmpi_cxx -lstdc++ -lm -ldl -lgomp -lmpi -lhwloc 
-lgcc_s -lpthread -ldl -lgomp<br><br></div></div></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Jun 5, 2017 at 10:12 AM, Karin&amp;NiKo <span dir="ltr"><<a href="mailto:niko.karin@gmail.com" target="_blank">niko.karin@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Dear PETSc team,<br><div><br></div><div>If I run "snes/examples/tutorials/ex1 -snes_type fas", everything is OK. But with its Fortran version "snes/examples/tutorials/ex1f -snes_type fas", I get an error (see below).</div><div>Can you confirm this, or did I miss something?</div><div><br></div><div>Best regards,</div><div>Nicolas</div><div><br></div><div><div>--------------------------------------------------------------------------------------------------------------------------------------</div><div>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div><div>[0]PETSC ERROR: Corrupt argument: <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a></div><div>[0]PETSC ERROR: Fortran callback not set on this object</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 </div><div>[0]PETSC ERROR: on a arch-linux2-c-debug named dsp0780450 by niko Thu Jun 1 16:18:43 2017</div><div>[0]PETSC ERROR: Configure options --prefix=/home/niko/dev/codeaster-prerequisites/petsc-3.7.2/Install --with-mpi=yes --with-x=yes --download-ml=/home/niko/dev/codeaster-prerequisites/petsc-3.7.2/ml-6.2-p3.tar.gz 
--with-mumps-lib="-L/home/niko/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/niko/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/niko/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/niko/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp " --with-mumps-include=/home/niko/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include --with-scalapack-lib="-L/usr/lib -lscalapack-openmpi" --with-blacs-lib="-L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi" --with-blas-lib="-L/usr/lib -lopenblas -lcblas" --with-lapack-lib="-L/usr/lib -llapack"</div><div>[0]PETSC ERROR: #1 PetscObjectGetFortranCallback() line 263 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/sys/objects/inherit.c</div><div>[0]PETSC ERROR: #2 oursnesjacobian() line 105 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/interface/ftn-custom/zsnesf.c</div><div>[0]PETSC ERROR: #3 SNESComputeJacobian() line 2312 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/interface/snes.c</div><div>[0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/impls/ls/ls.c</div><div>[0]PETSC ERROR: #5 SNESSolve() line 4008 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/interface/snes.c</div><div>[0]PETSC ERROR: #6 SNESFASDownSmooth_Private() line 512 in 
/home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/impls/fas/fas.c</div><div>[0]PETSC ERROR: #7 SNESFASCycle_Multiplicative() line 816 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/impls/fas/fas.c</div><div>[0]PETSC ERROR: #8 SNESSolve_FAS() line 987 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/impls/fas/fas.c</div><div>[0]PETSC ERROR: #9 SNESSolve() line 4008 in /home/niko/dev/codeaster-prerequisites/petsc-3.7.2/src/snes/interface/snes.c</div><div>--------------------------------------------------------------------------------------------------------------------------------------</div><div><br></div></div>
</blockquote></div><br></div>