<html>
<head>
<meta content="text/html; charset=windows-1252"
http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<p>Hi Satish,</p>
<p>It's only partially working:</p>
<p>[t00196@b04-036 tutorials]$ mpiexec -n 2 ./ex4f90<br>
jwe1050i-w The hardware barrier couldn't be used and continues
processing using the software barrier.<br>
taken to (standard) corrective action, execution continuing.<br>
jwe1050i-w The hardware barrier couldn't be used and continues
processing using the software barrier.<br>
taken to (standard) corrective action, execution continuing.<br>
Vec Object:Vec Object:initial vector:initial vector: 1 MPI
processes<br>
type: seq<br>
10<br>
20<br>
30<br>
40<br>
50<br>
60<br>
1 MPI processes<br>
type: seq<br>
10<br>
20<br>
30<br>
40<br>
50<br>
60<br>
[1]PETSC ERROR:
------------------------------------------------------------------------<br>
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
Violation, probably memory access out of range<br>
[1]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger<br>
[1]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
[1]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple
Mac OS X to find memory corruption errors<br>
[1]PETSC ERROR: likely location of problem given in stack below<br>
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------<br>
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,<br>
[1]PETSC ERROR: INSTEAD the line number of the start of the
function<br>
[1]PETSC ERROR: is given.<br>
[1]PETSC ERROR: [1] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c<br>
[1]PETSC ERROR: --------------------- Error Message
------------------------------------------[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
Violation, probably memory access out of range<br>
[0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger<br>
[0]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
[0]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple
Mac OS X to find memory corruption errors<br>
[0]PETSC ERROR: likely location of problem given in stack below<br>
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------<br>
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,<br>
[0]PETSC ERROR: INSTEAD the line number of the start of the
function<br>
[0]PETSC ERROR: is given.<br>
[0]PETSC ERROR: [0] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c<br>
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[1]PETSC ERROR: Signal received<br>
[1]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015<br>
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 13:23:41 2016<br>
[1]PETSC ERROR: Configure options --with-cc=mpifcc
--with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1
--CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0"
--FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS=
--with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
--with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4<br>
[1]PETSC ERROR: #1 User provided function() line 0 in unknown
file<br>
--------------------------------------------------------------------------<br>
[mpi::mpi-api::mpi-abort]<br>
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD<br>
with errorcode 59.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI
processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
[0xffffffff11360404]<br>
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
[0xffffffff1110391c]<br>
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
[0xffffffff1111b5ec]<br>
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
[0xffffffff00281bf0]<br>
[b04-036:28998] ./ex4f90 [0x292548]<br>
[b04-036:28998] ./ex4f90 [0x29165c]<br>
[b04-036:28998]
/opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
[0xffffffff121e1974]<br>
[b04-036:28998] ./ex4f90 [0x9f6748]<br>
[b04-036:28998] ./ex4f90 [0x9f0ea4]<br>
[b04-036:28998] ./ex4f90 [0x2c76a0]<br>
[b04-036:28998] ./ex4f90(MAIN__+0x38c) [0x10688c]<br>
[b04-036:28998] ./ex4f90(main+0xec) [0x268e91c]<br>
[b04-036:28998] /lib64/libc.so.6(__libc_start_main+0x194)
[0xffffffff138cb81c]<br>
[b04-036:28998] ./ex4f90 [0x1063ac]<br>
[1]PETSC ERROR:
------------------------------------------------------------------------<br>
[1]PETSC ERROR: Caught signal number 15 Terminate: Some process
(or the batch system) has told this process to end<br>
[1]PETSC ERROR: Tr--------------------<br>
[0]PETSC ERROR: Signal received<br>
[0]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015<br>
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 13:23:41 2016<br>
[0]PETSC ERROR: Configure options --with-cc=mpifcc
--with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1
--CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0"
--FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS=
--with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
--with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4<br>
[0]PETSC ERROR: #1 User provided function() line 0 in unknown
file<br>
--------------------------------------------------------------------------<br>
[mpi::mpi-api::mpi-abort]<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD<br>
with errorcode 59.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI
processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
[0xffffffff11360404]<br>
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
[0xffffffff1110391c]<br>
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
[0xffffffff1111b5ec]<br>
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
[0xffffffff00281bf0]<br>
[b04-036:28997] ./ex4f90 [0x292548]<br>
[b04-036:28997] ./ex4f90 [0x29165c]<br>
[b04-036:28997]
/opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
[0xffffffff121e1974]<br>
[b04-036:28997] ./ex4f90 [0x9f6748]<br>
[b04-036:28997] ./ex4f90 [0x9f0ea4]<br>
[b04-036:28997] ./ex4f90 [0x2c76a0]<br>
[b04-036:28997] ./ex4f90(MAIN__+0x38c) [0x10688c]<br>
[b04-036:28997] ./ex4f90(main+0xec) [0x268e91c]<br>
[b04-036:28997] /lib64/libc.so.6(__libc_start_main+0x194)
[0xffffffff138cb81c]<br>
[b04-036:28997] ./ex4f90 [0x1063ac]<br>
[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process
(or the batch system) has told this process to end<br>
[0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger<br>
[0]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
[0]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple
Mac OS X to find memory corruption errors<br>
[0]PETSC ERROR: likely location of problem given in stack below<br>
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------<br>
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,<br>
[0]PETSC ERROR: INSTEAD the line number of the start of the
function<br>
[0]PETSC ERROR: is given.<br>
[0]PETSC ERROR: [0] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c<br>
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[0]PETSC ERROR: Signal received<br>
[0]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015<br>
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 13:23:41 2016<br>
[0]PETSC ERROR: Configure options --with-cc=mpifcc
--with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1
--CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0"
--FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS=
--with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debuy option -start_in_debugger
or -on_error_attach_debugger<br>
[1]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
[1]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple
Mac OS X to find memory corruption errors<br>
[1]PETSC ERROR: likely location of problem given in stack below<br>
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------<br>
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,<br>
[1]PETSC ERROR: INSTEAD the line number of the start of the
function<br>
[1]PETSC ERROR: is given.<br>
[1]PETSC ERROR: [1] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c<br>
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[1]PETSC ERROR: Signal received<br>
[1]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015<br>
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 13:23:41 2016<br>
[1]PETSC ERROR: Configure options --with-cc=mpifcc
--with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1
--CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0"
--FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS=
--with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
--with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4<br>
[1]PETSC ERROR: #2 User provided function() line 0 in unknown
file<br>
gging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4<br>
[0]PETSC ERROR: #2 User provided function() line 0 in unknown
file<br>
[ERR.] PLE 0019 plexec One of MPI processes was
aborted.(rank=0)(nid=0x04180034)(CODE=1938,793745140674134016,15104)<br>
[t00196@b04-036 tutorials]$<br>
[ERR.] PLE 0021 plexec The interactive job has aborted with the
signal.(sig=24)<br>
[INFO] PJM 0083 pjsub Interactive job 5211401 completed.<br>
<br>
</p>
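<p>For reference, the failing call in my code follows the standard PETSc
get/restore pattern below (a minimal sketch from memory, not my actual source;
the declarations around the quoted call are illustrative):</p>
<pre>! Sketch of the DMDAVecGetArrayF90 usage that triggers the crash.
! Assumes da_u (a DMDA) and u_local (a Vec obtained from it) already
! exist; u_array is a Fortran pointer that PETSc associates with the
! vector's local storage.
PetscScalar, pointer :: u_array(:,:,:)
PetscErrorCode       :: ierr

call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
! ... read/write u_array(i,j,k) over the local portion of the grid ...
call DMDAVecRestoreArrayF90(da_u,u_local,u_array,ierr)</pre>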
<pre class="moz-signature" cols="72">Thank you
Yours sincerely,
TAY wee-beng</pre>
<div class="moz-cite-prefix">On 1/6/2016 12:21 PM, Satish Balay
wrote:<br>
</div>
<blockquote cite="mid:alpine.LFD.2.20.1605312321290.20719@asterix"
type="cite">
<pre wrap="">Do PETSc examples using VecGetArrayF90() work?
say src/vec/vec/examples/tutorials/ex4f90.F
Satish
On Tue, 31 May 2016, TAY wee-beng wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hi,
I'm trying to run my MPI CFD code on Japan's K computer. My code runs if I
don't make use of the PETSc DMDAVecGetArrayF90 subroutine. If it is called, as in
call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
I get the error below. I have no problem with my code on other clusters using
the new Intel compilers. I used to have problems with DM when using the old
Intel compilers. Now on the K computer, I'm using Fujitsu's Fortran compiler.
How can I troubleshoot this?
Btw, I also tested the ex13f90 example and it didn't work either. The error is
below.
My code error:
 size_x,size_y,size_z 76x130x136
 total grid size = 1343680
 recommended cores (50k / core) = 26.87360000000000
 0
 1
 1
[3]PETSC ERROR: [1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[1]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,
[1]PETSC ERROR: INSTEAD the line number of the start of the
function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array3dCreate line 244
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
 1
------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[3]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[3]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[3]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[3]PETSC ERROR: likely location of problem given in stack below
[3]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[0]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,
[0]PETSC ERROR: INSTEAD the line number of the start of the
function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array3dCreate line 244
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message
----------------------------------------- 1
[2]PETSC ERROR:
------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[2]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[2]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[2]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[2]PETSC ERROR: likely location of problem given in stack below
[2]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[2]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,
[2]PETSC ERROR: INSTEAD the line number of the start of the
function
[2]PETSC ERROR: is given.
[2]PETSC ERROR: [2] F90Array3dCreate line 244
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[2]PETSC ERROR: --------------------- Error Message
-----------------------------------------[3]PETSC ERROR: Note: The EXACT line
numbers in the stack are not available,
[3]PETSC ERROR: INSTEAD the line number of the start of the
function
[3]PETSC ERROR: is given.
[3]PETSC ERROR: [3] F90Array3dCreate line 244
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[3]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[3]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 12:54:34 2016
[3]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared----------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 12:54:34 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[m---------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[2]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 12:54:34 2016
[2]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[m[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
Unknown Wed Jun 1 12:54:34 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #1 User provided function() line 0 ilibraries=0
--with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28416]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
[0xffffffff11360404]
[b04-036:28416]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
[0xffffffff1110391c]
[b04-036:28416] /opt/FJSVtclang/GM-1.2.0-2pi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 59.
ex13f90 error:
[t00196@b04-036 tutorials]$ mpiexec -np 2 ./ex13f90
jwe1050i-w The hardware barrier couldn't be used and continues processing
using the software barrier.
taken to (standard) corrective action, execution continuing.
jwe1050i-w The hardware barrier couldn't be used and continues processing
using the software barrier.
taken to (standard) corrective action, execution continuing.
 Hi! We're solving van der Pol using 2 processes.

 t x1 x2
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal
memory access
[1]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal
memory access
[0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[0]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[1]PETSC ERROR: or see
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,
[1]PETSC ERROR: INSTEAD the line number of the start of the
function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array4dCreate line 337
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,
[0]PETSC ERROR: INSTEAD the line number of the start of the
function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array4dCreate line 337
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex13f90 on a petsc-3.6.3_debug named b04-036 by Unknown
Wed Jun 1 13:04:34 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hyp
</pre>
</blockquote>
</blockquote>
<br>
</body>
</html>