<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=windows-1252">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<p>Hi,</p>
<p>Is there any update on the issue below?</p>
<p>No hurry, just to make sure that the email is sent successfully.<br>
</p>
<p><br>
</p>
<p>Thanks<br>
</p>
<div class="moz-forward-container"><br>
<br>
-------- Forwarded Message --------
<table class="moz-email-headers-table" border="0" cellpadding="0"
cellspacing="0">
<tbody>
<tr>
<th align="RIGHT" nowrap="nowrap" valign="BASELINE">Subject:
</th>
<td>Re: [petsc-users] Error with PETSc on K computer</td>
</tr>
<tr>
<th align="RIGHT" nowrap="nowrap" valign="BASELINE">Date: </th>
<td>Thu, 2 Jun 2016 10:25:22 +0800</td>
</tr>
<tr>
<th align="RIGHT" nowrap="nowrap" valign="BASELINE">From: </th>
<td>TAY wee-beng <a class="moz-txt-link-rfc2396E" href="mailto:zonexo@gmail.com"><zonexo@gmail.com></a></td>
</tr>
<tr>
<th align="RIGHT" nowrap="nowrap" valign="BASELINE">To: </th>
<td>petsc-users <a class="moz-txt-link-rfc2396E" href="mailto:petsc-users@mcs.anl.gov"><petsc-users@mcs.anl.gov></a></td>
</tr>
</tbody>
</table>
<br>
<br>
<meta content="text/html; charset=windows-1252"
http-equiv="Content-Type">
<p>Hi Satish,</p>
<p>The -X9 option is:</p>
<p>"Provides a different interpretation under Fortran 95 specifications for any parts not conforming to the language specifications of this compiler."</p>
<p>I just patched and re-compiled, but it still doesn't work. I've attached the configure.log for both builds.<br>
</p>
<p>FYI, some parts of the PETSc 3.6.3 code were initially patched to make it work with the K computer system:</p>
<pre wrap="">$ diff -u petsc-3.6.3/config/BuildSystem/config/package.py.org petsc-3.6.3/config/BuildSystem/config/package.py
--- petsc-3.6.3/config/BuildSystem/config/package.py.org	2015-12-04 14:06:42.000000000 +0900
+++ petsc-3.6.3/config/BuildSystem/config/package.py	2016-01-22 11:09:37.000000000 +0900
@@ -174,7 +174,7 @@
     return ''
 
   def getSharedFlag(self,cflags):
-    for flag in ['-PIC', '-fPIC', '-KPIC', '-qpic']:
+    for flag in ['-KPIC', '-fPIC', '-PIC', '-qpic']:
       if cflags.find(flag) &gt;=0: return flag
     return ''

$ diff -u petsc-3.6.3/config/BuildSystem/config/setCompilers.py.org petsc-3.6.3/config/BuildSystem/config/setCompilers.py
--- petsc-3.6.3/config/BuildSystem/config/setCompilers.py.org	2015-07-23 00:22:46.000000000 +0900
+++ petsc-3.6.3/config/BuildSystem/config/setCompilers.py	2016-01-22 11:10:05.000000000 +0900
@@ -1017,7 +1017,7 @@
     self.pushLanguage(language)
     #different compilers are sensitive to the order of testing these flags. So separete out GCC test.
     if config.setCompilers.Configure.isGNU(self.getCompiler()): testFlags = ['-fPIC']
-    else: testFlags = ['-PIC', '-fPIC', '-KPIC','-qpic']
+    else: testFlags = ['-KPIC', '-fPIC', '-PIC','-qpic']
     for testFlag in testFlags:
       try:
         self.logPrint('Trying '+language+' compiler flag '+testFlag)

$ diff -u petsc-3.6.3/config/BuildSystem/config/packages/openmp.py.org petsc-3.6.3/config/BuildSystem/config/packages/openmp.py
--- petsc-3.6.3/config/BuildSystem/config/packages/openmp.py.org	2016-01-25 15:42:23.000000000 +0900
+++ petsc-3.6.3/config/BuildSystem/config/packages/openmp.py	2016-01-22 17:13:52.000000000 +0900
@@ -19,7 +19,8 @@
     self.found = 0
     self.setCompilers.pushLanguage('C')
     #
-    for flag in ["-fopenmp", # Gnu
+    for flag in ["-Kopenmp", # Fujitsu
+                 "-fopenmp", # Gnu
                  "-qsmp=omp",# IBM XL C/C++
                  "-h omp",   # Cray. Must come after XL because XL interprets this option as meaning "-soname omp"
                  "-mp",      # Portland Group

$ diff -u ./petsc-3.6.3/config/BuildSystem/config/compilers.py.org ./petsc-3.6.3/config/BuildSystem/config/compilers.py
--- ./petsc-3.6.3/config/BuildSystem/config/compilers.py.org	2015-06-10 06:24:49.000000000 +0900
+++ ./petsc-3.6.3/config/BuildSystem/config/compilers.py	2016-02-19 11:56:12.000000000 +0900
@@ -164,7 +164,7 @@
   def checkCLibraries(self):
     '''Determines the libraries needed to link with C'''
     oldFlags = self.setCompilers.LDFLAGS
-    self.setCompilers.LDFLAGS += ' -v'
+    self.setCompilers.LDFLAGS += ' -###'
     self.pushLanguage('C')
     (output, returnCode) = self.outputLink('', '')
     self.setCompilers.LDFLAGS = oldFlags
@@ -413,7 +413,7 @@
   def checkCxxLibraries(self):
     '''Determines the libraries needed to link with C++'''
     oldFlags = self.setCompilers.LDFLAGS
-    self.setCompilers.LDFLAGS += ' -v'
+    self.setCompilers.LDFLAGS += ' -###'
     self.pushLanguage('Cxx')
     (output, returnCode) = self.outputLink('', '')
     self.setCompilers.LDFLAGS = oldFlags
</pre>
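<p>For context, here is a minimal standalone sketch (plain Python, not the actual BuildSystem source) of the first-match pattern that the package.py and setCompilers.py hunks above reorder. Both loops keep the first flag that "works", so the list order decides which flag configure settles on; the fujitsu_accepts predicate below is a made-up stand-in for the real compile test. The openmp.py hunk follows the same pattern, with -Kopenmp added ahead of the generic flags:</p>
<pre wrap="">def first_accepted(flags, accepts):
    """Try flags in order and keep the first one the compiler accepts,
    mimicking the testFlags loop in setCompilers.py."""
    for flag in flags:
        if accepts(flag):
            return flag
    return ''

# Hypothetical Fujitsu-like compiler that tolerates both spellings:
fujitsu_accepts = lambda flag: flag in ('-PIC', '-KPIC')

print(first_accepted(['-PIC', '-fPIC', '-KPIC', '-qpic'], fujitsu_accepts))  # prints -PIC
print(first_accepted(['-KPIC', '-fPIC', '-PIC', '-qpic'], fujitsu_accepts))  # prints -KPIC
</pre>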
<pre class="moz-signature" cols="72">Thank you
Yours sincerely,
TAY wee-beng</pre>
<div class="moz-cite-prefix">On 2/6/2016 3:18 AM, Satish Balay
wrote:<br>
</div>
<blockquote cite="mid:alpine.LFD.2.20.1606011407060.26627@asterix"
type="cite">
<pre wrap="">What does -X9 in --FFLAGS="-X9 -O0" do?
can you send configure.log for this build?
And does the attached patch make a difference with this example?
[suggest doing a separate temporary build of PETSc - in a different source location - to check this.]
Satish
On Wed, 1 Jun 2016, TAY wee-beng wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hi Satish,
Only partially working:
[t00196@b04-036 tutorials]$ mpiexec -n 2 ./ex4f90
jwe1050i-w The hardware barrier couldn't be used and continues processing
using the software barrier.
taken to (standard) corrective action, execution continuing.
jwe1050i-w The hardware barrier couldn't be used and continues processing
using the software barrier.
taken to (standard) corrective action, execution continuing.
Vec Object:Vec Object:initial vector:initial vector: 1 MPI processes
type: seq
10
20
30
40
50
60
1 MPI processes
type: seq
10
20
30
40
50
60
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably
memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see
<a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to
find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR: INSTEAD the line number of the start of the function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message
------------------------------------------[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably
memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
<a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to
find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for
trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed
Jun 1 13:23:41 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
[0xffffffff11360404]
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
[0xffffffff1110391c]
[b04-036:28998] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
[0xffffffff1111b5ec]
[b04-036:28998]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
[0xffffffff00281bf0]
[b04-036:28998] ./ex4f90 [0x292548]
[b04-036:28998] ./ex4f90 [0x29165c]
[b04-036:28998] /opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
[0xffffffff121e1974]
[b04-036:28998] ./ex4f90 [0x9f6748]
[b04-036:28998] ./ex4f90 [0x9f0ea4]
[b04-036:28998] ./ex4f90 [0x2c76a0]
[b04-036:28998] ./ex4f90(MAIN__+0x38c) [0x10688c]
[b04-036:28998] ./ex4f90(main+0xec) [0x268e91c]
[b04-036:28998] /lib64/libc.so.6(__libc_start_main+0x194) [0xffffffff138cb81c]
[b04-036:28998] ./ex4f90 [0x1063ac]
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch
system) has told this process to end
[1]PETSC ERROR: Tr--------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed
Jun 1 13:23:41 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
[0xffffffff11360404]
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
[0xffffffff1110391c]
[b04-036:28997] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
[0xffffffff1111b5ec]
[b04-036:28997]
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
[0xffffffff00281bf0]
[b04-036:28997] ./ex4f90 [0x292548]
[b04-036:28997] ./ex4f90 [0x29165c]
[b04-036:28997] /opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
[0xffffffff121e1974]
[b04-036:28997] ./ex4f90 [0x9f6748]
[b04-036:28997] ./ex4f90 [0x9f0ea4]
[b04-036:28997] ./ex4f90 [0x2c76a0]
[b04-036:28997] ./ex4f90(MAIN__+0x38c) [0x10688c]
[b04-036:28997] ./ex4f90(main+0xec) [0x268e91c]
[b04-036:28997] /lib64/libc.so.6(__libc_start_main+0x194) [0xffffffff138cb81c]
[b04-036:28997] ./ex4f90 [0x1063ac]
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch
system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
<a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to
find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed
Jun 1 13:23:41 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debuy option -start_in_debugger or
-on_error_attach_debugger
[1]PETSC ERROR: or see
<a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to
find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR: INSTEAD the line number of the start of the function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array1dCreate line 50
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for
trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed
Jun 1 13:23:41 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
--LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
--with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
--with-scalapack-lib=-SCALAPACK
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #2 User provided function() line 0 in unknown file
gging=1 --useThreads=0 --with-hypre=1
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #2 User provided function() line 0 in unknown file
[ERR.] PLE 0019 plexec One of MPI processes was
aborted.(rank=0)(nid=0x04180034)(CODE=1938,793745140674134016,15104)
[t00196@b04-036 tutorials]$
[ERR.] PLE 0021 plexec The interactive job has aborted with the
signal.(sig=24)
[INFO] PJM 0083 pjsub Interactive job 5211401 completed.
Thank you
Yours sincerely,
TAY wee-beng

On 1/6/2016 12:21 PM, Satish Balay wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Do PETSc examples using VecGetArrayF90() work?
say src/vec/vec/examples/tutorials/ex4f90.F
Satish
On Tue, 31 May 2016, TAY wee-beng wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hi,

I'm trying to run my MPI CFD code on Japan's K computer. My code runs if I don't make use of the PETSc DMDAVecGetArrayF90 subroutine. If it is called, as in

call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)

I get the error below. I have no problem with my code on other clusters using the new Intel compilers. I used to have problems with DM when using the old Intel compilers. Now, on the K computer, I'm using Fujitsu's Fortran compiler. How can I troubleshoot this?

Btw, I also tested the ex13f90 example and it didn't work either. The error is below.
My code error:
 size_x,size_y,size_z 76x130x136
 total grid size = 1343680
 recommended cores (50k / core) = 26.87360000000000
 0
 1
 1
[3]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR: INSTEAD the line number of the start of the function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
 1
------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[3]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: likely location of problem given in stack below
[3]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message ----------------------------------------- 1
[2]PETSC ERROR: ------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[2]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[2]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[2]PETSC ERROR: likely location of problem given in stack below
[2]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[2]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[2]PETSC ERROR: INSTEAD the line number of the start of the function
[2]PETSC ERROR: is given.
[2]PETSC ERROR: [2] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[2]PETSC ERROR: --------------------- Error Message -----------------------------------------[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[3]PETSC ERROR: INSTEAD the line number of the start of the function
[3]PETSC ERROR: is given.
[3]PETSC ERROR: [3] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[3]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun 1 12:54:34 2016
[3]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared----------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun 1 12:54:34 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[m---------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[2]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun 1 12:54:34 2016
[2]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[m[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun 1 12:54:34 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #1 User provided function() line 0 ilibraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84) [0xffffffff11360404]
[b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c) [0xffffffff1110391c]
[b04-036:28416] /opt/FJSVtclang/GM-1.2.0-2pi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 59.
ex13f90 error:
[t00196@b04-036 tutorials]$ mpiexec -np 2 ./ex13f90
jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
taken to (standard) corrective action, execution continuing.
jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
taken to (standard) corrective action, execution continuing.
 Hi! We're solving van der Pol using 2 processes.

 t x1 x2
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[0]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: or see <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>
[1]PETSC ERROR: or try <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR: INSTEAD the line number of the start of the function
[1]PETSC ERROR: is given.
[1]PETSC ERROR: [1] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See <a moz-do-not-send="true" class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex13f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun 1 13:04:34 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hyp
</pre>
</blockquote>
</blockquote>
<pre wrap="">
</pre>
</blockquote>
</blockquote>
<br>
</div>
</body>
</html>