[petsc-users] Fwd: Re: Error with PETSc on K computer

Satish Balay balay at mcs.anl.gov
Fri Jun 3 09:33:21 CDT 2016


Sorry - I'm not sure what's happening with this compiler.

[for a build without the patch I sent] - can you edit
PETSC_ARCH/include/petscconf.h and remove the lines

#ifndef PETSC_HAVE_F90_2PTR_ARG
#define PETSC_HAVE_F90_2PTR_ARG 1
#endif

And then build the libraries [do not run configure again].
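
[For example - a sketch assuming an in-place build tree, with PETSC_DIR and PETSC_ARCH set as they were for configure; adjust to your setup:]

$ cd $PETSC_DIR
$ $EDITOR $PETSC_ARCH/include/petscconf.h    # delete the 3 lines above
$ make PETSC_DIR=$PETSC_DIR PETSC_ARCH=$PETSC_ARCH all
$ make PETSC_DIR=$PETSC_DIR PETSC_ARCH=$PETSC_ARCH install    # only if you link against a --prefix install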

Does this make a difference for this example?

Satish

On Fri, 3 Jun 2016, TAY wee-beng wrote:

> Hi,
> 
> Is there any update on the issue below?
> 
> No hurry - just checking that the email was sent successfully.
> 
> 
> Thanks
> 
> 
> 
> -------- Forwarded Message --------
> Subject: 	Re: [petsc-users] Error with PETSc on K computer
> Date: 	Thu, 2 Jun 2016 10:25:22 +0800
> From: 	TAY wee-beng <zonexo at gmail.com>
> To: 	petsc-users <petsc-users at mcs.anl.gov>
> 
> 
> 
> Hi Satish,
> 
> The -X9 option is:
> 
> "Provides a different interpretation under Fortran 95 specifications
> for any parts not conforming to the language specifications of this
> compiler."
> 
> I just patched and re-compiled, but it still doesn't work. I've attached the
> configure.log for both builds.
> 
> FYI, some parts of the PETSc 3.6.3 code were initially patched to make it work
> with the K computer system:
> 
> $ diff -u petsc-3.6.3/config/BuildSystem/config/package.py.org petsc-3.6.3/config/BuildSystem/config/package.py
> --- petsc-3.6.3/config/BuildSystem/config/package.py.org 2015-12-04 14:06:42.000000000 +0900
> +++ petsc-3.6.3/config/BuildSystem/config/package.py 2016-01-22 11:09:37.000000000 +0900
> @@ -174,7 +174,7 @@
>      return ''
> 
>    def getSharedFlag(self,cflags):
> -    for flag in ['-PIC', '-fPIC', '-KPIC', '-qpic']:
> +    for flag in ['-KPIC', '-fPIC', '-PIC', '-qpic']:
>        if cflags.find(flag) >=0: return flag
>      return ''
> 
> $ diff -u petsc-3.6.3/config/BuildSystem/config/setCompilers.py.org petsc-3.6.3/config/BuildSystem/config/setCompilers.py
> --- petsc-3.6.3/config/BuildSystem/config/setCompilers.py.org 2015-07-23 00:22:46.000000000 +0900
> +++ petsc-3.6.3/config/BuildSystem/config/setCompilers.py 2016-01-22 11:10:05.000000000 +0900
> @@ -1017,7 +1017,7 @@
>        self.pushLanguage(language)
>        #different compilers are sensitive to the order of testing these flags. So separete out GCC test.
>        if config.setCompilers.Configure.isGNU(self.getCompiler()): testFlags = ['-fPIC']
> -      else: testFlags = ['-PIC', '-fPIC', '-KPIC','-qpic']
> +      else: testFlags = ['-KPIC', '-fPIC', '-PIC','-qpic']
>        for testFlag in testFlags:
>          try:
>            self.logPrint('Trying '+language+' compiler flag '+testFlag)
> $ diff -u petsc-3.6.3/config/BuildSystem/config/packages/openmp.py.org petsc-3.6.3/config/BuildSystem/config/packages/openmp.py
> --- petsc-3.6.3/config/BuildSystem/config/packages/openmp.py.org 2016-01-25 15:42:23.000000000 +0900
> +++ petsc-3.6.3/config/BuildSystem/config/packages/openmp.py 2016-01-22 17:13:52.000000000 +0900
> @@ -19,7 +19,8 @@
>      self.found = 0
>      self.setCompilers.pushLanguage('C')
>      #
> -    for flag in ["-fopenmp", # Gnu
> +    for flag in ["-Kopenmp", # Fujitsu
> +                 "-fopenmp", # Gnu
>                   "-qsmp=omp",# IBM XL C/C++
>                   "-h omp",   # Cray. Must come after XL because XL interprets this option as meaning "-soname omp"
>                   "-mp",      # Portland Group
> 
> $ diff -u ./petsc-3.6.3/config/BuildSystem/config/compilers.py.org ./petsc-3.6.3/config/BuildSystem/config/compilers.py
> --- ./petsc-3.6.3/config/BuildSystem/config/compilers.py.org 2015-06-10 06:24:49.000000000 +0900
> +++ ./petsc-3.6.3/config/BuildSystem/config/compilers.py 2016-02-19 11:56:12.000000000 +0900
> @@ -164,7 +164,7 @@
>    def checkCLibraries(self):
>      '''Determines the libraries needed to link with C'''
>      oldFlags = self.setCompilers.LDFLAGS
> -    self.setCompilers.LDFLAGS += ' -v'
> +    self.setCompilers.LDFLAGS += ' -###'
>      self.pushLanguage('C')
>      (output, returnCode) = self.outputLink('', '')
>      self.setCompilers.LDFLAGS = oldFlags
> @@ -413,7 +413,7 @@
>    def checkCxxLibraries(self):
>      '''Determines the libraries needed to link with C++'''
>      oldFlags = self.setCompilers.LDFLAGS
> -    self.setCompilers.LDFLAGS += ' -v'
> +    self.setCompilers.LDFLAGS += ' -###'
>      self.pushLanguage('Cxx')
>      (output, returnCode) = self.outputLink('', '')
>      self.setCompilers.LDFLAGS = oldFlags
> 
> 
> 
> Thank you
> 
> Yours sincerely,
> 
> TAY wee-beng
> 
> On 2/6/2016 3:18 AM, Satish Balay wrote:
> > What does -X9 in  --FFLAGS="-X9 -O0" do?
> >
> > can you send configure.log for this build?
> >
> > And does the attached patch make a difference with this example?
> > [suggest doing a separate temporary build of PETSc - in a different source
> > location - to check this.]
> >
> > Satish
> >
> > On Wed, 1 Jun 2016, TAY wee-beng wrote:
> >
> > > Hi Satish,
> > >
> > > Only partially working:
> > >
> > > [t00196@b04-036 tutorials]$ mpiexec -n 2 ./ex4f90
> > > jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> > > taken to (standard) corrective action, execution continuing.
> > > jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> > > taken to (standard) corrective action, execution continuing.
> > > Vec Object:Vec Object:initial vector:initial vector: 1 MPI processes
> > >    type: seq
> > > 10
> > > 20
> > > 30
> > > 40
> > > 50
> > > 60
> > >   1 MPI processes
> > >    type: seq
> > > 10
> > > 20
> > > 30
> > > 40
> > > 50
> > > 60
> > > [1]PETSC ERROR:
> > > ------------------------------------------------------------------------
> > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > [1]PETSC ERROR: likely location of problem given in stack below
> > > [1]PETSC ERROR: ---------------------  Stack Frames
> > > ------------------------------------
> > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> > > available,
> > > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > [1]PETSC ERROR:       is given.
> > > [1]PETSC ERROR: [1] F90Array1dCreate line 50
> > > /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > [1]PETSC ERROR: --------------------- Error Message
> > > ------------------------------------------[0]PETSC ERROR:
> > > ------------------------------------------------------------------------
> > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > [0]PETSC ERROR: likely location of problem given in stack below
> > > [0]PETSC ERROR: ---------------------  Stack Frames
> > > ------------------------------------
> > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> > > available,
> > > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > [0]PETSC ERROR:       is given.
> > > [0]PETSC ERROR: [0] F90Array1dCreate line 50
> > > /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > [0]PETSC ERROR: --------------------- Error Message
> > > --------------------------------------------------------------
> > > [1]PETSC ERROR: Signal received
> > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > [1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:23:41 2016
> > > [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
> > > --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
> > > --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0"
> > > --LD_SHARED=
> > > --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
> > > --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
> > > --with-scalapack-lib=-SCALAPACK
> > > --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
> > > --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
> > > --with-hypre=1
> > > --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > [1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > > --------------------------------------------------------------------------
> > > [mpi::mpi-api::mpi-abort]
> > > MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> > > with errorcode 59.
> > >
> > > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > You may or may not see output from other processes, depending on
> > > exactly when Open MPI kills them.
> > > --------------------------------------------------------------------------
> > > [b04-036:28998]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
> > > [0xffffffff11360404]
> > > [b04-036:28998]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
> > > [0xffffffff1110391c]
> > > [b04-036:28998]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
> > > [0xffffffff1111b5ec]
> > > [b04-036:28998]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
> > > [0xffffffff00281bf0]
> > > [b04-036:28998] ./ex4f90 [0x292548]
> > > [b04-036:28998] ./ex4f90 [0x29165c]
> > > [b04-036:28998]
> > > /opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
> > > [0xffffffff121e1974]
> > > [b04-036:28998] ./ex4f90 [0x9f6748]
> > > [b04-036:28998] ./ex4f90 [0x9f0ea4]
> > > [b04-036:28998] ./ex4f90 [0x2c76a0]
> > > [b04-036:28998] ./ex4f90(MAIN__+0x38c) [0x10688c]
> > > [b04-036:28998] ./ex4f90(main+0xec) [0x268e91c]
> > > [b04-036:28998] /lib64/libc.so.6(__libc_start_main+0x194)
> > > [0xffffffff138cb81c]
> > > [b04-036:28998] ./ex4f90 [0x1063ac]
> > > [1]PETSC ERROR:
> > > ------------------------------------------------------------------------
> > > [1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
> > > [1]PETSC ERROR: Tr--------------------
> > > [0]PETSC ERROR: Signal received
> > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > [0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:23:41 2016
> > > [0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
> > > --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
> > > --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0"
> > > --LD_SHARED=
> > > --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
> > > --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
> > > --with-scalapack-lib=-SCALAPACK
> > > --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
> > > --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
> > > --with-hypre=1
> > > --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > > --------------------------------------------------------------------------
> > > [mpi::mpi-api::mpi-abort]
> > > MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> > > with errorcode 59.
> > >
> > > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > You may or may not see output from other processes, depending on
> > > exactly when Open MPI kills them.
> > > --------------------------------------------------------------------------
> > > [b04-036:28997]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
> > > [0xffffffff11360404]
> > > [b04-036:28997]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
> > > [0xffffffff1110391c]
> > > [b04-036:28997]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c)
> > > [0xffffffff1111b5ec]
> > > [b04-036:28997]
> > > /opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c)
> > > [0xffffffff00281bf0]
> > > [b04-036:28997] ./ex4f90 [0x292548]
> > > [b04-036:28997] ./ex4f90 [0x29165c]
> > > [b04-036:28997]
> > > /opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c)
> > > [0xffffffff121e1974]
> > > [b04-036:28997] ./ex4f90 [0x9f6748]
> > > [b04-036:28997] ./ex4f90 [0x9f0ea4]
> > > [b04-036:28997] ./ex4f90 [0x2c76a0]
> > > [b04-036:28997] ./ex4f90(MAIN__+0x38c) [0x10688c]
> > > [b04-036:28997] ./ex4f90(main+0xec) [0x268e91c]
> > > [b04-036:28997] /lib64/libc.so.6(__libc_start_main+0x194)
> > > [0xffffffff138cb81c]
> > > [b04-036:28997] ./ex4f90 [0x1063ac]
> > > [0]PETSC ERROR:
> > > ------------------------------------------------------------------------
> > > [0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
> > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > [0]PETSC ERROR: likely location of problem given in stack below
> > > [0]PETSC ERROR: ---------------------  Stack Frames
> > > ------------------------------------
> > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> > > available,
> > > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > [0]PETSC ERROR:       is given.
> > > [0]PETSC ERROR: [0] F90Array1dCreate line 50
> > > /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > [0]PETSC ERROR: --------------------- Error Message
> > > --------------------------------------------------------------
> > > [0]PETSC ERROR: Signal received
> > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > [0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:23:41 2016
> > > [0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
> > > --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
> > > --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0"
> > > --LD_SHARED=
> > > --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
> > > --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
> > > --with-scalapack-lib=-SCALAPACK
> > > --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
> > > --with-fortran-interfaces=1 --with-debuy option -start_in_debugger or -on_error_attach_debugger
> > > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > [1]PETSC ERROR: likely location of problem given in stack below
> > > [1]PETSC ERROR: ---------------------  Stack Frames
> > > ------------------------------------
> > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> > > available,
> > > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > [1]PETSC ERROR:       is given.
> > > [1]PETSC ERROR: [1] F90Array1dCreate line 50
> > > /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > [1]PETSC ERROR: --------------------- Error Message
> > > --------------------------------------------------------------
> > > [1]PETSC ERROR: Signal received
> > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > [1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:23:41 2016
> > > [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
> > > --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
> > > --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0"
> > > --LD_SHARED=
> > > --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
> > > --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
> > > --with-scalapack-lib=-SCALAPACK
> > > --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
> > > --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0
> > > --with-hypre=1
> > > --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > [1]PETSC ERROR: #2 User provided function() line 0 in  unknown file
> > > gging=1 --useThreads=0 --with-hypre=1
> > > --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > [0]PETSC ERROR: #2 User provided function() line 0 in  unknown file
> > > [ERR.] PLE 0019 plexec One of MPI processes was
> > > aborted.(rank=0)(nid=0x04180034)(CODE=1938,793745140674134016,15104)
> > > [t00196@b04-036 tutorials]$
> > > [ERR.] PLE 0021 plexec The interactive job has aborted with the
> > > signal.(sig=24)
> > > [INFO] PJM 0083 pjsub Interactive job 5211401 completed.
> > >
> > > Thank you
> > >
> > > Yours sincerely,
> > >
> > > TAY wee-beng
> > >
> > > On 1/6/2016 12:21 PM, Satish Balay wrote:
> > > > Do PETSc examples using VecGetArrayF90() work?
> > > >
> > > > say src/vec/vec/examples/tutorials/ex4f90.F
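> > > >
> > > > [A quick way to check - a sketch assuming an in-tree build; the exact make invocation may differ on your system:]
> > > >
> > > > $ cd $PETSC_DIR/src/vec/vec/examples/tutorials
> > > > $ make PETSC_DIR=$PETSC_DIR PETSC_ARCH=$PETSC_ARCH ex4f90
> > > > $ mpiexec -n 2 ./ex4f90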
> > > >
> > > > Satish
> > > >
> > > > On Tue, 31 May 2016, TAY wee-beng wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > I'm trying to run my MPI CFD code on Japan's K computer. My code runs if
> > > > > I don't make use of the PETSc DMDAVecGetArrayF90 subroutine. When it is
> > > > > called:
> > > > >
> > > > > call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
> > > > >
> > > > > I get the error below. I have no problem with my code on other clusters
> > > > > using the new Intel compilers. I used to have problems with DM when using
> > > > > the old Intel compilers. Now on the K computer, I'm using Fujitsu's
> > > > > Fortran compiler.
> > > > >
> > > > > Btw, I also tested on the ex13f90 example and it didn't work too. The
> > > > > error is
> > > > > below.
> > > > >
> > > > >
> > > > > My code error:
> > > > >
> > > > >  size_x,size_y,size_z 76x130x136
> > > > >  total grid size =  1343680
> > > > >  recommended cores (50k / core) =  26.87360000000000
> > > > >  0
> > > > >  1
> > > > >  1
> > > > > [3]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------
> > > > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > > > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [1]PETSC ERROR: likely location of problem given in stack below
> > > > > [1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [1]PETSC ERROR:       is given.
> > > > > [1]PETSC ERROR: [1] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > >  1
> > > > > ------------------------------------------------------------------------
> > > > > [3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > > > [3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [3]PETSC ERROR: likely location of problem given in stack below
> > > > > [3]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [0]PETSC ERROR: ------------------------------------------------------------------------
> > > > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [0]PETSC ERROR: likely location of problem given in stack below
> > > > > [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [0]PETSC ERROR:       is given.
> > > > > [0]PETSC ERROR: [0] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > > [0]PETSC ERROR: --------------------- Error Message ----------------------------------------- 1
> > > > > [2]PETSC ERROR: ------------------------------------------------------------------------
> > > > > [2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > > > [2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [2]PETSC ERROR: likely location of problem given in stack below
> > > > > [2]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [2]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [2]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [2]PETSC ERROR:       is given.
> > > > > [2]PETSC ERROR: [2] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > > [2]PETSC ERROR: --------------------- Error Message -----------------------------------------[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [3]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [3]PETSC ERROR:       is given.
> > > > > [3]PETSC ERROR: [3] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > > [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > > [3]PETSC ERROR: Signal received
> > > > > [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [3]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > > > [3]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> > > > > [3]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared----------------------
> > > > > [0]PETSC ERROR: Signal received
> > > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > > > [0]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> > > > > [0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > > > [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > > > > --------------------------------------------------------------------------
> > > > > [m---------------------
> > > > > [2]PETSC ERROR: Signal received
> > > > > [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > > > [2]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> > > > > [2]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > > > [2]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > > > > --------------------------------------------------------------------------
> > > > > [m[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > > [1]PETSC ERROR: Signal received
> > > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > > > [1]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> > > > > [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > > > [1]PETSC ERROR: #1 User provided function() line 0 ilibraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> > > > > [3]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > > > > --------------------------------------------------------------------------
> > > > > [mpi::mpi-api::mpi-abort]
> > > > > MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
> > > > > with errorcode 59.
> > > > >
> > > > > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > > > You may or may not see output from other processes, depending on
> > > > > exactly when Open MPI kills them.
> > > > > --------------------------------------------------------------------------
> > > > > [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84) [0xffffffff11360404]
> > > > > [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c) [0xffffffff1110391c]
> > > > > [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-2pi::mpi-api::mpi-abort]
> > > > > MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
> > > > > with errorcode 59.
> > > > >
> > > > > ex13f90 error:
> > > > >
> > > > >
> > > > > [t00196@b04-036 tutorials]$ mpiexec -np 2 ./ex13f90
> > > > > jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> > > > > taken to (standard) corrective action, execution continuing.
> > > > > jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> > > > > taken to (standard) corrective action, execution continuing.
> > > > >  Hi! We're solving van der Pol using  2 processes.
> > > > >
> > > > >    t     x1         x2
> > > > > [1]PETSC ERROR: ------------------------------------------------------------------------
> > > > > [1]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
> > > > > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [0]PETSC ERROR: ------------------------------------------------------------------------
> > > > > [0]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
> > > > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > > > [1]PETSC ERROR: likely location of problem given in stack below
> > > > > [1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [1]PETSC ERROR:       is given.
> > > > > [1]PETSC ERROR: [1] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > > [0]PETSC ERROR: likely location of problem given in stack below
> > > > > [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > > > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > > > [0]PETSC ERROR:       is given.
> > > > > [0]PETSC ERROR: [0] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> > > > > [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > > [1]PETSC ERROR: Signal received
> > > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > > > > [1]PETSC ERROR: ./ex13f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:04:34 2016
> > > > > [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hyp
> > > > >
> > > > >
> > > > >
> > >
> 
> 


