[petsc-users] Changes on direct solver interface???

Xujun Zhao xzhao99 at gmail.com
Fri Apr 1 17:30:18 CDT 2016


No debug version is compiled right now, but I will build one and retry.
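
For the record, here is roughly what I plan to run to get a debug build and a
stack trace (a minimal sketch; the arch name and the binary name example-dbg
are just placeholders for my local setup, not the exact commands):

    cd $PETSC_DIR
    # reconfigure with debugging enabled, keeping the same external packages
    ./configure PETSC_ARCH=arch-darwin-c-debug --with-debugging=yes \
        --download-mpich --download-scalapack --download-mumps \
        --download-superlu_dist
    make PETSC_ARCH=arch-darwin-c-debug all
    # relink the application against the debug arch, then run under the debugger
    mpiexec -n 4 ./example-dbg -start_in_debugger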

A first look with -info suggests something goes wrong during PCSetUp for
SuperLU_DIST:

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374782

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374782

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374782

[0] MatStashScatterBegin_Private(): No of messages: 0

[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.

[1] MatStashScatterBegin_Private(): No of messages: 1

[1] MatStashScatterBegin_Private(): Mesg_to: 0: size: 28160 bytes

[1] MatAssemblyBegin_MPIAIJ(): Stash has 1760 entries, uses 0 mallocs.

[3] MatStashScatterBegin_Private(): No of messages: 2

[3] MatStashScatterBegin_Private(): Mesg_to: 0: size: 22528 bytes

[2] MatStashScatterBegin_Private(): No of messages: 2

[2] MatStashScatterBegin_Private(): Mesg_to: 0: size: 1056 bytes

[3] MatStashScatterBegin_Private(): Mesg_to: 2: size: 18656 bytes

[3] MatAssemblyBegin_MPIAIJ(): Stash has 2574 entries, uses 0 mallocs.

[2] MatStashScatterBegin_Private(): Mesg_to: 1: size: 36960 bytes

[2] MatAssemblyBegin_MPIAIJ(): Stash has 2376 entries, uses 0 mallocs.

[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 504 X 504; storage space: 0
unneeded,17634 used

[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[3] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 504) < 0.6. Do not use CompressedRow routines.

[3] MatSeqAIJCheckInode(): Found 278 nodes of 504. Limit used: 5. Using
Inode routines

[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 495 X 495; storage space: 355
unneeded,17115 used

[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 605 X 605; storage space: 1017
unneeded,22159 used

[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[2] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 495) < 0.6. Do not use CompressedRow routines.

[0] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 605) < 0.6. Do not use CompressedRow routines.

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 574 X 574; storage space: 795
unneeded,20096 used

[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[1] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 574) < 0.6. Do not use CompressedRow routines.

[2] MatSeqAIJCheckInode(): Found 261 nodes of 495. Limit used: 5. Using
Inode routines

[0] MatSeqAIJCheckInode(): Found 327 nodes of 605. Limit used: 5. Using
Inode routines

[1] MatSeqAIJCheckInode(): Found 298 nodes of 574. Limit used: 5. Using
Inode routines

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter

[0] VecScatterCreate(): General case: MPI to Seq

[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 495 X 146; storage space: 15
unneeded,1754 used

[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 40

[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 605 X 181; storage space: 30
unneeded,1703 used

[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 37

[0] MatCheckCompressedRow(): Found the ratio (num_zerorows
512)/(num_localrows 605) > 0.6. Use CompressedRow routines.

[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 574 X 140; storage space: 15
unneeded,1945 used

[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 31

[1] MatCheckCompressedRow(): Found the ratio (num_zerorows
409)/(num_localrows 574) > 0.6. Use CompressedRow routines.

[2] MatCheckCompressedRow(): Found the ratio (num_zerorows
378)/(num_localrows 495) > 0.6. Use CompressedRow routines.

[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 504 X 78; storage space: 0
unneeded,1378 used

[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 23

[3] MatCheckCompressedRow(): Found the ratio (num_zerorows
378)/(num_localrows 504) > 0.6. Use CompressedRow routines.

[0] MatStashScatterBegin_Private(): No of messages: 0

[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.

[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 605 X 605; storage space: 0
unneeded,22159 used

[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[1] MatStashScatterBegin_Private(): No of messages: 0

[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.

[0] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 605) < 0.6. Do not use CompressedRow routines.

[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 605 X 181; storage space: 0
unneeded,1703 used

[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 37

[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 574 X 574; storage space: 0
unneeded,20096 used

[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[1] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 574) < 0.6. Do not use CompressedRow routines.

[2] MatStashScatterBegin_Private(): No of messages: 0

[2] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.

[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 495 X 495; storage space: 0
unneeded,17115 used

[3] MatStashScatterBegin_Private(): No of messages: 0

[3] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.

[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 504 X 504; storage space: 0
unneeded,17634 used

[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[0] MatCheckCompressedRow(): Found the ratio (num_zerorows
512)/(num_localrows 605) > 0.6. Use CompressedRow routines.

[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 574 X 140; storage space: 0
unneeded,1945 used

[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 31

[1] MatCheckCompressedRow(): Found the ratio (num_zerorows
409)/(num_localrows 574) > 0.6. Use CompressedRow routines.

[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 59

[2] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 495) < 0.6. Do not use CompressedRow routines.

[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 495 X 146; storage space: 0
unneeded,1754 used

[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 40

[3] MatCheckCompressedRow(): Found the ratio (num_zerorows
0)/(num_localrows 504) < 0.6. Do not use CompressedRow routines.

[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 504 X 78; storage space: 0
unneeded,1378 used

[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0

[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 23

[3] MatCheckCompressedRow(): Found the ratio (num_zerorows
378)/(num_localrows 504) > 0.6. Use CompressedRow routines.

[2] MatCheckCompressedRow(): Found the ratio (num_zerorows
378)/(num_localrows 495) > 0.6. Use CompressedRow routines.

[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.

[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.

[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.

[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.

[0] PCSetUp(): Setting up PC for first time

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780

[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777

[1]PETSC ERROR:
------------------------------------------------------------------------

[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range

[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

[1]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors

[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
run

[1]PETSC ERROR: to get more information on the crash.

[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------

[1]PETSC ERROR: Signal received

[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.

[1]PETSC ERROR: Petsc Release Version 3.6.3, unknown

[1]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
mcswl105.mcs.anl.gov by xzhao Fri Apr  1 17:26:55 2016

[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
--download-fblaslapack --download-scalapack --download-mumps
--download-superlu_dist --download-hypre --download-ml --download-metis
--download-parmetis --download-triangle --download-chaco
--download-elemental --with-debugging=0

[1]PETSC ERROR: #1 User provided function() line 0 in  unknown file

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1

[cli_1]: aborting job:

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1

[3]PETSC ERROR:
------------------------------------------------------------------------

[3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range

[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

[3]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors

[3]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
run

[3]PETSC ERROR: to get more information on the crash.

[3]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------

[3]PETSC ERROR: Signal received

[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.

[3]PETSC ERROR: Petsc Release Version 3.6.3, unknown

[3]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
mcswl105.mcs.anl.gov by xzhao Fri Apr  1 17:26:55 2016

[3]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
--download-fblaslapack --download-scalapack --download-mumps
--download-superlu_dist --download-hypre --download-ml --download-metis
--download-parmetis --download-triangle --download-chaco
--download-elemental --with-debugging=0

[3]PETSC ERROR: #1 User provided function() line 0 in  unknown file

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3

[cli_3]: aborting job:

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3

[0]PETSC ERROR:
------------------------------------------------------------------------

[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range

[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors

[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
run

[0]PETSC ERROR: to get more information on the crash.

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------

[0]PETSC ERROR: Signal received

[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.

[0]PETSC ERROR: Petsc Release Version 3.6.3, unknown

[0]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
mcswl105.mcs.anl.gov by xzhao Fri Apr  1 17:26:55 2016

[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
--download-fblaslapack --download-scalapack --download-mumps
--download-superlu_dist --download-hypre --download-ml --download-metis
--download-parmetis --download-triangle --download-chaco
--download-elemental --with-debugging=0

[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

[cli_0]: aborting job:

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

[2]PETSC ERROR:
------------------------------------------------------------------------

[2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range

[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

[2]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors

[2]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
run

[2]PETSC ERROR: to get more information on the crash.

[2]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------

[2]PETSC ERROR: Signal received

[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.

[2]PETSC ERROR: Petsc Release Version 3.6.3, unknown

[2]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
mcswl105.mcs.anl.gov by xzhao Fri Apr  1 17:26:55 2016

[2]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
--download-fblaslapack --download-scalapack --download-mumps
--download-superlu_dist --download-hypre --download-ml --download-metis
--download-parmetis --download-triangle --download-chaco
--download-elemental --with-debugging=0

[2]PETSC ERROR: #1 User provided function() line 0 in  unknown file

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 2

[cli_2]: aborting job:

application called MPI_Abort(MPI_COMM_WORLD, 59) - process 2
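
In case it helps, the direct solver is selected through libMesh, which should
be equivalent to running with the following PETSc 3.6 options (a sketch of the
command-line equivalent, not my exact invocation):

    # LU factorization with SuperLU_DIST as the backend
    mpiexec -n 4 ./example-opt -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist -info
    # MUMPS crashes the same way
    mpiexec -n 4 ./example-opt -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps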

On Fri, Apr 1, 2016 at 5:13 PM, Satish Balay <balay at mcs.anl.gov> wrote:

> please run the code in a debugger and obtain a stack trace.
>
> Satish
>
> On Fri, 1 Apr 2016, Xujun Zhao wrote:
>
> > Hi all,
> >
> > Are there any changes to the PETSc direct solver interfaces, for example
> > for SuperLU_DIST and MUMPS? I found that my libMesh code fails with both
> > of them, but still works with the iterative solver (GMRES by default).
> > Thanks a lot.
> >
> > Best,
> > Xujun
> >
> >
> --------------------------------------------------------------------------------------------------------------
> >
> > [0]PETSC ERROR:
> > ------------------------------------------------------------------------
> >
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> > probably memory access out of range
> >
> > [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> >
> > [0]PETSC ERROR: or see
> > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> >
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X
> > to find memory corruption errors
> >
> > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
> and
> > run
> >
> > [0]PETSC ERROR: to get more information on the crash.
> >
> > [0]PETSC ERROR: --------------------- Error Message
> > --------------------------------------------------------------
> >
> > [0]PETSC ERROR: Signal received
> >
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for
> > trouble shooting.
> >
> > [0]PETSC ERROR: Petsc Release Version 3.6.3, unknown
> >
> > [0]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
> > mcswl105.mcs.anl.gov by xzhao Fri Apr  1 15:57:40 2016
> >
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> > --with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
> > --download-fblaslapack --download-scalapack --download-mumps
> > --download-superlu_dist --download-hypre --download-ml --download-metis
> > --download-parmetis --download-triangle --download-chaco
> > --download-elemental --with-debugging=0
> >
> > [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> >
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> >
> > [cli_0]: aborting job:
> >
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> >
> > [1]PETSC ERROR:
> > ------------------------------------------------------------------------
> >
> > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> > probably memory access out of range
> >
> > [1]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> >
> > [1]PETSC ERROR: or see
> > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> >
> > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X
> > to find memory corruption errors
> >
> > [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
> and
> > run
> >
> > [1]PETSC ERROR: to get more information on the crash.
> >
> > [1]PETSC ERROR: --------------------- Error Message
> > --------------------------------------------------------------
> >
> > [1]PETSC ERROR: Signal received
> >
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for
> > trouble shooting.
> >
> > [1]PETSC ERROR: Petsc Release Version 3.6.3, unknown
> >
> > [1]PETSC ERROR: ./example-opt on a arch-darwin-c-opt named
> > mcswl105.mcs.anl.gov by xzhao Fri Apr  1 15:57:40 2016
> >
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> > --with-fc=gfortran --with-cxx-dialect=C++11 --download-mpich
> > --download-fblaslapack --download-scalapack --download-mumps
> > --download-superlu_dist --download-hypre --download-ml --download-metis
> > --download-parmetis --download-triangle --download-chaco
> > --download-elemental --with-debugging=0
> >
> > [1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> >
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
> >
> > [cli_1]: aborting job:
> >
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
> >
> >
> >
> -------------------------------------------------------------------------------------------------------------------
> >
> > | Processor id:   0
> > | Num Processors: 4
> > | Time:           Fri Apr  1 15:57:40 2016
> > | OS:             Darwin
> > | HostName:       mcswl105.mcs.anl.gov
> > | OS Release:     15.2.0
> > | OS Version:     Darwin Kernel Version 15.2.0: Fri Nov 13 19:56:56 PST 2015; root:xnu-3248.20.55~2/RELEASE_X86_64
> > | Machine:        x86_64
> > | Username:       xzhao
> > | Configuration:  ./configure '--prefix=/Users/xzhao/software/libmesh/libmesh-dev'
> > |                 '--with-methods=opt'
> > |                 '--enable-everything'
> > |                 '--enable-parmesh'
> > |                 '--disable-strict-lgpl'
> > |                 '--with-vtk-include=/usr/local/include/vtk-6.2'
> > |                 '--with-vtk-lib=/usr/local/lib'
> > |                 'PETSC_DIR=/Users/xzhao/software/petsc/petsc-dev'
> > |                 'PETSC_ARCH=arch-darwin-c-opt'
> > |                 'SLEPC_DIR=/Users/xzhao/software/slepc/slepc-3.6'
> >
> >
> -------------------------------------------------------------------------------------------------------------------
> >
>
>