[petsc-dev] Error when calling PETSc

Hector E Barrios Molano hectorb at utexas.edu
Mon Jun 22 19:33:45 CDT 2015


Hi Satish,

I ran this program with petsc-master on Linux under Valgrind and idb.

Here is how I am calling PETSc:

          call MatCreateBAIJ(MPI_COMM_WORLD,nblock/2,N_mat,N_mat,
     >         PETSC_DECIDE,PETSC_DECIDE,PETSC_DEFAULT_INTEGER,PNNZD,
     >         PETSC_DEFAULT_INTEGER,PNNZO,mat,ierr)
          call MatSetup(mat,ierr)
          call MatSetOption(mat, MAT_NEW_NONZERO_ALLOCATION_ERR,
     >         PETSC_FALSE)

Any idea what is causing this error? Both idb and valgrind point to matrixf.c:890.

Thanks,

Here is idb output:

matsetoption_ (mat=0x36e1e48, op=0x32c2a20, flg=0x32c296c,
__ierr=0x32c296c) at
/home/hector/dwnld_prog/petsc-master/src/mat/interface/ftn-auto/matrixf.c:890
890    *__ierr = MatSetOption(


And this is the valgrind output:

$ valgrind --track-origins=yes --leak-check=full ./prog -d 3comp.dat
==18555== Memcheck, a memory error detector
==18555== Copyright (C) 2002-2013, and GNU GPL'd, by Julian Seward et al.
==18555== Using Valgrind-3.10.1 and LibVEX; rerun with -h for copyright info
==18555== Command: ./prog -d 3comp.dat
==18555==
==18555== Warning: set address range perms: large range [0x36fc000,
0x2e2b7000) (defined)
 NUMPRC:            1
  PSTD, TRSTD =    14.6960000000000        519.670000000000       in INFLUID

       TIME         TIMESTEP    ITERATION   FLASH_RES     VOLUME_RES     MASS_RES      WATER_RES
 going into SOLVERS
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See
http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.6-34-gc9ff9ea  GIT
Date: 2015-06-17 23:52:21 -0500
[0]PETSC ERROR: ./prog on a linux-intel named petros.cpge.utexas.edu
by hector Mon Jun 22 19:25:59 2015
[0]PETSC ERROR: Configure options
--prefix=/home/hector/installed/petsc3.6-intel
--PETSC_ARCH=linux-intel
--with-mpi-dir=/home/hector/installed/openmpi-intel/
--with-parmetis-dir=/home/hector/installed/parmetis/
--with-metis-dir=/home/hector/installed/parmetis/
--with-zoltan-dir=/home/hector/installed/zoltan/ --download-ptscotch
--download-hypre
--with-blas-lapack-dir=/share/apps/intel/composer_xe_2011_sp1.9.293/mkl/
--with-valgrind=1 --with-valgrind-dir=/home/hector/installed
--with-shared-libraries=0
[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
==18555==
==18555== HEAP SUMMARY:
==18555==     in use at exit: 52,616,185 bytes in 7,935 blocks
==18555==   total heap usage: 121,248 allocs, 113,313 frees,
69,099,609 bytes allocated
==18555==
==18555== 240 bytes in 6 blocks are definitely lost in loss record
5,950 of 6,383
==18555==    at 0x2ECBEAF0: malloc (vg_replace_malloc.c:296)
==18555==    by 0x324D4798E1: strdup (in /lib64/libc-2.5.so)
==18555==    by 0x331E1A27: opal_show_help_vstring (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x32F30F53: orte_show_help (in
/home/hector/installed/openmpi-intel/lib/libopen-rte.so.7.0.5)
==18555==    by 0x32A5F709: PMPI_Abort (in
/home/hector/installed/openmpi-intel/lib/libmpi.so.1.6.1)
==18555==    by 0x245A9F3: PetscSignalHandlerDefault (signal.c:161)
==18555==    by 0x245A356: PetscSignalHandler_Private (signal.c:49)
==18555==    by 0x324D4302EF: ??? (in /lib64/libc-2.5.so)
==18555==    by 0x20A104E: matsetoption_ (matrixf.c:890)
==18555==    by 0x1BE0576: jpetsc3d_ (xsolver.f:382)
==18555==    by 0x1BDC9A3: xsolver_ (xsolver.f:104)
==18555==    by 0x1AB1BB3: solvers_ (xsolve.f:60)
==18555==
==18555== 336 bytes in 1 blocks are possibly lost in loss record 6,043 of 6,383
==18555==    at 0x2ECC0867: calloc (vg_replace_malloc.c:623)
==18555==    by 0x324D0100A2: _dl_allocate_tls (in /lib64/ld-2.5.so)
==18555==    by 0x324E006EB8: pthread_create@@GLIBC_2.2.5 (in
/lib64/libpthread-2.5.so)
==18555==    by 0x331BABB5: opal_thread_start (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x32F43A18: orte_ess_base_app_setup (in
/home/hector/installed/openmpi-intel/lib/libopen-rte.so.7.0.5)
==18555==    by 0x34C503DE: rte_init (in
/home/hector/installed/openmpi-intel/lib/openmpi/mca_ess_singleton.so)
==18555==    by 0x32F222F5: orte_init (in
/home/hector/installed/openmpi-intel/lib/libopen-rte.so.7.0.5)
==18555==    by 0x32A4D67E: ompi_mpi_init (in
/home/hector/installed/openmpi-intel/lib/libmpi.so.1.6.1)
==18555==    by 0x32A7052E: PMPI_Init (in
/home/hector/installed/openmpi-intel/lib/libmpi.so.1.6.1)
==18555==    by 0x3122CB27: PMPI_INIT (in
/home/hector/installed/openmpi-intel/lib/libmpi_mpifh.so.2.5.0)
==18555==    by 0x1C89281: setprcs_ (man.f:144)
==18555==    by 0x1C4CB98: MAIN__ (main.f:114)
==18555==
==18555== 7,550 bytes in 111 blocks are definitely lost in loss record
6,243 of 6,383
==18555==    at 0x2ECC0A3D: realloc (vg_replace_malloc.c:692)
==18555==    by 0x324D469725: vasprintf (in /lib64/libc-2.5.so)
==18555==    by 0x324D44D727: asprintf (in /lib64/libc-2.5.so)
==18555==    by 0x331E913D: dlopen_foreachfile (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x331CCBC0: mca_base_component_find (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x331D5EC0: mca_base_framework_components_register (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x331D62DE: mca_base_framework_register (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x331D6331: mca_base_framework_open (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x331B71E8: opal_init (in
/home/hector/installed/openmpi-intel/lib/libopen-pal.so.6.2.1)
==18555==    by 0x32F221C4: orte_init (in
/home/hector/installed/openmpi-intel/lib/libopen-rte.so.7.0.5)
==18555==    by 0x32A4D67E: ompi_mpi_init (in
/home/hector/installed/openmpi-intel/lib/libmpi.so.1.6.1)
==18555==    by 0x32A7052E: PMPI_Init (in
/home/hector/installed/openmpi-intel/lib/libmpi.so.1.6.1)
==18555==
==18555== LEAK SUMMARY:
==18555==    definitely lost: 7,790 bytes in 117 blocks
==18555==    indirectly lost: 0 bytes in 0 blocks
==18555==      possibly lost: 336 bytes in 1 blocks
==18555==    still reachable: 52,608,059 bytes in 7,817 blocks
==18555==         suppressed: 0 bytes in 0 blocks
==18555== Reachable blocks (those to which a pointer was found) are not shown.
==18555== To see them, rerun with: --leak-check=full --show-leak-kinds=all
==18555==
==18555== For counts of detected and suppressed errors, rerun with: -v
==18555== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 4 from 4)

Hector

On Fri, Jun 19, 2015 at 2:29 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>
> PETSc thinks there is an error in user code. It's best if you run it in
> a debugger and check what's going on. [On Linux one can also use
> valgrind to track down problems.]
>
> There are many changes from petsc-3.1 to petsc-3.6. Make sure
> you go through the changes files. In C one can rely on the compiler's
> prototype checks to catch most of the changes. But with Fortran you
> might have to rely on a debugger/valgrind [and the changes docs]
> to find the changes.
>
> http://www.mcs.anl.gov/petsc/documentation/changes/index.html
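>
> For instance, a minimal sketch of a Fortran call with the trailing
> ierr argument and an error check (assuming a .F source that includes
> the PETSc Fortran headers, so CHKERRQ is available):
>
>       call MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR,
>      >     PETSC_FALSE, ierr)
>       CHKERRQ(ierr)
>
> PETSc Fortran routines take ierr as their final argument, and a
> missing trailing ierr is exactly the kind of mismatch the Fortran
> compiler cannot catch.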
>
> Satish
>
> On Fri, 19 Jun 2015, Hector E Barrios Molano wrote:
>
>> Hi Satish,
>>
>> I used MatSetOption and am ignoring the performance penalty for now.
>>
>> Now I have this error:
>>
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [0]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
>> OS X to find memory corruption errors
>> [0]PETSC ERROR: likely location of problem given in stack below
>> [0]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
>> [0]PETSC ERROR:       is given.
>> [0]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [0]PETSC ERROR: Signal received
>> [0]PETSC ERROR: See
>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> shooting.
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.6-34-gc9ff9ea  GIT
>> Date: 2015-06-17 23:52:21 -0500
>> [0]PETSC ERROR: F:\prog_win\prog\Debug\prog.exe on a windows-intel
>> named HECTOR-PC by hector Fri Jun 19 14:06:17 2015
>> [0]PETSC ERROR: Configure options --with-cc="win32fe cl"
>> --with-fc="win32fe ifort" --prefix=/cygdrive/c/Installed/petsc/
>> --PETSC_ARCH=windows-intel
>> --with-parmetis-include=/cygdrive/c/Installed/parmetis/include
>> --with-parmetis-lib=/cygdrive/c/Installed/parmetis/lib/parmetis.lib
>> --with-metis-include=/cygdrive/c/Installed/parmetis/include
>> --with-metis-lib=/cygdrive/c/Installed/parmetis/lib/metis.lib
>> --with-mpi-include="[/cygdrive/c/Installed/msmpi/Include,/cygdrive/c/Installed/msmpi/Include/x64]"
>> --with-mpi-lib="[/cygdrive/c/Installed/msmpi/Lib/x64/msmpi.lib,/cygdrive/c/Installed/msmpi/Lib/x64/msmpifec.lib]"
>> --with-blas-lapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_intel_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_core.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_sequential.lib]"
>> --with-scalapack-include=/cygdrive/c/Installed/mkl/include
>> --with-scalapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_scalapack_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_blacs_msmpi_lp64.lib]"
>> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>>
>> job aborted:
>> [ranks] message
>>
>> [0] application aborted
>> aborting MPI_COMM_WORLD (comm=0x44000000), error 59, comm rank 0
>>
>> On Fri, Jun 19, 2015 at 12:57 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>> > As the message says:
>> >
>> >> [0]PETSC ERROR: New nonzero at (0,20) caused a malloc
>> >> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>> >
>> > Note: mallocs make the code slow, so it's up to you to do a proper
>> > preallocation of memory, or disable the check as the above message
>> > indicates [and ignore the performance penalty for now].
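>> >
>> > For example, a rough Fortran sketch of per-block-row preallocation
>> > for a BAIJ matrix (the names and counts here are illustrative):
>> >
>> >       PetscInt d_nnz(mloc), o_nnz(mloc)
>> >       do i = 1, mloc
>> >          d_nnz(i) = 4    ! nonzero blocks in the diagonal portion
>> >          o_nnz(i) = 2    ! nonzero blocks in the off-diagonal portion
>> >       end do
>> >       call MatCreateBAIJ(PETSC_COMM_WORLD, bs, mloc*bs, mloc*bs,
>> >      >     PETSC_DECIDE, PETSC_DECIDE, PETSC_DEFAULT_INTEGER, d_nnz,
>> >      >     PETSC_DEFAULT_INTEGER, o_nnz, A, ierr)
>> >
>> > With correct counts, MatSetValues never has to malloc.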
>> >
>> > Satish
>> >
>> > On Fri, 19 Jun 2015, Hector E Barrios Molano wrote:
>> >
>> >> Hi PETSc Experts,
>> >>
>> >> I'm compiling a Windows program using petsc-dev. This program
>> >> previously used petsc-3.1; the only change made, as the PETSc
>> >> documentation suggests, was:
>> >>
>> >> MatCreateMPIBAIJ --> MatCreateBAIJ
>> >>
>> >> I only changed the name of the function.
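>> >>
>> >> In Fortran the rename keeps the argument list unchanged; a sketch
>> >> of before and after:
>> >>
>> >>       ! petsc-3.1
>> >>       call MatCreateMPIBAIJ(MPI_COMM_WORLD,nblock/2,N_mat,N_mat,
>> >>      >     PETSC_DECIDE,PETSC_DECIDE,PETSC_DEFAULT_INTEGER,PNNZD,
>> >>      >     PETSC_DEFAULT_INTEGER,PNNZO,mat,ierr)
>> >>       ! petsc-3.6 / petsc-dev
>> >>       call MatCreateBAIJ(MPI_COMM_WORLD,nblock/2,N_mat,N_mat,
>> >>      >     PETSC_DECIDE,PETSC_DECIDE,PETSC_DEFAULT_INTEGER,PNNZD,
>> >>      >     PETSC_DEFAULT_INTEGER,PNNZO,mat,ierr)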
>> >>
>> >> I hit a problem when the program launches and calls PETSc:
>> >>
>> >> [0]PETSC ERROR: --------------------- Error Message
>> >> --------------------------------------------------------------
>> >> [0]PETSC ERROR: Argument out of range
>> >> [0]PETSC ERROR: New nonzero at (0,20) caused a malloc
>> >> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to
>> >> turn off this check
>> >> [0]PETSC ERROR: See
>> >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> >> shooting.
>> >> [0]PETSC ERROR: Petsc Development GIT revision: v3.6-34-gc9ff9ea  GIT
>> >> Date: 2015-06-17 23:52:21 -0500
>> >> [0]PETSC ERROR: F:\prog_win\prog\Debug\prog.exe on a windows-intel
>> >> named HECTOR-PC by hector Thu Jun 18 18:31:42 2015
>> >> [0]PETSC ERROR: Configure options --with-cc="win32fe cl"
>> >> --with-fc="win32fe ifort" --prefix=/cygdrive/c/Installed/petsc/
>> >> --PETSC_ARCH=windows-intel
>> >> --with-parmetis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-parmetis-lib=/cygdrive/c/Installed/parmetis/lib/parmetis.lib
>> >> --with-metis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-metis-lib=/cygdrive/c/Installed/parmetis/lib/metis.lib
>> >> --with-mpi-include="[/cygdrive/c/Installed/msmpi/Include,/cygdrive/c/Installed/msmpi/Include/x64]"
>> >> --with-mpi-lib="[/cygdrive/c/Installed/msmpi/Lib/x64/msmpi.lib,/cygdrive/c/Installed/msmpi/Lib/x64/msmpifec.lib]"
>> >> --with-blas-lapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_intel_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_core.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_sequential.lib]"
>> >> --with-scalapack-include=/cygdrive/c/Installed/mkl/include
>> >> --with-scalapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_scalapack_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_blacs_msmpi_lp64.lib]"
>> >> [0]PETSC ERROR: #1 MatSetValues_SeqBAIJ() line 2192 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: #2 MatSetValues() line 1175 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: --------------------- Error Message
>> >> --------------------------------------------------------------
>> >> [0]PETSC ERROR: Argument out of range
>> >> [0]PETSC ERROR: New nonzero at (1,21) caused a malloc
>> >> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to
>> >> turn off this check
>> >> [0]PETSC ERROR: See
>> >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> >> shooting.
>> >> [0]PETSC ERROR: Petsc Development GIT revision: v3.6-34-gc9ff9ea  GIT
>> >> Date: 2015-06-17 23:52:21 -0500
>> >> [0]PETSC ERROR: F:\prog_win\prog\Debug\prog.exe on a windows-intel
>> >> named HECTOR-PC by hector Thu Jun 18 18:31:42 2015
>> >> [0]PETSC ERROR: Configure options --with-cc="win32fe cl"
>> >> --with-fc="win32fe ifort" --prefix=/cygdrive/c/Installed/petsc/
>> >> --PETSC_ARCH=windows-intel
>> >> --with-parmetis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-parmetis-lib=/cygdrive/c/Installed/parmetis/lib/parmetis.lib
>> >> --with-metis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-metis-lib=/cygdrive/c/Installed/parmetis/lib/metis.lib
>> >> --with-mpi-include="[/cygdrive/c/Installed/msmpi/Include,/cygdrive/c/Installed/msmpi/Include/x64]"
>> >> --with-mpi-lib="[/cygdrive/c/Installed/msmpi/Lib/x64/msmpi.lib,/cygdrive/c/Installed/msmpi/Lib/x64/msmpifec.lib]"
>> >> --with-blas-lapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_intel_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_core.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_sequential.lib]"
>> >> --with-scalapack-include=/cygdrive/c/Installed/mkl/include
>> >> --with-scalapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_scalapack_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_blacs_msmpi_lp64.lib]"
>> >> [0]PETSC ERROR: #3 MatSetValues_SeqBAIJ() line 2192 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: #4 MatSetValues() line 1175 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> .
>> >> .
>> >> .
>> >> [0]PETSC ERROR: #3599 MatSetValues_SeqBAIJ() line 2192 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: #3600 MatSetValues() line 1175 in
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR:
>> >> ------------------------------------------------------------------------
>> >> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> >> probably memory access out of range
>> >> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> >> [0]PETSC ERROR: or see
>> >> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> >> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
>> >> OS X to find memory corruption errors
>> >> [0]PETSC ERROR: likely location of problem given in stack below
>> >> [0]PETSC ERROR: ---------------------  Stack Frames
>> >> ------------------------------------
>> >> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>> >> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
>> >> [0]PETSC ERROR:       is given.
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: [0] MatSetValues_SeqBAIJ line 2148
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\impls\baij\seq\baij.c
>> >> [0]PETSC ERROR: [0] MatSetValues line 1142
>> >> C:\INSTAL~1\PETSC-~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>> >> [0]PETSC ERROR: --------------------- Error Message
>> >> --------------------------------------------------------------
>> >> [0]PETSC ERROR: Signal received
>> >> [0]PETSC ERROR: See
>> >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> >> shooting.
>> >> [0]PETSC ERROR: Petsc Development GIT revision: v3.6-34-gc9ff9ea  GIT
>> >> Date: 2015-06-17 23:52:21 -0500
>> >> [0]PETSC ERROR: F:\prog_win\prog\Debug\prog.exe on a windows-intel
>> >> named HECTOR-PC by hector Fri Jun 19 12:43:06 2015
>> >> [0]PETSC ERROR: Configure options --with-cc="win32fe cl"
>> >> --with-fc="win32fe ifort" --prefix=/cygdrive/c/Installed/petsc/
>> >> --PETSC_ARCH=windows-intel
>> >> --with-parmetis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-parmetis-lib=/cygdrive/c/Installed/parmetis/lib/parmetis.lib
>> >> --with-metis-include=/cygdrive/c/Installed/parmetis/include
>> >> --with-metis-lib=/cygdrive/c/Installed/parmetis/lib/metis.lib
>> >> --with-mpi-include="[/cygdrive/c/Installed/msmpi/Include,/cygdrive/c/Installed/msmpi/Include/x64]"
>> >> --with-mpi-lib="[/cygdrive/c/Installed/msmpi/Lib/x64/msmpi.lib,/cygdrive/c/Installed/msmpi/Lib/x64/msmpifec.lib]"
>> >> --with-blas-lapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_intel_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_core.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_sequential.lib]"
>> >> --with-scalapack-include=/cygdrive/c/Installed/mkl/include
>> >> --with-scalapack-lib="[/cygdrive/c/Installed/mkl/lib/intel64/mkl_scalapack_lp64.lib,/cygdrive/c/Installed/mkl/lib/intel64/mkl_blacs_msmpi_lp64.lib]"
>> >> [0]PETSC ERROR: #3601 User provided function() line 0 in  unknown file
>> >>
>> >> job aborted:
>> >> [ranks] message
>> >>
>> >> [0] application aborted
>> >> aborting MPI_COMM_WORLD (comm=0x44000000), error 59, comm rank 0
>> >>
>> >>
>> >> I'm compiling the program in Visual Studio. What could be the problem?
>> >>
>> >> Thanks for your help,
>> >>
>> >> Hector
>> >>
>> >
>>
>


