[petsc-users] [petsc-dev] does petsc filter out zeros in MatSetValues?
Xuefei (Rebecca) Yuan
xyuan at lbl.gov
Thu Jan 26 23:13:57 CST 2012
Here is another error message when running on my local Mac:
*************petsc-Dev = yes*****************
*********************************************
******* start solving for time = 0.10000 at time step = 1******
******* start solving for time = 0.10000 at time step = 1******
0 SNES Function norm 2.452320964164e-02
0 SNES Function norm 2.452320964164e-02
Matrix Object: 1 MPI processes
type: seqaij
rows=16384, cols=16384
total: nonzeros=831552, allocated nonzeros=1577536
total number of mallocs used during MatSetValues calls =0
using I-node routines: found 4096 nodes, limit used is 5
Matrix Object: 1 MPI processes
type: seqaij
rows=16384, cols=16384
total: nonzeros=831552, allocated nonzeros=1577536
total number of mallocs used during MatSetValues calls =0
using I-node routines: found 4096 nodes, limit used is 5
Runtime parameters:
Objective type: Unknown!
Coarsening type: Unknown!
Initial partitioning type: Unknown!
Refinement type: Unknown!
Number of balancing constraints: 1
Number of refinement iterations: 1606408608
Random number seed: 1606408644
Number of separators: 48992256
Compress graph prior to ordering: Yes
Detect & order connected components separately: Yes
Prunning factor for high degree vertices: 0.100000
Allowed maximum load imbalance: 1.001
Input Error: Incorrect objective type.
nbrpool statistics
nbrpoolsize: 0 nbrpoolcpos: 0
nbrpoolreallocs: 0
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] MatLUFactorNumeric_SuperLU_DIST line 284 /Users/xyuan/Software_macbook/petsc-dev/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c
[0]PETSC ERROR: [0] MatLUFactorNumeric line 2871 /Users/xyuan/Software_macbook/petsc-dev/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCSetUp_LU line 108 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: [0] PCSetUp line 810 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 184 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 334 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] SNES_KSPSolve line 3874 /Users/xyuan/Software_macbook/petsc-dev/src/snes/interface/snes.c
[0]PETSC ERROR: [0] SNESSolve_LS line 593 /Users/xyuan/Software_macbook/petsc-dev/src/snes/impls/ls/ls.c
[0]PETSC ERROR: [0] SNESSolve line 3061 /Users/xyuan/Software_macbook/petsc-dev/src/snes/interface/snes.c
[0]PETSC ERROR: [0] DMMGSolveSNES line 538 /Users/xyuan/Software_macbook/petsc-dev/src/snes/utils/damgsnes.c
[0]PETSC ERROR: [0] DMMGSolve line 303 /Users/xyuan/Software_macbook/petsc-dev/src/snes/utils/damg.c
[0]PETSC ERROR: [0] Solve line 374 twcartffxmhd.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development HG revision: 905af3a7d7cdee7d0b744502bace1d74dc34b204 HG Date: Sun Jan 22 16:10:04 2012 -0700
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./twcartffxmhd.exe on a arch-osx- named DOE6897708.local by xyuan Thu Jan 26 21:09:47 2012
[0]PETSC ERROR: Libraries linked from /Users/xyuan/Software_macbook/petsc-dev/arch-osx-10.6-c-pkgs-opt-debug/lib
[0]PETSC ERROR: Configure run at Mon Jan 23 10:21:17 2012
[0]PETSC ERROR: Configure options --with-cc="gcc -m64" --with-fc="gfortran -m64" --with-cxx=g++ --with-debugging=1 -download-f-blas-lapack=1 --download-mpich=1 --download-plapack=1 --download-parmetis=1 --download-metis=1 --download-triangle=1 --download-spooles=1 --download-superlu=1 --download-superlu_dist=/Users/xyuan/Software_macbook/superlu_dist_3.0.tar.gz --download-blacs=1 --download-scalapack=1 --download-mumps=1 --download-hdf5=1 --download-sundials=1 --download-prometheus=1 --download-umfpack=1 --download-chaco=1 --download-spai=1 --download-ptscotch=1 --download-pastix=1 --download-prometheus=1 --download-cmake=1 PETSC_ARCH=arch-osx-10.6-c-pkgs-opt-debug
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
Runtime parameters:
Objective type: Unknown!
Coarsening type: Unknown!
Initial partitioning type: Unknown!
Refinement type: Unknown!
Number of balancing constraints: 1
Number of refinement iterations: 1606408608
Random number seed: 1606408644
Number of separators: 48992256
Compress graph prior to ordering: Yes
Detect & order connected components separately: Yes
Prunning factor for high degree vertices: 0.100000
Allowed maximum load imbalance: 1.001
Input Error: Incorrect objective type.
nbrpool statistics
nbrpoolsize: 0 nbrpoolcpos: 0
nbrpoolreallocs: 0
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] MatLUFactorNumeric_SuperLU_DIST line 284 /Users/xyuan/Software_macbook/petsc-dev/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c
[0]PETSC ERROR: [0] MatLUFactorNumeric line 2871 /Users/xyuan/Software_macbook/petsc-dev/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCSetUp_LU line 108 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: [0] PCSetUp line 810 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 184 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 334 /Users/xyuan/Software_macbook/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] SNES_KSPSolve line 3874 /Users/xyuan/Software_macbook/petsc-dev/src/snes/interface/snes.c
[0]PETSC ERROR: [0] SNESSolve_LS line 593 /Users/xyuan/Software_macbook/petsc-dev/src/snes/impls/ls/ls.c
[0]PETSC ERROR: [0] SNESSolve line 3061 /Users/xyuan/Software_macbook/petsc-dev/src/snes/interface/snes.c
[0]PETSC ERROR: [0] DMMGSolveSNES line 538 /Users/xyuan/Software_macbook/petsc-dev/src/snes/utils/damgsnes.c
[0]PETSC ERROR: [0] DMMGSolve line 303 /Users/xyuan/Software_macbook/petsc-dev/src/snes/utils/damg.c
[0]PETSC ERROR: [0] Solve line 374 twcartffxmhd.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development HG revision: 905af3a7d7cdee7d0b744502bace1d74dc34b204 HG Date: Sun Jan 22 16:10:04 2012 -0700
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./twcartffxmhd.exe on a arch-osx- named DOE6897708.local by xyuan Thu Jan 26 21:09:47 2012
[0]PETSC ERROR: Libraries linked from /Users/xyuan/Software_macbook/petsc-dev/arch-osx-10.6-c-pkgs-opt-debug/lib
[0]PETSC ERROR: Configure run at Mon Jan 23 10:21:17 2012
[0]PETSC ERROR: Configure options --with-cc="gcc -m64" --with-fc="gfortran -m64" --with-cxx=g++ --with-debugging=1 -download-f-blas-lapack=1 --download-mpich=1 --download-plapack=1 --download-parmetis=1 --download-metis=1 --download-triangle=1 --download-spooles=1 --download-superlu=1 --download-superlu_dist=/Users/xyuan/Software_macbook/superlu_dist_3.0.tar.gz --download-blacs=1 --download-scalapack=1 --download-mumps=1 --download-hdf5=1 --download-sundials=1 --download-prometheus=1 --download-umfpack=1 --download-chaco=1 --download-spai=1 --download-ptscotch=1 --download-pastix=1 --download-prometheus=1 --download-cmake=1 PETSC_ARCH=arch-osx-10.6-c-pkgs-opt-debug
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
R
On Jan 26, 2012, at 9:05 PM, Xuefei (Rebecca) Yuan wrote:
> Hello Mark,
>
> Actually, I have tried those options for a sequential run, where I need to use superlu to get the condition number of a matrix.
>
> In the dev version,
>
> ierr = DMCreateMatrix(DMMGGetDM(dmmg), MATAIJ, &jacobian);CHKERRQ(ierr);
> ierr = MatSetOption(jacobian, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);
>
> And in the options file,
>
> -dm_preallocate_only
>
> is added.
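>
> Putting those two pieces together, a minimal sketch of the assembly path (the DMMG setup is omitted and dmmg is from my code above; the MatSetValues lines are not from my code, only an illustration of what MAT_IGNORE_ZERO_ENTRIES is supposed to do with ADD_VALUES):
>
> Mat            jacobian;
> PetscInt       row = 0, col = 0;
> PetscScalar    zero = 0.0, one = 1.0;
> PetscErrorCode ierr;
>
> /* Create the Jacobian from the DM, then ask PETSc to drop explicit
>    zeros passed to MatSetValues (the two calls quoted above). */
> ierr = DMCreateMatrix(DMMGGetDM(dmmg), MATAIJ, &jacobian);CHKERRQ(ierr);
> ierr = MatSetOption(jacobian, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);
>
> /* With the option set, adding an explicit 0.0 should not create a new
>    stored location, so it does not count toward "nonzeros" in the
>    -mat_view_info output; a true nonzero is stored as usual. */
> ierr = MatSetValues(jacobian, 1, &row, 1, &col, &zero, ADD_VALUES);CHKERRQ(ierr); /* dropped */
> col  = 1;
> ierr = MatSetValues(jacobian, 1, &row, 1, &col, &one,  ADD_VALUES);CHKERRQ(ierr); /* stored  */
> ierr = MatAssemblyBegin(jacobian, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> ierr = MatAssemblyEnd(jacobian, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);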
>
> This works fine when np=1; however, when I use multiple processes, some memory corruption happens.
>
> For example, the number of true nonzeros for the 65536-size matrix is 1,470,802. The output (&) below is for np=1 with the following PETSc-related options:
>
> -dm_preallocate_only
> -snes_ksp_ew true
> -snes_monitor
> -snes_max_it 1
> -ksp_view
> -mat_view_info
> -ksp_type preonly
> -pc_type lu
> -pc_factor_mat_solver_package superlu
> -mat_superlu_conditionnumber
> -mat_superlu_printstat
>
>
> However, when np=2, the number of nonzeros changes to 3,366,976 with the following PETSc-related options; the corresponding output is (*). (A programmatic check of the stored counts is sketched after the list.)
>
> -dm_preallocate_only
> -snes_ksp_ew true
> -snes_monitor
> -ksp_view
> -mat_view_info
> -ksp_type preonly
> -pc_type lu
> -pc_factor_mat_solver_package superlu_dist
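>
> A quick way to verify the stored vs. allocated counts from the code itself (a sketch; MatGetInfo reports the same numbers that -mat_view_info prints, "jacobian" is the matrix created above, and the call must come after MatAssemblyEnd):
>
> MatInfo info;
> ierr = MatGetInfo(jacobian, MAT_GLOBAL_SUM, &info);CHKERRQ(ierr);
> ierr = PetscPrintf(PETSC_COMM_WORLD, "nonzeros used %g, allocated %g\n",
>                    info.nz_used, info.nz_allocated);CHKERRQ(ierr);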
>
> -----------------------------
> (&)
>
> *************petsc-Dev = yes*****************
> *********************************************
> ******* start solving for time = 1.00000 at time step = 1******
> 0 SNES Function norm 1.242539468950e-02
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=65536, cols=65536
> total: nonzeros=1470802, allocated nonzeros=2334720
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> Recip. condition number = 4.345658e-07
> MatLUFactorNumeric_SuperLU():
> Factor time = 42.45
> Factor flops = 7.374620e+10 Mflops = 1737.25
> Solve time = 0.00
> Number of memory expansions: 3
> No of nonzeros in factor L = 32491856
> No of nonzeros in factor U = 39390974
> No of nonzeros in L+U = 71817294
> L\U MB 741.397 total MB needed 756.339
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=65536, cols=65536
> package used to perform factorization: superlu
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> SuperLU run parameters:
> Equil: NO
> ColPerm: 3
> IterRefine: 0
> SymmetricMode: NO
> DiagPivotThresh: 1
> PivotGrowth: NO
> ConditionNumber: YES
> RowPerm: 0
> ReplaceTinyPivot: NO
> PrintStat: YES
> lwork: 0
> MatSolve__SuperLU():
> Factor time = 42.45
> Factor flops = 7.374620e+10 Mflops = 1737.25
> Solve time = 0.59
> Solve flops = 1.436365e+08 Mflops = 243.45
> Number of memory expansions: 3
> 1 SNES Function norm 2.645145585949e-04
>
>
> -----------------------------------------------
> (*)
> *************petsc-Dev = yes*****************
> *********************************************
> ******* start solving for time = 1.00000 at time step = 1******
> 0 SNES Function norm 1.242539468950e-02
> Matrix Object: 2 MPI processes
> type: mpiaij
> rows=65536, cols=65536
> total: nonzeros=3366976, allocated nonzeros=6431296
> total number of mallocs used during MatSetValues calls =0
> Matrix Object: 2 MPI processes
> type: mpiaij
> rows=65536, cols=65536
> total: nonzeros=3366976, allocated nonzeros=3366976
> total number of mallocs used during MatSetValues calls =0
> using I-node (on process 0) routines: found 8192 nodes, limit used is 5
> Input Error: Incorrect objective type.
> Input Error: Incorrect objective type.
> At column 0, pivotL() encounters zero diagonal at line 708 in file symbfact.c
> At column 0, pivotL() encounters zero diagonal at line 708 in file symbfact.c
>
> Moreover, when I use valgrind with --leak-check=yes --track-origins=yes, there are 441 errors from 219 contexts in PetscInitialize(), before SNESSolve() is even called. Is this normal for dev?
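>
> For reference, the sort of command line I mean (a sketch; the executable name is the one from the log above, and -malloc off just keeps PETSc's own allocator from confusing valgrind, as the PETSc FAQ suggests):
>
> mpiexec -n 2 valgrind --leak-check=yes --track-origins=yes ./twcartffxmhd.exe -malloc off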
>
> Thanks very much!
>
> Best regards,
>
> Rebecca
>
>
>
>
>
>
> On Jan 26, 2012, at 5:06 PM, Jed Brown wrote:
>
>> On Thu, Jan 26, 2012 at 19:00, Mark F. Adams <mark.adams at columbia.edu> wrote:
>> I'm guessing that PETSc recently changed and now filters out 0.0 in MatSetValues ... is this true?
>>
>> Did the option MAT_IGNORE_ZERO_ENTRIES get set somehow?
>