[petsc-dev] does petsc filter out zeros in MatSetValues?

Xuefei Yuan (Rebecca) xyuan at lbl.gov
Thu Jan 26 23:05:05 CST 2012


Hello Mark,

Actually, I have tried those options for a sequential run, where I need to use SuperLU to get the condition number of a matrix.

In the dev version,

ierr = DMCreateMatrix(DMMGGetDM(dmmg), MATAIJ, &jacobian);CHKERRQ(ierr);
ierr = MatSetOption(jacobian, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);

And in the options file, 

-dm_preallocate_only

is added.
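
As I understand it, with MAT_IGNORE_ZERO_ENTRIES set, explicit zeros passed to MatSetValues should simply be dropped, so the assembled matrix only stores the true nonzeros. A minimal sketch of that expectation (the function name, the matrix A, and the indices are purely illustrative, not my actual assembly code):

#include <petscmat.h>

/* Illustrative only: A is assumed to be created and preallocated elsewhere,
   e.g. via DMCreateMatrix(). */
PetscErrorCode CheckZeroFiltering(Mat A)
{
  PetscInt       row = 0, cols[3] = {0, 1, 2};
  PetscScalar    vals[3] = {2.0, 0.0, -1.0};  /* the off-diagonal 0.0 should be dropped */
  MatInfo        info;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatSetOption(A, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);
  ierr = MatSetValues(A, 1, &row, 3, cols, vals, ADD_VALUES);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  /* nz_used should then count only the two true nonzeros inserted above. */
  ierr = MatGetInfo(A, MAT_LOCAL, &info);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "stored nonzeros: %g\n", info.nz_used);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}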

This works fine when np=1; however, when I use multiple processors, some memory corruption happens.

For example, the number of true nonzeros for the 65536-by-65536 matrix is 1,470,802. The output (&) below is for np=1 with the following PETSc-related options:

-dm_preallocate_only
-snes_ksp_ew true
-snes_monitor
-snes_max_it 1
-ksp_view
-mat_view_info
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_package superlu
-mat_superlu_conditionnumber
-mat_superlu_printstat


However, when np=2, the number of nonzeros changes to 3,366,976 with the following PETSc-related options; (*) below is the corresponding output.

-dm_preallocate_only
-snes_ksp_ew true
-snes_monitor
-ksp_view
-mat_view_info
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_package superlu_dist
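
For reference, each case is launched roughly as follows, with the respective options placed in an options file (the executable and file names here are just placeholders):

mpiexec -n 1 ./myapp -options_file seq.opts    # output (&), superlu
mpiexec -n 2 ./myapp -options_file par.opts    # output (*), superlu_dist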

-----------------------------
(&)

*************petsc-Dev = yes*****************
*********************************************
******* start solving for time = 1.00000 at time step = 1******
  0 SNES Function norm 1.242539468950e-02
Matrix Object: 1 MPI processes
  type: seqaij
  rows=65536, cols=65536
  total: nonzeros=1470802, allocated nonzeros=2334720
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines
  Recip. condition number = 4.345658e-07
MatLUFactorNumeric_SuperLU():
Factor time  =    42.45
Factor flops = 7.374620e+10     Mflops =  1737.25
Solve time   =     0.00
Number of memory expansions: 3
  No of nonzeros in factor L = 32491856
  No of nonzeros in factor U = 39390974
  No of nonzeros in L+U = 71817294
  L\U MB 741.397        total MB needed 756.339
Matrix Object: 1 MPI processes
  type: seqaij
  rows=65536, cols=65536
  package used to perform factorization: superlu
  total: nonzeros=0, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls =0
    SuperLU run parameters:
      Equil: NO
      ColPerm: 3
      IterRefine: 0
      SymmetricMode: NO
      DiagPivotThresh: 1
      PivotGrowth: NO
      ConditionNumber: YES
      RowPerm: 0
      ReplaceTinyPivot: NO
      PrintStat: YES
      lwork: 0
MatSolve__SuperLU():
Factor time  =    42.45
Factor flops = 7.374620e+10     Mflops =  1737.25
Solve time   =     0.59
Solve flops = 1.436365e+08      Mflops =   243.45
Number of memory expansions: 3
 1 SNES Function norm 2.645145585949e-04


-----------------------------------------------
(*)
*************petsc-Dev = yes*****************
*********************************************
******* start solving for time = 1.00000 at time step = 1******
  0 SNES Function norm 1.242539468950e-02
Matrix Object: 2 MPI processes
  type: mpiaij
  rows=65536, cols=65536
  total: nonzeros=3366976, allocated nonzeros=6431296
  total number of mallocs used during MatSetValues calls =0
    Matrix Object:     2 MPI processes
      type: mpiaij
      rows=65536, cols=65536
      total: nonzeros=3366976, allocated nonzeros=3366976
      total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 8192 nodes, limit used is 5
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
At column 0, pivotL() encounters zero diagonal at line 708 in file symbfact.c
At column 0, pivotL() encounters zero diagonal at line 708 in file symbfact.c

Moreover, when I use valgrind with --leak-check=yes --track-origins=yes, it reports 441 errors from 219 contexts in PetscInitialize(), before SNESSolve() is even called. Is this normal for dev?

Thanks very much!

Best regards,

Rebecca

On Jan 26, 2012, at 5:06 PM, Jed Brown wrote:

> On Thu, Jan 26, 2012 at 19:00, Mark F. Adams <mark.adams at columbia.edu> wrote:
> I'm guessing that PETSc recently changed and now filters out 0.0 in MatSetValues ... is this true?
> 
> Did the option MAT_IGNORE_ZERO_ENTRIES get set somehow?
