SBAIJ issue

Hong Zhang hzhang at mcs.anl.gov
Mon Oct 12 11:32:04 CDT 2009


Ando,

I do not see any error message in the attached info below.
Even '-log_summary' gives a correct display.
I guess you sent us the working output (np=2).

I would suggest running your code under a debugger,
e.g., with '-start_in_debugger'.
When it hangs, type Control-C in the debugger,
and then type 'where' to see where it hangs.
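For example (a sketch; this assumes gdb and a working X display, so that
one xterm running gdb is opened per process):

   mpiexec -np 4 externalsolver -info -start_in_debugger

Once the run hangs, the stack trace from 'where' on each process usually
shows which call one of the ranks is stuck in.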

Hong



On Oct 12, 2009, at 8:03 AM, Andreas Grassl wrote:

> Hello,
>
> I'm trying to use MUMPS Cholesky as the direct solver and CG as the
> iterative solver, and I am having problems with SBAIJ matrices.
>
> My MatSetValues calls do not respect the symmetry; instead I force it
> with the command-line option -mat_ignore_lower_triangular.
>
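A side note on the symmetry handling: with MATSBAIJ only the upper
triangle (column index >= row index) is stored, so inserting only those
entries makes -mat_ignore_lower_triangular unnecessary. A minimal sketch,
using an illustrative 1D stencil rather than the actual finite-element
assembly:

   #include "petscmat.h"

   /* Set only the upper-triangular entries of a symmetric matrix into an
      SBAIJ matrix; the mirrored (i+1, i) entries are implied by symmetry
      and are never inserted. */
   static PetscErrorCode AssembleUpperTriangle(Mat A)
   {
     PetscErrorCode ierr;
     PetscInt       i, rstart, rend, M, N;
     PetscScalar    diag = 2.0, offdiag = -1.0;  /* illustrative values only */

     PetscFunctionBegin;
     ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
     ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);
     for (i = rstart; i < rend; i++) {
       ierr = MatSetValues(A, 1, &i, 1, &i, &diag, ADD_VALUES);CHKERRQ(ierr);
       if (i + 1 < N) {                /* upper neighbour only (j > i) */
         PetscInt j = i + 1;
         ierr = MatSetValues(A, 1, &i, 1, &j, &offdiag, ADD_VALUES);CHKERRQ(ierr);
       }
     }
     ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
     ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }
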
> The problems show up for bigger problems (~450000 DOF) with more
> processes (1 and 2 work fine, 4 hangs) and/or when trying to save the
> matrix. The machine has 4 cores and 12 GB of memory. NNZ per row is ~80.
>
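Also, since the -info output below warns about not preallocating matrix
storage and reports millions of "unneeded" entries at MatAssemblyEnd,
preallocating the SBAIJ matrix may help; a minimal sketch, where the
counts are only guesses based on the ~80 nonzeros per row mentioned above
(for SBAIJ they refer to the upper-triangular part of each row, so they
are generous) and 'nlocal' stands for the local number of rows:

   #include "petscmat.h"

   /* Create an MPISBAIJ matrix with rough preallocation so that
      MatSetValues does not need extra mallocs during assembly. */
   static PetscErrorCode CreateSystemMatrix(PetscInt nlocal, Mat *A)
   {
     PetscErrorCode ierr;

     PetscFunctionBegin;
     ierr = MatCreate(PETSC_COMM_WORLD, A);CHKERRQ(ierr);
     ierr = MatSetSizes(*A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
     ierr = MatSetType(*A, MATMPISBAIJ);CHKERRQ(ierr);
     /* block size 1; ~80 upper-triangular nonzeros per row in the diagonal
        block and a guessed 20 in the off-diagonal block; exact per-row
        counts (the d_nnz/o_nnz arrays) would be better still */
     ierr = MatMPISBAIJSetPreallocation(*A, 1, 80, PETSC_NULL, 20, PETSC_NULL);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }
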
> Attached you find the output of the program run with -info.
>
> Any hints on where to search?
>
> Cheers,
>
> ando
>
> -- 
> /"\                               Grassl Andreas
> \ /    ASCII Ribbon Campaign      Uni Innsbruck Institut f. Mathematik
>  X      against HTML email        Technikerstr. 13 Zi 709
> / \                               +43 (0)512 507 6091
> same call as 2procinfo without -ksp_info_binary
>
> mpiexec -np 2 externalsolver
> [0] PetscInitialize(): PETSc successfully started: number of  
> processors = 2
> [1] PetscInitialize(): PETSc successfully started: number of  
> processors = 2
> [1] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [0] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetOption_MPISBAIJ(): Option SYMMETRIC ignored
> [0] MatStashScatterBegin_Private(): No of messages: 1
> [0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 30544064
> [0] MatStashScatterBegin_Private(): No of messages: 0
> [0] MatAssemblyBegin_MPISBAIJ(): Stash has 3818007 entries,uses 8  
> mallocs.
> [0] MatAssemblyBegin_MPISBAIJ(): Block-Stash has 3818007 entries,  
> uses 8 mallocs.
> [0] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 221545 X 221545, block  
> size 1; storage space: 16676676 unneeded, 5477824 used
> [0] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 62
> [1] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 225367 X 225367, block  
> size 1; storage space: 16793278 unneeded, 5743422 used
> [1] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [1] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 65
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483645
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483644
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483643
> [1] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483645
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483644
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483643
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate():   returning tag 2147483630
> [1] PetscCommDuplicate():   returning tag 2147483625
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483642
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483641
> [1] PetscCommDuplicate():   returning tag 2147483620
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [0] PetscCommDuplicate():   returning tag 2147483630
> [0] PetscCommDuplicate():   returning tag 2147483625
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483642
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483641
> [0] PetscCommDuplicate():   returning tag 2147483620
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to MPI
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483640
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDupli[1] PetscCommDuplicate(): Using internal PETSc  
> communicator 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483640
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483639
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483638
> [1] MatAssemblyEnd_SeqBAIJ(): Matrix size: 225367 X 0, block size 1;  
> storage space: 22536700 unneeded, 0 used
> [1] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [1] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 0
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 225367)/ 
> (num_localrows 225367) > 0.6. Use CompressedRow routines.
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483614
> [1] PetscCommDuplicate():   returning tag 2147483613
> [1]before KSPsetup
> [1] PetscCommDuplicate(): Using cate():   returning tag 2147483638
> [0] MatAssemblyEnd_SeqBAIJ(): Matrix size: 221545 X 6275, block size  
> 1; storage space: 22060506 unneeded, 93994 used
> [0] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 32
> [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 216534)/ 
> (num_localrows 221545) > 0.6. Use CompressedRow routines.
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483614
> [0] PetscCommDuplicate():   returning tag 2147483613
> [0]before KSPsetup
> [0] PCSetUp(): Setting up new PC
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483637
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483636
> [0] PetscCommDuplicate():   returning tag 2147483612
> [0] PetscCommDuplicate(): Using internainternal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483637
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483636
> [1] PetscCommDuplicate():   returning tag 2147483612
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483635
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483634
> l PETSc communicator 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483635
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483633
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483632
> [1] PetscCommDuplicate():   returning tag 2147483607
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483633
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483632
> [0] PetscCommDuplicate():   returning tag 2147483607
> [0] PetscCommDuplicate():   returning tag 2147483602
> [0] VecScatterCreate(): Special case: processor zero gets entire  
> parallel vector, rest get none
> [1] PetscCommDuplicate():   returning tag 2147483602
>
> DMUMPS 4.8.4
> L D L^T Solver for symmetric positive definite matrices
> Type of parallelism: Working host
>
> ****** ANALYSIS STEP ********
>
> Density: NBdense, Average, Median   =    0   49   50
> Ordering based on METIS
> ** Peak of sequential stack size (number of real entries)   :  
> 16763905.
> A root of estimated size         2965  has been selected for  
> Scalapack.
>
> Leaving analysis phase with  ...
> INFOG(1)                                       =               0
> INFOG(2)                                       =               0
> -- (20) Number of entries in factors (estim.) =       120657071
> --  (3) Storage of factors  (REAL, estimated) =       137364268
> --  (4) Storage of factors  (INT , estimated) =         6138804
> --  (5) Maximum frontal size      (estimated) =            3705
> --  (6) Number of nodes in the tree           =           21863
> --  (7) Ordering option effectively used      =               5
> ICNTL(6) Maximum transversal option            =               0
> ICNTL(7) Pivot order option                    =               7
> Percentage of memory relaxation (effective)    =             200
> Number of level 2 nodes                        =               1
> Number of split nodes                          =               0
> RINFO(1) Operations during elimination (estim) =   1.114D+11
> Distributed matrix entry format (ICNTL(18))    =               3
> ** Rank of proc needing largest memory in IC facto        :         0
> ** Estimated corresponding MBYTES for IC facto            :      2085
> ** Estimated avg. MBYTES per work. proc at facto (IC)     :      2044
> ** TOTAL     space in MBYTES for IC factorization         :      4088
> ** Rank of proc needing largest memory for OOC facto      :         1
> ** Estimated corresponding MBYTES for OOC facto           :       644
> ** Estimated avg. MBYTES per work. proc at facto (OOC)    :       630
> ** TOTAL     space in MBYTES for OOC factorization        :      1261
>
> ****** FACTORIZATION STEP ********
>
>
> GLOBAL STATISTICS PRIOR NUMERICAL FACTORIZATION ...
> NUMBER OF WORKING PROCESSES          =           2
> REAL SPACE FOR FACTORS               =   137364268
> INTEGER SPACE FOR FACTORS            =     6138804
> MAXIMUM FRONTAL SIZE (ESTIMATED)     =        3705
> NUMBER OF NODES IN THE TREE          =       21863
> Maximum effective relaxed size of S              =   239060000
> Average effective relaxed size of S              =   234725641
>
> REDISTRIB: TOTAL DATA LOCAL/SENT     =     9086109     2238491
> GLOBAL TIME FOR MATRIX DISTRIBUTION  =      0.7322
> ** Memory relaxation parameter ( ICNTL(14)  )            :       200
> ** Rank of processor needing largest memory in facto     :         0
> ** Space in MBYTES used by this processor for facto      :      2085
> ** Avg. Space in MBYTES per working proc during facto    :      2044
>
> ELAPSED TIME FOR FACTORIZATION       =     28.1724
> Maximum effective space used in S    (KEEP(67)   =    77516871
> Average effective space used in S    (KEEP(67)   =    75395654
> ** EFF Min: Rank of processor needing largest memory :         0
> ** EFF Min: Space in MBYTES used by this processor   :       741
> ** EFF Min: Avg. Space in MBYTES per working proc    :       720
>
> GLOBAL STATISTICS
> RINFOG(2)  OPERATIONS DURING NODE ASSEMBLY     = 2.565D+08
> ------(3)  OPERATIONS DURING NODE ELIMINATION  = 1.114D+11
> INFOG (9)  REAL SPACE FOR FACTORS              =   137364268
> INFOG(10)  INTEGER SPACE FOR FACTORS           =     6138809
> INFOG(11)  MAXIMUM FRONT SIZE                  =        3705
> INFOG(29)  NUMBER OF ENTRIES IN FACTORS        =   116259976
> INFOG(12) NB OF NEGATIVE PIVOTS          =           0
> INFOG(14)  NUMBER OF MEMORY COMPRESS           =           0
> [0]after KSPsetup
> [0]RHS setup
> [1]after KSPsetup
> [1]RHS setup
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483601
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483601
> [0] PetscCommDuplicate():   returning tag 2147483596
> [1] PetscCommDuplicate():   returning tag 2147483596
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483591
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483591
> [0] VecAssemblyBegin_MPI(): Stash has 225367 entries, uses 12 mallocs.
> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483631
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483631
>
>
> ****** SOLVE & CHECK STEP ********
>
>
> STATISTICS PRIOR SOLVE PHASE     ...........
> NUMBER OF RIGHT-HAND-SIDES                    =           1
> BLOCKING FACTOR FOR MULTIPLE RHS              =           1
> ICNTL (9)                                     =           1
>  --- (10)                                     =           0
>  --- (11)                                     =           0
>  --- (20)                                     =           0
>  --- (21)                                     =           1
> ** Rank of processor needing largest memory in solve     :         0
> ** Space in MBYTES used by this processor for solve      :      2013
> ** Avg. Space in MBYTES per working proc during solve    :      1975
>
>
> LEAVING SOLVER WITH:  INFOG(1) ............ =           0
>                       INFOG(2) ............ =           0
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483630
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483629
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483630
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483629
> [0] PetscCommDuplicate():   returning tag 2147483590
> [1] PetscCommDuplicate():   returning tag 2147483590
> [1] VecScatterCreate(): General case: Seq to MPI
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: Seq to MPI
> [0] PetscCommDuplicate():   returning tag 2147483586
> KSP Object:
>  type: preonly
>  maximum iterations=10000, initial guess is zero
>  tolerances:  relative=1e-06, absolute=1e-50, divergence=10000
>  left preconditioning
> PC Object:
>  type: cholesky
>    Cholesky: out-of-place factorization
>      matrix ordering: natural
>    Cholesky: factor fill ratio needed 0
>         Factored matrix follows
>        Matrix Object:
>          type=mpisbaij, rows=446912, cols=446912
>          package used to perform factorization: mumps
> [1] PetscCommDuplicate():   returning tag 2147483586
> [1] PetscCommDuplicate():   returning tag 2147483585
> [1] PetscCommDuplicate():   returning tag 2147483584
> [1] PetscCommDuplicate():   returning tag 2147483583
> [1] PetscCommDuplicate():   returning tag 2147483582
> [1] PetscCommDuplicate():   returning tag 2147483581
> [1] PetscCommDuplicate():   returning tag 2147483580
>          total: nonzeros=0, allocated nonzeros=893824
>            MUMPS run parameters:
>              SYM (matrix type):                  1
>              PAR (host participation):           1
>              ICNTL(1) (output for error):        6
>              ICNTL(2) (output of diagnostic msg):0
>              ICNTL(3) (output for global info):  6
>              ICNTL(4) (level of printing):       -1
>              ICNTL(5) (input mat struct):        0
>              ICNTL(6) (matrix prescaling):       0
>              ICNTL(7) (matrix ordering):         7
>              ICNTL(8) (scalling strategy):       77
>              ICNTL(9) (A/A^T x=b is solved):     1
>              ICNTL(10) (max num of refinements): 0
>              ICNTL(11) (error analysis):         0
>              ICNTL(12) (efficiency control):                         1
>              ICNTL(13) (efficiency control):                         0
>              ICNTL(14) (percentage of estimated workspace increase):  
> 200
>              ICNTL(18)[1] PetscCommDuplicate():   returning tag  
> 2147483579
> (input mat struct):                           3
>              ICNTL(19) (Shur complement info):                       0
>              ICNTL(20) (rhs sparse pattern):                         0
>              ICNTL(21) (solution struct):                            1
>              ICNTL(22) (in-core/out-of-core facility):               0
>              ICNTL(23) (max size of memory can be allocated locally):0
>              ICNTL(24) (detection of null pivot rows):               0
>              ICNTL(25) (computation of a null space basis):          0
>              ICNTL(26) (Schur options for rhs or solution):          0
>              ICNTL(27) (experimental parameter):                      
> -8
>              CNTL(1) (relative pivoting threshold):      0
>              CNTL(2) (stopping criterion of refinement): 1.49012e-08
>              CNTL(3) (absolute pivoting threshold):      0
>              CNTL(4) (value of static pivoting):         -1
>              CNTL(5) (fixation for null pivots):         0
>      RINFO(1) (local estimated flops for the elimination after  
> analysis):
>             [0] 5.16132e+10
> [0] PetscCommDuplicate():   returning tag 2147483585
>             [1] 5.97957e+10
>      RINFO(2) (local estimated flops for the assembly after  
> factorization):
>             [0]  1.33366e+08
> [0] PetscCommDuplicate():   returning tag 2147483584
>             [1]  1.23105e+08
>      RINFO(3) (local estimated flops for the elimination after  
> factorization):
>             [0]  5.16124e+10
> [0] PetscCommDuplicate():   returning tag 2147483583
>             [1]  5.9795e+10
>      INFO(15) (estimated size of (in MB) MUMPS internal data for  
> running numerical factorization):
>             [0] 2085
> [0] PetscCommDuplicate():   returning tag 2147483582
>             [1] 2003
>      INFO(16) (size of (in MB) MUMPS internal data used during  
> numerical factorization):
>             [0] 2085
> [0] PetscCommDuplicate():   returning tag 2147483581
>             [1] 2003
>      INFO(23) (num of pivots eliminated on this processor after  
> factorization):
>             [0] 236610
> [0] PetscCommDuplicate():   returning tag 2147483580
>             [1] 210302
>              RINFOG(1) (global estimated flops for the elimination  
> after analysis): 1.11409e+11
>              RINFOG(2) (global estimated flops for the assembly  
> after factorization): 2.56471e+08
>              RINFOG(3) (global estimated flops for the elimination  
> after factorization): 1.11407e+11
>              INFOG(3) (estimated real workspace for factors on all  
> processors after analysis): 137364268
>              INFOG(4) (estimated integer workspace for factors on  
> all processors after analysis): 6138804
>              INFOG(5) (estimated maximum front size in the complete  
> tree): 3705
>              INFOG(6) (number of nodes in the complete tree): 21863
>              INFOG(7) (ordering option effectively uese after  
> analysis): 5
>              INFOG(8) (structural symmetry in percent of the  
> permuted matrix after analysis): 100
>              INFOG(9) (total real/complex workspace to store the  
> matrix factors after factorization): 137364268
>              INFOG(10) (total integer space store the matrix factors  
> after factorization): 6138809
>              INFOG(11) (order of largest frontal matrix after  
> factorization): 3705
>              INFOG(12) (number of off-diagonal pivots): 0
>              INFOG(13) (number of delayed pivots after  
> factorization): 0
>              INFOG(14) (number of memory compress after  
> factorization): 0
>              INFOG(15) (number of steps of iterative refinement  
> after solution): 0
>              INFOG(16) (estimated size (in MB) of all MUMPS internal  
> data for factorization after analysis: value on the most memory  
> consuming processor): 2085
>              INFOG(17) (estimated size of all MUMPS internal data  
> for factorization after analysis: sum over all processors): 4088
>              INFOG(18) (size of all MUMPS internal data allocated  
> during factorization: value on the most memory consuming processor):  
> 2085
>              INFOG(19) (size of all MUMPS internal data allocated  
> during factorization: sum over all processors): 4088
>              INFOG(20) (estimated number of entries in the factors):  
> 120657071
>              INFOG(21) (size in MB of memory effectively used during  
> factorization - value on the most memory consuming processor): 741
>              INFOG(22) (size in MB of memory effectively used during  
> factorization - sum over all processors): 1441
>              INFOG(23) (after analysis: value of ICNTL(6)  
> effectively used): 0
>              INFOG(24) (after analysis: value of ICNTL(12)  
> effectively used): 1
>              INFOG(25) (after factorization: number of pivots  
> modified by static pivoting): 0
>  linear system matrix = precond matrix:
>  Matrix Object:
>    type=mpisbaij, rows=446912, cols=446912
>    total: nonzeros=11315240, allocated nonzeros=89382400
>        block size is 1
> [0] PetscCommDuplicate():   returning tag 2147483579
> [1] PetscCommDuplicate():   returning tag 2147483578
> [1] PetscCommDuplicate():   returning tag 2147483577
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483578
> [0] PetscCommDuplicate():   returning tag 2147483577
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850689
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [0] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374783
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483576
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850689
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [1] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374783
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850688
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [0] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374784
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> ************************************************************************************************************************
> ***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript - 
> r -fCourier9' to print this document            ***
> ************************************************************************************************************************
>
> ---------------------------------------------- PETSc Performance  
> Summary: ----------------------------------------------
>
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483576
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850688
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [1] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374784
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> externalsolver on a linux32-i named mat1.uibk.ac.at with 2  
> processors, by csae1801 Mon Oct 12 12:08:38 2009
> Using Petsc Release Version 3.0.0, Patch 8, Fri Aug 21 14:02:12 CDT  
> 2009
>
>                         Max       Max/Min        Avg      Total
> Time (sec):           3.989e+01      1.00000   3.989e+01
> Objects:              3.600e+01      1.00000   3.600e+01
> Flops:                0.000e+00      0.00000   0.000e+00  0.000e+00
> Flops/sec:            0.000e+00      0.00000   0.000e+00  0.000e+00
> MPI Messages:         2.550e+01      1.08511   2.450e+01  4.900e+01
> MPI Message Lengths:  6.596e+07      1.00053   2.692e+06  1.319e+08
> MPI Reductions:       5.800e+01      1.00000
>
> Flop counting convention: 1 flop = 1 real number operation of type  
> (multiply/divide/add/subtract)
>                            e.g., VecAXPY() for real vectors of  
> length N --> 2N flops
>                            and VecAXPY() for complex vectors of  
> length N --> 8N flops
>
> Summary of Stages:   ----- Time ------  ----- Flops -----  ---  
> Messages ---  -- Message Lengths --  -- Reductions --
>                        Avg     %Total     Avg     %Total   counts    
> %Total     Avg         %Total   counts   %Total
> 0:      Main Stage: 2.8402e-01   0.7%  0.0000e+00   0.0%  1.000e 
> +00   2.0%  8.163e-02        0.0%  0.000e+00   0.0%
> 1:   decomposition: 4.7451e-01   1.2%  0.0000e+00   0.0%  0.000e 
> +00   0.0%  0.000e+00        0.0%  0.000e+00   0.0%
> 2:   Matrix Create: 7.2766e-01   1.8%  0.0000e+00   0.0%  0.000e 
> +00   0.0%  0.000e+00        0.0%  2.000e+00   3.4%
> 3:     Matrix fill: 2.5720e+00   6.4%  0.0000e+00   0.0%  1.600e+01   
> 32.7%  2.486e+06       92.4%  1.500e+01  25.9%
> 4:    Solver setup: 3.2841e+01  82.3%  0.0000e+00   0.0%  0.000e 
> +00   0.0%  0.000e+00        0.0%  8.000e+00  13.8%
> 5:       RHS setup: 3.2125e-02   0.1%  0.0000e+00   0.0%  4.000e 
> +00   8.2%  1.094e+05        4.1%  4.000e+00   6.9%
> 6:           Solve: 4.9751e-01   1.2%  0.0000e+00   0.0%  2.500e+01   
> 51.0%  5.944e+04        2.2%  6.000e+00  10.3%
> 7:        Postproc: 2.4653e+00   6.2%  0.0000e+00   0.0%  3.000e 
> +00   6.1%  3.679e+04        1.4%  0.000e+00   0.0%
>
> ------------------------------------------------------------------------------------------------------------------------
> See the 'Profiling' chapter of the users' manual for details on  
> interpreting output.
> Phase summary info:
>   Count: number of times phase was executed
>   Time and Flops: Max - maximum over all processors
>                   Ratio - ratio of maximum to minimum over all  
> processors
>   Mess: number of messages sent
>   Avg. len: average message length
>   Reduct: number of global reductions
>   Global: entire computation
>   Stage: stages of a computation. Set stages with  
> PetscLogStagePush() and PetscLogStagePop().
>      %T - percent time in this phase         %F - percent flops in  
> this phase
>      %M - percent messages in this phase     %L - percent message  
> lengths in this phase
>      %R - percent reductions in this phase
>   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max  
> time over all processors)
> ------------------------------------------------------------------------------------------------------------------------
> Event                Count      Time (sec)      
> Flops                             --- Global ---  --- Stage ---    
> Total
>                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg  
> len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
>
> --- Event Stage 0: Main Stage
>
> PetscBarrier           1 1.0 3.1269e-0328.4 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
>
> --- Event Stage 1: decomposition
>
>
> --- Event Stage 2: Matrix Create
>
>
> --- Event Stage 3: Matrix fill
>
> MatAssemblyBegin       1 1.0 3.3235e-01 1.9 0.00e+00 0.0 6.0e+00 2.0e 
> +07 3.0e+00  1  0 12 92  5  10  0 38100 20     0
> MatAssemblyEnd         1 1.0 1.0536e+00 1.0 0.00e+00 0.0 1.0e+01 1.8e 
> +04 1.2e+01  3  0 20  0 21  41  0 62  0 80     0
>
> --- Event Stage 4: Solver setup
>
> MatCholFctrSym         1 1.0 4.0531e-06 2.1 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatCholFctrNum         1 1.0 3.2830e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e 
> +00 5.0e+00 82  0  0  0  9 100  0  0  0 62     0
> KSPSetup               1 1.0 6.9141e-06 1.4 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> PCSetUp                1 1.0 3.2841e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e 
> +00 8.0e+00 82  0  0  0 14 100  0  0  0100     0
>
> --- Event Stage 5: RHS setup
>
> VecAssemblyBegin       1 1.0 6.2160e-03 1.6 0.00e+00 0.0 4.0e+00 1.3e 
> +06 3.0e+00  0  0  8  4  5  16  0100100 75     0
> VecAssemblyEnd         1 1.0 8.4589e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0  26  0  0  0  0     0
>
> --- Event Stage 6: Solve
>
> MatSolve               1 1.0 4.9525e-01 1.0 0.00e+00 0.0 6.0e+00 4.8e 
> +05 4.0e+00  1  0 12  2  7 100  0 24 98 67     0
> MatView                2 1.0 1.4269e-03 1.0 0.00e+00 0.0 1.8e+01 2.7e 
> +03 2.0e+00  0  0 37  0  3   0  0 72  2 33     0
> VecSet                 1 1.0 8.1515e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecScatterBegin        2 1.0 6.7840e-03 1.3 0.00e+00 0.0 2.0e+00 1.3e 
> +06 0.0e+00  0  0  4  2  0   1  0  8 86  0     0
> VecScatterEnd          1 1.0 2.1610e-03 1.7 0.00e+00 0.0 0.0e+00 0.0e 
> +00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> KSPSolve               1 1.0 4.9603e-01 1.0 0.00e+00 0.0 6.0e+00 4.8e 
> +05 4.0e+00  1  0 12  2  7 100  0 24 98 67     0
> PCApply                1 1.0 4.9526e-01 1.0 0.00e+00 0.0 6.0e+00 4.8e 
> +05 4.0e+00  1  0 12  2  7 100  0 24 98 67     0
>
> --- Event Stage 7: Postproc
>
> VecView                1 1.0 2.1809e+00 2.0 0.00e+00 0.0 2.0e+00 9.0e 
> +05 0.0e+00  4  0  4  1  0  66  0 67100  0     0
> ------------------------------------------------------------------------------------------------------------------------
>
> Memory usage is given in bytes:
>
> Object Type          Creations   Destructions   Memory  Descendants'  
> Mem.
>
> --- Event Stage 0: Main Stage
>
>   IS L to G Mapping     0              1     889756     0
>              Matrix     0              3  547690752     0
>                 Vec     0              6    3702888     0
>         Vec Scatter     0              2       1736     0
>              Viewer     0              1        544     0
>
> --- Event Stage 1: decomposition
>
>
> --- Event Stage 2: Matrix Create
>
>   IS L to G Mapping     1              0          0     0
>              Matrix     3              0          0     0
>
> --- Event Stage 3: Matrix fill
>
>           Index Set     4              4     152680     0
>                 Vec     7              1       1304     0
>         Vec Scatter     2              0          0     0
>
> --- Event Stage 4: Solver setup
>
>           Index Set     3              2       1008     0
>              Matrix     3              0          0     0
>                 Vec     2              1    1773664     0
>         Vec Scatter     1              0          0     0
>       Krylov Solver     1              0          0     0
>      Preconditioner     1              0          0     0
>
> --- Event Stage 5: RHS setup
>
>                 Vec     2              0          0     0
>              Viewer     1              0          0     0
>
> --- Event Stage 6: Solve
>
>           Index Set     2              2     947456     0
>                 Vec     1              0          0     0
>         Vec Scatter     1              0          0     0
>              Viewer     1              0          0     0
>
> --- Event Stage 7: Postproc
>
>           Index Set     0              1        504     0
>              Matrix     0              3    8870136     0
>                 Vec     0              4    7125104     0
>         Vec Scatter     0              2       1320     0
>       Krylov Solver     0              1        832     0
>      Preconditioner     0              1        728     0
>              Viewer     0              1        544     0
> =================================================================[1]  
> PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850688
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [1] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374784
> ========================================
> Average time to get PetscTime(): 1.3113e-06
> Average time for MPI_Barrier(): 0.000144005
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> Average time for zero size MPI_Send(): 6.54459e-05
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm 1140850688
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [0] Petsc_DelTag(): Deleting tag data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user  
> MPI_Comm -2080374784
> #PETSc Option Table entries:
> -dump_data false
> -info
> -ksp_compute_singularvalues
> -ksp_rtol 1e-6
> -ksp_type preonly
> -ksp_view
> -log_summary
> -mat_ignore_lower_triangular
> -mat_mumps_icntl_14 200
> -mat_mumps_sym 1
> -mat_type sbaij
> -monitorname errormonitor.dat
> -nz 100
> -options_left
> -pc_factor_mat_solver_package mumps
> -pc_type cholesky
> -preload false
> -verb false
> #End o PETSc Option Table entries
> Compiled without FORTRAN kernels
> Compiled with full precision matrices (default)
> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8  
> sizeof(PetscScalar) 8
> Configure run at: Wed Oct  7 18:16:40 2009
> Configure options: --with-scalar-type=real --with-debugging=0 --with- 
> precision=double --with-cc=icc --with-fc=ifort --with-cxx=icpc -- 
> with-shared=0 --with-mpi=1 --with-external-packages-dir=/home/lux/ 
> csae1801/petsc/externalpackages --download-mpich=ifneeded --with- 
> scalapack=1 --download-scalapack=ifneeded --download-f-blas- 
> lapack=ifneeded --with-blacs=1 --download-blacs=ifneeded --with- 
> parmetis=1 --download-parmetis=ifneeded --with-mumps=1 --download- 
> mumps=ifneeded --with-hypre=1 --with-hypre-dir=/home/lux/csae1801/ 
> petsc/externalpackages/hypre-2.4.0b/src/hypre --with-spooles=1 -- 
> download-spooles=ifneeded --with-superlu_dist=1 --download- 
> superlu_dist=ifneeded PETSC_ARCH=linux32-intel-c-prod
> -----------------------------------------
> Libraries compiled on Mi 7. Okt 18:17:41 CEST 2009 on mat1.uibk.ac.at
> Machine characteristics: Linux mat1.uibk.ac.at 2.6.18-53.el5xen #1  
> SMP Wed Oct 10 16:48:44 EDT 2007 x86_64 x86_64 x86_64 GNU/Linux
> Using PETSc directory: /home/lux/csae1801/petsc/petsc-3.0.0-p8
> Using PETSc arch: linux32-intel-c-prod
> -----------------------------------------
> Using C compiler: /home/lux/csae1801/petsc/petsc-3.0.0-p8/linux32- 
> intel-c-prod/bin/mpicc -O
> Using Fortran compiler: /home/lux/csae1801/petsc/petsc-3.0.0-p8/ 
> linux32-intel-c-prod/bin/mpif90 -O
> -----------------------------------------
> Using include paths: -I/home/lux/csae1801/petsc/petsc-3.0.0-p8/ 
> linux32-intel-c-prod/include -I/home/lux/csae1801/petsc/petsc-3.0.0- 
> p8/include -I/home/lux/csae1801/petsc/petsc-3.0.0-p8/linux32-intel-c- 
> prod/include -I/home/lux/csae1801/petsc/externalpackages/ 
> hypre-2.4.0b/src/hypre/include
> ------------------------------------------
> Using C linker: /home/lux/csae1801/petsc/petsc-3.0.0-p8/linux32- 
> intel-c-prod/bin/mpicc -O
> Using Fortran linker: /home/lux/csae1801/petsc/petsc-3.0.0-p8/ 
> linux32-intel-c-prod/bin/mpif90 -O
> Using libraries: -Wl,-rpath,/home/lux/csae1801/petsc/petsc-3.0.0-p8/ 
> linux32-intel-c-prod/lib -L/home/lux/csae1801/petsc/petsc-3.0.0-p8/ 
> linux32-intel-c-prod/lib -lpetscts -lpetscsnes -lpetscksp -lpetscdm - 
> lpetscmat -lpetscvec -lpetsc        -lX11 -Wl,-rpath,/home/lux/ 
> csae1801/petsc/petsc-3.0.0-p8/linux32-intel-c-prod/lib -L/home/lux/ 
> csae1801/petsc/petsc-3.0.0-p8/linux32-intel-c-prod/lib - 
> lsuperlu_dist_2.3 -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common  
> -lpord -lparmetis -lmetis -lscalapack -lblacs -Wl,-rpath,/home/lux/ 
> csae1801/petsc/externalpackages/hypre-2.4.0b/src/hypre/lib -L/home/ 
> lux/csae1801/petsc/externalpackages/hypre-2.4.0b/src/hypre/lib - 
> lHYPRE -lmpichcxx -lstdc++ -lspooles -lflapack -lfblas -lnsl -laio - 
> lrt -lPEPCF90 -ldl -L/home/lux/csae1801/petsc/petsc-3.0.0-p8/linux32- 
> intel-c-prod/lib -lmpich -lpthread -lrt -L/opt/intel/cmkl/10.1.1.019/ 
> lib/em64t -L/opt/intel/Compiler/11.0/074/ipp/em64t/lib -L/opt/intel/ 
> Compiler/11.0/074/mkl/lib/em64t -L/opt/intel/Compiler/11.0/074/tbb/ 
> em64t/cc4.1.0_libc2.4_kernel2.6.16.21/lib -L/opt/intel/Compiler/ 
> 11.0/074/lib/intel64 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -limf - 
> lsvml -lipgo -ldecimal -lirc -lgcc_s -lirc_s -lmpichf90 -lifport - 
> lifcore -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich  
> -lpthread -lrt -limf -lsvml -lipgo -ldecimal -lirc -lgcc_s -lirc_s - 
> ldl
> ------------------------------------------
> #PETSc Option Table entries:
> -dump_data false
> -info
> -ksp_compute_singularvalues
> -ksp_rtol 1e-6
> -ksp_type preonly
> -ksp_view
> -log_summary
> -mat_ignore_lower_triangular
> -mat_mumps_icntl_14 200
> -mat_mumps_sym 1
> -mat_type sbaij
> -monitorname errormonitor.dat
> -nz 100
> -options_left
> -pc_factor_mat_solver_package mumps
> -pc_type cholesky
> -preload false
> -verb false
> #End o PETSc Option Table entries
> There are no unused options.
>
>                     1olarr
>
>              before sol put
>              sol put succes
>              before fullsol
> /DIANA/AP/LS41    12:08:39     35.66-CPU    11.16-IO   SOLVE
> /DIANA/AP/LS41    12:08:39     35.66-CPU    11.18-IO   FILLUP
> /DIANA/AP/LS41    12:08:39     35.66-CPU    11.18-IO   FILLUP
> /DIANA/AP/LS41    12:08:39     35.66-CPU    11.18-IO   POST
> /DIANA/AP/LS41    12:08:41     36.79-CPU    12.27-IO   POST
> /DIANA/AP/LS41    12:08:41     36.79-CPU    12.27-IO   LINSTA
> /DIANA/DC/END     12:08:41     36.79-CPU    12.27-IO   STOP
> DIANA JOB 3446 finished
>
> 1 proc externalsolver stuck at 100% CPU with 12% memory out of 12 GB
> 1 proc externalsolver at 0%
>
> mpiexec -np 2 externalsolver
> [0] PetscInitialize(): PETSc successfully started: number of  
> processors = 2
> [0] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [1] PetscInitialize(): PETSc successfully started: number of  
> processors = 2
> [1] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetOption_MPISBAIJ(): Option SYMMETRIC ignored
> [0] MatStashScatterBegin_Private(): No of messages: 1
> [0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 30544064
> [0] MatStashScatterBegin_Private(): No of messages: 0
> [0] MatAssemblyBegin_MPISBAIJ(): Stash has 3818007 entries,uses 8  
> mallocs.
> [0] MatAssemblyBegin_MPISBAIJ(): Block-Stash has 3818007 entries,  
> uses 8 mallocs.
> [0] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 221545 X 221545, block  
> size 1; storage space: 16676676 unneeded, 5477824 used
> [0] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 62
> [1] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 225367 X 225367, block  
> size 1; storage space: 16793278 unneeded, 5743422 used
> [1] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483645
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483644
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483643
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate():   returning tag 2147483634
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [0] PetscCommDuplicate():   returning tag 2147483630
> [0] PetscCommDuplicate():   returning tag 2147483625
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483642
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483641
> [0] PetscCommDuplicate[1] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros  
> blocks in any row is 65
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483645
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483644
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483643
> [1] PetscCommDuplicate():   returning tag 2147483639
> [1] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate():   returning tag 2147483630
> [1] PetscCommDuplicate():   returning tag 2147483625
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483642
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483641
> [1] PetscCommDuplicate():   returning tag 2147483620
> [1] PetscC():   returning tag 2147483620
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to MPI
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483640
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483638
> [0] MatAssemblyEnd_SeqBAIJ(): Matrix size: 221545 X 6275, block size  
> 1; storage space: 22060506 unneeded, 93994 used
> [0] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 32
> [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 216534)/ 
> (num_localrows 221545) > 0.6. Use CompressedRow routines.
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetsommDuplicate(): Using internal PETSc communicator 1140850689  
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483640
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483639
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483638
> [1] MatAssemblyEnd_SeqBAIJ(): Matrix size: 225367 X 0, block size 1;  
> storage space: 22536700 unneeded, 0 used
> [1] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [1] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 0
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 225367)/ 
> (num_localrows 225367) > 0.6. Use CompressedRow routines.
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483614
> [1] PetscCommDuplicate():   returning tag 2147483613
> [1]before KSPsetup
> [1] PetscCommDuplicate(): Using internal PcCommDuplicate():    
> returning tag 2147483614
> [0] PetscCommDuplicate():   returning tag 2147483613
> [0]before KSPsetup
> [0] PCSetUp(): Setting up new PC
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483637
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483636
> [0] PetscCommDuplicate():   returning tag 2147483612
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483635
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483634
> ETSc communicator 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483637
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483636
> [1] PetscCommDuplicate():   returning tag 2147483612
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483635
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483634
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483633
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483632
> [0] PetscCommDuplicate():   returning tag 2147483607
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483633
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483632
> [1] PetscCommDuplicate():   returning tag 2147483607
> [0] PetscCommDuplicate():   returning tag 2147483602
> [0] VecScatterCreate(): Special case: processor zero gets entire  
> parallel vector, rest get none
> [1] PetscCommDuplicate():   returning tag 2147483602
>
> DMUMPS 4.8.4
> L D L^T Solver for symmetric positive definite matrices
> Type of parallelism: Working host
>
> ****** ANALYSIS STEP ********
>
> Density: NBdense, Average, Median   =    0   49   50
> Ordering based on METIS
> ** Peak of sequential stack size (number of real entries)   :  
> 16763905.
> A root of estimated size         2965  has been selected for  
> Scalapack.
>
> Leaving analysis phase with  ...
> INFOG(1)                                       =               0
> INFOG(2)                                       =               0
> -- (20) Number of entries in factors (estim.) =       120657071
> --  (3) Storage of factors  (REAL, estimated) =       137364268
> --  (4) Storage of factors  (INT , estimated) =         6138804
> --  (5) Maximum frontal size      (estimated) =            3705
> --  (6) Number of nodes in the tree           =           21863
> --  (7) Ordering option effectively used      =               5
> ICNTL(6) Maximum transversal option            =               0
> ICNTL(7) Pivot order option                    =               7
> Percentage of memory relaxation (effective)    =             200
> Number of level 2 nodes                        =               1
> Number of split nodes                          =               0
> RINFO(1) Operations during elimination (estim) =   1.114D+11
> Distributed matrix entry format (ICNTL(18))    =               3
> ** Rank of proc needing largest memory in IC facto        :         0
> ** Estimated corresponding MBYTES for IC facto            :      2085
> ** Estimated avg. MBYTES per work. proc at facto (IC)     :      2044
> ** TOTAL     space in MBYTES for IC factorization         :      4088
> ** Rank of proc needing largest memory for OOC facto      :         1
> ** Estimated corresponding MBYTES for OOC facto           :       644
> ** Estimated avg. MBYTES per work. proc at facto (OOC)    :       630
> ** TOTAL     space in MBYTES for OOC factorization        :      1261
>
> ****** FACTORIZATION STEP ********
>
>
> GLOBAL STATISTICS PRIOR NUMERICAL FACTORIZATION ...
> NUMBER OF WORKING PROCESSES          =           2
> REAL SPACE FOR FACTORS               =   137364268
> INTEGER SPACE FOR FACTORS            =     6138804
> MAXIMUM FRONTAL SIZE (ESTIMATED)     =        3705
> NUMBER OF NODES IN THE TREE          =       21863
> Maximum effective relaxed size of S              =   239060000
> Average effective relaxed size of S              =   234725641
>
> REDISTRIB: TOTAL DATA LOCAL/SENT     =     9086109     2238491
> GLOBAL TIME FOR MATRIX DISTRIBUTION  =      0.8155
> ** Memory relaxation parameter ( ICNTL(14)  )            :       200
> ** Rank of processor needing largest memory in facto     :         0
> ** Space in MBYTES used by this processor for facto      :      2085
> ** Avg. Space in MBYTES per working proc during facto    :      2044
>
> ELAPSED TIME FOR FACTORIZATION       =     28.2595
> Maximum effective space used in S    (KEEP(67)   =    77516871
> Average effective space used in S    (KEEP(67)   =    75395654
> ** EFF Min: Rank of processor needing largest memory :         0
> ** EFF Min: Space in MBYTES used by this processor   :       741
> ** EFF Min: Avg. Space in MBYTES per working proc    :       720
>
> GLOBAL STATISTICS
> RINFOG(2)  OPERATIONS DURING NODE ASSEMBLY     = 2.565D+08
> ------(3)  OPERATIONS DURING NODE ELIMINATION  = 1.114D+11
> INFOG (9)  REAL SPACE FOR FACTORS              =   137364268
> INFOG(10)  INTEGER SPACE FOR FACTORS           =     6138809
> INFOG(11)  MAXIMUM FRONT SIZE                  =        3705
> INFOG(29)  NUMBER OF ENTRIES IN FACTORS        =   116259976
> INFOG(12) NB OF NEGATIVE PIVOTS          =           0
> INFOG(14)  NUMBER OF MEMORY COMPRESS           =           0
> [0]after KSPsetup
> [0]RHS setup
> [1]after KSPsetup
> [1]RHS setup
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483601
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483601
> [1] PetscCommDuplicate():   returning tag 2147483596
> [0] PetscCommDuplicate():   returning tag 2147483596
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483591
> [0] PetscCommDuplicate():   returning tag 2147483591
> [0] VecAssemblyBegin_MPI(): Stash has 225367 entries, uses 12 mallocs.
> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
> [0] PetscCommDuplicate():   returning tag 2147483590
> [1] PetscCommDuplicate():   returning tag 2147483590
> [1] PetscCommDuplicate():   returning tag 2147483589
> [0] PetscCommDuplicate():   returning tag 2147483589
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483631
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483630
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483631
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483630
>
> 4 processes of externalsolver stuck with 0% cpu
>
>
> mpiexec -np 4 externalsolver
> [1] PetscInitialize(): PETSc successfully started: number of  
> processors = 4
> [1] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [3] PetscInitialize(): PETSc successfully started: number of  
> processors = 4
> [3] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [0] PetscInitialize(): PETSc successfully started: number of  
> processors = 4
> [0] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [2] PetscInitialize(): PETSc successfully started: number of  
> processors = 4
> [2] PetscInitialize(): Running on machine: mat1.uibk.ac.at
> [3] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [3] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [2] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [2] PetscCommDuplicate():   returning tag 2147483647
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [2] PetscCommDuplicate():   returning tag 2147483646
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [3] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688  
> -2080374784 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483646
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [3] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [3] PetscCommDuplicate():   returning tag 2147483647
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483646
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483646
> [1] PetscCommDuplicate():   returning tag 2147483647
> [2] PetscCommDuplicate(): Duplicating a communicator 1140850689  
> -2080374783 max tags = 2147483647
> [2] PetscCommDuplicate():   returning tag 2147483647
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483646
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] MatSetOption_MPISBAIJ(): Option SYMMETRIC ignored
> [0] MatStashScatterBegin_Private(): No of messages: 3
> [0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 11261544
> [0] MatStashScatterBegin_Private(): Mesg_to: 2: size: 10384840
> [0] MatStashScatterBegin_Private(): Mesg_to: 3: size: 393344
> [0] MatStashScatterBegin_Private(): No of messages: 0
> [0] MatAssemblyBegin_MPISBAIJ(): Stash has 2754963 entries,uses 8  
> mallocs.
> [0] MatAssemblyBegin_MPISBAIJ(): Block-Stash has 2754963 entries,  
> uses 8 mallocs.
> [3] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 112413 X 112413, block  
> size 1; storage space: 8404901 unneeded, 2836399 used
> [3] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [3] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 65
> [1] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 113299 X 113299, block  
> size 1; storage space: 8647559 unneeded, 2682341 used
> [1] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [1] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 62
> [0] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 112871 X 112871, block  
> size 1; storage space: 8670798 unneeded, 2616302 used
> [0] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 52
> [2] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 108329 X 108329, block  
> size 1; storage space: 8247737 unneeded, 2585163 used
> [2] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [2] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 52
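Independent of the hang, the "Warning not preallocating matrix storage"
message and the assembly statistics above (around 8 million "unneeded"
entries per rank next to roughly 2.6-2.8 million used) show that the
MPISBAIJ matrix is built with PETSc's default preallocation instead of
explicit per-row counts, which wastes memory and assembly time at this
problem size. A minimal sketch of creating the matrix with explicit
preallocation; n, d_nnz and o_nnz are placeholders (the local row count
and the upper-triangular nonzero counts for the diagonal and off-diagonal
blocks), not names from the actual code:

  #include <petscmat.h>

  PetscErrorCode CreateSymmetricMatrix(MPI_Comm comm, PetscInt n,
                                       const PetscInt d_nnz[],
                                       const PetscInt o_nnz[], Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A); CHKERRQ(ierr);
    ierr = MatSetSizes(*A, n, n, PETSC_DETERMINE, PETSC_DETERMINE); CHKERRQ(ierr);
    ierr = MatSetType(*A, MATMPISBAIJ); CHKERRQ(ierr);
    ierr = MatSetFromOptions(*A); CHKERRQ(ierr);
    /* block size 1; only the upper triangle of each row is counted, and
       over-estimates are harmless (the call is ignored if the runtime
       options switch the matrix to another type) */
    ierr = MatMPISBAIJSetPreallocation(*A, 1, 0, d_nnz, 0, o_nnz); CHKERRQ(ierr);
    return 0;
  }

With counts that match or over-estimate the true row lengths, the
"unneeded" figures reported by MatAssemblyEnd shrink to (near) zero.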
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483645
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483644
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483643
> [1] PetscCommDuplicate():   returning tag 2147483639
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483645
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483644
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483643
> [3] PetscCommDuplicate():   returning tag 2147483639
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483645
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483644
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483643
> [2] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483645
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483644
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483643
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate():   returning tag 2147483634
> [2] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate():   returning tag 2147483634
> [3] PetscCommDuplicate():   returning tag 2147483634
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [0] PetscCommDuplicate():   returning tag 2147483630
> [0] PetscCommDuplicate():   returning tag 2147483625
> [1] PetscCommDuplicate():   returning tag 2147483630
> [1] PetscCommDuplicate():   returning tag 2147483625
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483642
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483641
> [1] PetscCommDuplicate():   returning tag 2147483620
> [2] PetscCommDuplicate():   returning tag 2147483630
> [2] PetscCommDuplicate():   returning tag 2147483625
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483642
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483641
> [2] PetscCommDuplicate():   returning tag 2147483620
> [3] PetscCommDuplicate():   returning tag 2147483630
> [3] PetscCommDuplicate():   returning tag 2147483625
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483642
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483641
> [3] PetscCommDuplicate():   returning tag 2147483620
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483640
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483639
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483638
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483642
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483641
> [0] PetscCommDuplicate():   returning tag 2147483620
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to MPI
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483640
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483639
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483638
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483640
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483639
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483638
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483640
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483639
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483638
> [3] MatAssemblyEnd_SeqBAIJ(): Matrix size: 112413 X 0, block size 1;  
> storage space: 11241300 unneeded, 0 used
> [3] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [3] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 0
> [3] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 112413)/ 
> (num_localrows 112413) > 0.6. Use CompressedRow routines.
> [1] MatAssemblyEnd_SeqBAIJ(): Matrix size: 113299 X 2952, block size  
> 1; storage space: 11286236 unneeded, 43664 used
> [1] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [1] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 29
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 110902)/ 
> (num_localrows 113299) > 0.6. Use CompressedRow routines.
> [2] MatAssemblyEnd_SeqBAIJ(): Matrix size: 108329 X 9258, block size  
> 1; storage space: 10624060 unneeded, 208840 used
> [2] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [2] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 43
> [2] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 92865)/ 
> (num_localrows 108329) > 0.6. Use CompressedRow routines.
> [0] MatAssemblyEnd_SeqBAIJ(): Matrix size: 112871 X 17849, block  
> size 1; storage space: 10944569 unneeded, 342531 used
> [0] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues  
> is 0
> [0] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 43
> [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 92260)/ 
> (num_localrows 112871) > 0.6. Use CompressedRow routines.
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483614
> [0] PetscCommDuplicate():   returning tag 2147483613
> [0]before KSPsetup
> [0] PCSetUp(): Setting up new PC
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483637
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483636
> [0] PetscCommDuplicate():   returning tag 2147483612
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483614
> [1] PetscCommDuplicate():   returning tag 2147483613
> [1]before KSPsetup
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483637
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483636
> [1] PetscCommDuplicate():   returning tag 2147483612
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [2] PetscCommDuplicate():   returning tag 2147483614
> [2] PetscCommDuplicate():   returning tag 2147483613
> [2]before KSPsetup
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483637
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483636
> [2] PetscCommDuplicate():   returning tag 2147483612
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483635
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483634
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483635
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483634
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850688 -2080374784
> [3] PetscCommDuplicate():   returning tag 2147483614
> [3] PetscCommDuplicate():   returning tag 2147483613
> [3]before KSPsetup
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483637
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483636
> [3] PetscCommDuplicate():   returning tag 2147483612
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483635
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483635
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483634
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483633
> [1] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483632
> [1] PetscCommDuplicate():   returning tag 2147483607
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483633
> [2] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [2] PetscCommDuplicate():   returning tag 2147483632
> [2] PetscCommDuplicate():   returning tag 2147483607
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483633
> [0] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [0] PetscCommDuplicate():   returning tag 2147483632
> [0] PetscCommDuplicate():   returning tag 2147483607
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483633
> [3] PetscCommDuplicate(): Using internal PETSc communicator  
> 1140850689 -2080374783
> [3] PetscCommDuplicate():   returning tag 2147483632
> [3] PetscCommDuplicate():   returning tag 2147483607
> [0] PetscCommDuplicate():   returning tag 2147483602
> [1] PetscCommDuplicate():   returning tag 2147483602
> [2] PetscCommDuplicate():   returning tag 2147483602
> [3] PetscCommDuplicate():   returning tag 2147483602
> [0] VecScatterCreate(): Special case: processor zero gets entire  
> parallel vector, rest get none
>
> DMUMPS 4.8.4
> L D L^T Solver for symmetric positive definite matrices
> Type of parallelism: Working host
>
> ****** ANALYSIS STEP ********
>
> Density: NBdense, Average, Median   =    0   49   50
> Ordering based on METIS
> ** Peak of sequential stack size (number of real entries)   :  
> 16763905.
> A root of estimated size         2965  has been selected for  
> Scalapack.
>
> Leaving analysis phase with  ...
> INFOG(1)                                       =               0
> INFOG(2)                                       =               0
> -- (20) Number of entries in factors (estim.) =       120657071
> --  (3) Storage of factors  (REAL, estimated) =       137362626
> --  (4) Storage of factors  (INT , estimated) =         6167135
> --  (5) Maximum frontal size      (estimated) =            3705
> --  (6) Number of nodes in the tree           =           21863
> --  (7) Ordering option effectively used      =               5
> ICNTL(6) Maximum transversal option            =               0
> ICNTL(7) Pivot order option                    =               7
> Percentage of memory relaxation (effective)    =             200
> Number of level 2 nodes                        =               5
> Number of split nodes                          =               0
> RINFO(1) Operations during elimination (estim) =   1.114D+11
> Distributed matrix entry format (ICNTL(18))    =               3
> ** Rank of proc needing largest memory in IC facto        :         0
> ** Estimated corresponding MBYTES for IC facto            :      1112
> ** Estimated avg. MBYTES per work. proc at facto (IC)     :      1083
> ** TOTAL     space in MBYTES for IC factorization         :      4333
> ** Rank of proc needing largest memory for OOC facto      :         1
> ** Estimated corresponding MBYTES for OOC facto           :       465
> ** Estimated avg. MBYTES per work. proc at facto (OOC)    :       421
> ** TOTAL     space in MBYTES for OOC factorization        :      1684
>
> ****** FACTORIZATION STEP ********
>
>
> GLOBAL STATISTICS PRIOR NUMERICAL FACTORIZATION ...
> NUMBER OF WORKING PROCESSES          =           4
> REAL SPACE FOR FACTORS               =   137362626
> INTEGER SPACE FOR FACTORS            =     6167135
> MAXIMUM FRONTAL SIZE (ESTIMATED)     =        3705
> NUMBER OF NODES IN THE TREE          =       21863
> Maximum effective relaxed size of S              =   125678965
> Average effective relaxed size of S              =   122625031
>
> REDISTRIB: TOTAL DATA LOCAL/SENT     =     2497290     8939630
> GLOBAL TIME FOR MATRIX DISTRIBUTION  =      0.6324
> ** Memory relaxation parameter ( ICNTL(14)  )            :       200
> ** Rank of processor needing largest memory in facto     :         0
> ** Space in MBYTES used by this processor for facto      :      1112
> ** Avg. Space in MBYTES per working proc during facto    :      1083
>
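For scale, the analysis phase above estimates roughly 1.1 GB per working
process (about 4.3 GB total) for the in-core factorization at the memory
relaxation ICNTL(14) = 200 reported in the log, versus roughly 0.4-0.5 GB
per process out of core, and the earlier run in this output used about
2 GB on the rank needing the most memory. If memory pressure turns out to
be part of the problem, the MUMPS controls can be adjusted from the PETSc
side. The sketch below uses the helper names of later PETSc releases
(PCFactorSetUpMatSolverPackage, MatMumpsSetIcntl); with the release used
here the same settings are normally reached through the options database,
e.g. -mat_mumps_icntl_14 40 on the command line, so treat the function
names as assumptions rather than the API of this installation:

  #include <petscksp.h>

  /* "ksp" is assumed to already have its operators set via
     KSPSetOperators(). */
  PetscErrorCode UseMumpsCholesky(KSP ksp)
  {
    PC             pc;
    Mat            F;                  /* factor matrix held by the PC */
    PetscErrorCode ierr;

    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCCHOLESKY); CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverPackage(pc, "mumps"); CHKERRQ(ierr);
    ierr = PCFactorSetUpMatSolverPackage(pc); CHKERRQ(ierr); /* creates F, no numerics yet */
    ierr = PCFactorGetMatrix(pc, &F); CHKERRQ(ierr);
    ierr = MatMumpsSetIcntl(F, 14, 40); CHKERRQ(ierr); /* 40 % relaxation instead of 200 % */
    ierr = MatMumpsSetIcntl(F, 22, 1); CHKERRQ(ierr);  /* optional: out-of-core factors    */
    return 0;
  }

Lowering ICNTL(14) trades memory for the risk of a MUMPS "not enough
memory" failure (INFOG(1) = -9), so it is worth adjusting only once the
hang itself is understood.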


