4 processes of externalsolver stuck with 0% CPU

mpiexec -np 4 externalsolver

[1] PetscInitialize(): PETSc successfully started: number of processors = 4
[1] PetscInitialize(): Running on machine: mat1.uibk.ac.at
[3] PetscInitialize(): PETSc successfully started: number of processors = 4
[3] PetscInitialize(): Running on machine: mat1.uibk.ac.at
[0] PetscInitialize(): PETSc successfully started: number of processors = 4
[0] PetscInitialize(): Running on machine: mat1.uibk.ac.at
[2] PetscInitialize(): PETSc successfully started: number of processors = 4
[2] PetscInitialize(): Running on machine: mat1.uibk.ac.at
[3] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647
[3] PetscCommDuplicate(): returning tag 2147483647
[1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647
[1] PetscCommDuplicate(): returning tag 2147483647
[2] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647
[2] PetscCommDuplicate(): returning tag 2147483647
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[2] PetscCommDuplicate(): returning tag 2147483646
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[3] PetscCommDuplicate(): returning tag 2147483646
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647
[0] PetscCommDuplicate(): returning tag 2147483647
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] PetscCommDuplicate(): returning tag 2147483646
[0] MatSetUpPreallocation(): Warning not preallocating matrix storage
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[1] PetscCommDuplicate(): returning tag 2147483646
[1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647
[3] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647
[3] PetscCommDuplicate(): returning tag 2147483647
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483646
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647
[0] PetscCommDuplicate(): returning tag 2147483647
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483646
[1] PetscCommDuplicate(): returning tag 2147483647
[2] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647
[2] PetscCommDuplicate(): returning tag 2147483647
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483646
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483646
[0] MatSetOption_MPISBAIJ(): Option SYMMETRIC ignored
[0] MatStashScatterBegin_Private(): No of messages: 3
[0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 11261544
[0] MatStashScatterBegin_Private(): Mesg_to: 2: size: 10384840
[0] MatStashScatterBegin_Private(): Mesg_to: 3: size: 393344
[0] MatStashScatterBegin_Private(): No of messages: 0
[0] MatAssemblyBegin_MPISBAIJ(): Stash has 2754963 entries, uses 8 mallocs.
[0] MatAssemblyBegin_MPISBAIJ(): Block-Stash has 2754963 entries, uses 8 mallocs.
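Two lines above stand out before anything else happens: "[0] MatSetUpPreallocation(): Warning not preallocating matrix storage" and the stash messages of roughly 11 MB per destination rank. The matrix is assembled without preallocation and with millions of off-process entries, which makes MatAssemblyBegin/End slow and memory-hungry even when it completes. The real fix is calling MatMPISBAIJSetPreallocation with per-row nonzero estimates inside externalsolver; as a no-code-change experiment one can preset the stash from the command line (`-matstash_initial_size` is a standard PETSc option, but the value below is only a guess taken from the "Stash has 2754963 entries" line, so treat it as an assumption):

```shell
# Sketch: same run, with the stash preallocated so the 8 mallocs
# reported by MatAssemblyBegin_MPISBAIJ are avoided.
mpiexec -np 4 externalsolver -info -matstash_initial_size 3000000
```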
[3] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 112413 X 112413, block size 1; storage space: 8404901 unneeded, 2836399 used
[3] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues is 0
[3] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 65
[1] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 113299 X 113299, block size 1; storage space: 8647559 unneeded, 2682341 used
[1] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues is 0
[1] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 62
[0] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 112871 X 112871, block size 1; storage space: 8670798 unneeded, 2616302 used
[0] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues is 0
[0] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 52
[2] MatAssemblyEnd_SeqSBAIJ(): Matrix size: 108329 X 108329, block size 1; storage space: 8247737 unneeded, 2585163 used
[2] MatAssemblyEnd_SeqSBAIJ(): Number of mallocs during MatSetValues is 0
[2] MatAssemblyEnd_SeqSBAIJ(): Most nonzeros blocks in any row is 52
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483645
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483644
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483643
[1] PetscCommDuplicate(): returning tag 2147483639
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483645
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483644
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483643
[3] PetscCommDuplicate(): returning tag 2147483639
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483645
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483644
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483643
[2] PetscCommDuplicate(): returning tag 2147483639
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483645
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483644
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483643
[0] PetscCommDuplicate(): returning tag 2147483639
[0] PetscCommDuplicate(): returning tag 2147483634
[2] PetscCommDuplicate(): returning tag 2147483634
[1] PetscCommDuplicate(): returning tag 2147483634
[3] PetscCommDuplicate(): returning tag 2147483634
[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to Seq
[0] PetscCommDuplicate(): returning tag 2147483630
[0] PetscCommDuplicate(): returning tag 2147483625
[1] PetscCommDuplicate(): returning tag 2147483630
[1] PetscCommDuplicate(): returning tag 2147483625
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483642
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483641
[1] PetscCommDuplicate(): returning tag 2147483620
[2] PetscCommDuplicate(): returning tag 2147483630
[2] PetscCommDuplicate(): returning tag 2147483625
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483642
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483641
[2] PetscCommDuplicate(): returning tag 2147483620
[3] PetscCommDuplicate(): returning tag 2147483630
[3] PetscCommDuplicate(): returning tag 2147483625
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483642
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483641
[3] PetscCommDuplicate(): returning tag 2147483620
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483640
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483639
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483638
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483642
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483641
[0] PetscCommDuplicate(): returning tag 2147483620
[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to MPI
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483640
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483639
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483638
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483640
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483639
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483638
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483640
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483639
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483638
[3] MatAssemblyEnd_SeqBAIJ(): Matrix size: 112413 X 0, block size 1; storage space: 11241300 unneeded, 0 used
[3] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues is 0
[3] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 0
[3] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 112413)/(num_localrows 112413) > 0.6. Use CompressedRow routines.
[1] MatAssemblyEnd_SeqBAIJ(): Matrix size: 113299 X 2952, block size 1; storage space: 11286236 unneeded, 43664 used
[1] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues is 0
[1] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 29
[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 110902)/(num_localrows 113299) > 0.6. Use CompressedRow routines.
[2] MatAssemblyEnd_SeqBAIJ(): Matrix size: 108329 X 9258, block size 1; storage space: 10624060 unneeded, 208840 used
[2] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues is 0
[2] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 43
[2] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 92865)/(num_localrows 108329) > 0.6. Use CompressedRow routines.
[0] MatAssemblyEnd_SeqBAIJ(): Matrix size: 112871 X 17849, block size 1; storage space: 10944569 unneeded, 342531 used
[0] MatAssemblyEnd_SeqBAIJ(): Number of mallocs during MatSetValues is 0
[0] MatAssemblyEnd_SeqBAIJ(): Most nonzeros blocks in any row is 43
[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 92260)/(num_localrows 112871) > 0.6. Use CompressedRow routines.
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] PetscCommDuplicate(): returning tag 2147483614
[0] PetscCommDuplicate(): returning tag 2147483613
[0]before KSPsetup
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483637
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483636
[0] PetscCommDuplicate(): returning tag 2147483612
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[1] PetscCommDuplicate(): returning tag 2147483614
[1] PetscCommDuplicate(): returning tag 2147483613
[1]before KSPsetup
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483637
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483636
[1] PetscCommDuplicate(): returning tag 2147483612
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[2] PetscCommDuplicate(): returning tag 2147483614
[2] PetscCommDuplicate(): returning tag 2147483613
[2]before KSPsetup
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483637
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483636
[2] PetscCommDuplicate(): returning tag 2147483612
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483635
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483634
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483635
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483634
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[3] PetscCommDuplicate(): returning tag 2147483614
[3] PetscCommDuplicate(): returning tag 2147483613
[3]before KSPsetup
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483637
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483636
[3] PetscCommDuplicate(): returning tag 2147483612
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483635
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483634
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483635
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483634
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483633
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[1] PetscCommDuplicate(): returning tag 2147483632
[1] PetscCommDuplicate(): returning tag 2147483607
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483633
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[2] PetscCommDuplicate(): returning tag 2147483632
[2] PetscCommDuplicate(): returning tag 2147483607
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483633
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): returning tag 2147483632
[0] PetscCommDuplicate(): returning tag 2147483607
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483633
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[3] PetscCommDuplicate(): returning tag 2147483632
[3] PetscCommDuplicate(): returning tag 2147483607
[0] PetscCommDuplicate(): returning tag 2147483602
[1] PetscCommDuplicate(): returning tag 2147483602
[2] PetscCommDuplicate(): returning tag 2147483602
[3] PetscCommDuplicate(): returning tag 2147483602
[0] VecScatterCreate(): Special case: processor zero gets entire parallel vector, rest get none

DMUMPS 4.8.4
L D L^T Solver for symmetric positive definite matrices
Type of parallelism: Working host

****** ANALYSIS STEP ********

Density: NBdense, Average, Median = 0 49 50
Ordering based on METIS
** Peak of sequential stack size (number of real entries) : 16763905.
A root of estimated size 2965 has been selected for Scalapack.

Leaving analysis phase with ...
 INFOG(1)                                       = 0
 INFOG(2)                                       = 0
 -- (20) Number of entries in factors (estim.)  = 120657071
 -- (3)  Storage of factors (REAL, estimated)   = 137362626
 -- (4)  Storage of factors (INT, estimated)    = 6167135
 -- (5)  Maximum frontal size (estimated)       = 3705
 -- (6)  Number of nodes in the tree            = 21863
 -- (7)  Ordering option effectively used      = 5
 ICNTL(6) Maximum transversal option            = 0
 ICNTL(7) Pivot order option                    = 7
 Percentage of memory relaxation (effective)    = 200
 Number of level 2 nodes                        = 5
 Number of split nodes                          = 0
 RINFO(1) Operations during elimination (estim) = 1.114D+11
 Distributed matrix entry format (ICNTL(18))    = 3

** Rank of proc needing largest memory in IC facto     : 0
** Estimated corresponding MBYTES for IC facto         : 1112
** Estimated avg. MBYTES per work. proc at facto (IC)  : 1083
** TOTAL space in MBYTES for IC factorization          : 4333
** Rank of proc needing largest memory for OOC facto   : 1
** Estimated corresponding MBYTES for OOC facto        : 465
** Estimated avg. MBYTES per work. proc at facto (OOC) : 421
** TOTAL space in MBYTES for OOC factorization         : 1684

****** FACTORIZATION STEP ********

GLOBAL STATISTICS PRIOR NUMERICAL FACTORIZATION ...
 NUMBER OF WORKING PROCESSES         = 4
 REAL SPACE FOR FACTORS              = 137362626
 INTEGER SPACE FOR FACTORS           = 6167135
 MAXIMUM FRONTAL SIZE (ESTIMATED)    = 3705
 NUMBER OF NODES IN THE TREE         = 21863
 Maximum effective relaxed size of S = 125678965
 Average effective relaxed size of S = 122625031

REDISTRIB: TOTAL DATA LOCAL/SENT = 2497290 8939630
GLOBAL TIME FOR MATRIX DISTRIBUTION = 0.6324
** Memory relaxation parameter ( ICNTL(14) )          : 200
** Rank of processor needing largest memory in facto  : 0
** Space in MBYTES used by this processor for facto   : 1112
** Avg. Space in MBYTES per working proc during facto : 1083
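The log stops inside the MUMPS factorization step with no error message, which fits processes blocked in MPI (0% CPU) rather than computing or crashing. Two things worth trying; the `-mat_mumps_icntl_14` option name comes from PETSc's MUMPS interface and the value is only an illustration, so treat both as assumptions to check against your PETSc version:

```shell
# 1) Give MUMPS more memory headroom: raise the relaxation ICNTL(14)
#    (the run above already used 200%); option name assumed from the
#    PETSc-MUMPS interface.
mpiexec -np 4 externalsolver -info -mat_mumps_icntl_14 300

# 2) See where a stuck rank is actually waiting: attach gdb to one of
#    the four PIDs, print its backtrace, then detach without killing it.
gdb --batch -p "$(pgrep -f externalsolver | head -n 1)" -ex bt -ex detach
```

If every rank's backtrace ends in an MPI wait inside MatAssembly or the MUMPS factorization, that points at a communication mismatch or memory exhaustion rather than a PETSc configuration problem.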