Parallel matrix assembly - SetValues Problem?

Bernhard Kubicek bernhard.kubicek at arsenal.ac.at
Tue Feb 19 04:12:45 CST 2008


Dear List,

sorry to bother you, but I just finished reading the whole archive and
couldn't find a solution to a problem of mine that has been bothering
me for more than seven days now.

The problem is that my code produces different matrices when run in
parallel than when run on a single CPU.

I partition the mesh manually using METIS. This yields, for each CPU, a
list of finite-volume elements that I want stored locally, together
with a renumbering that is managed internally.
I create my matrix with

MatCreateMPIAIJ(PETSC_COMM_WORLD, mycount, mycount,
                PETSC_DETERMINE, PETSC_DETERMINE,
                50, PETSC_NULL, 50, PETSC_NULL, &A);

where mycount is different on each CPU and is the aforementioned number
of elements I wish to store locally.
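
For context, the right-hand-side vector rhs is not shown above; a
minimal sketch, assuming it is laid out to match the matrix rows:

/* assumption: rhs uses the same local row distribution as A */
VecCreateMPI(PETSC_COMM_WORLD, mycount, PETSC_DETERMINE, &rhs);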


For each local row/element, a user routine calculates, in one pass, the
matrix entries, their column positions, and the right-hand-side value
for that row. I output those for debugging. Within the loop over the
local rows, I call

MatSetValues(A,1,&i,nrEntries,entries,v,INSERT_VALUES);
VecSetValue(rhs,i,rhsval,INSERT_VALUES);

in this order.
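
As a minimal sketch, the whole loop looks roughly like this (computeRow
is a hypothetical stand-in for the user calculation described above;
the array bound of 50 matches the preallocation):

PetscInt    i, rstart, rend, nrEntries;
PetscInt    entries[50];   /* column indices for row i                */
PetscScalar v[50], rhsval; /* matrix values and rhs value for row i   */

MatGetOwnershipRange(A, &rstart, &rend);  /* rows owned by this CPU */
for (i = rstart; i < rend; i++) {
  computeRow(i, &nrEntries, entries, v, &rhsval); /* hypothetical */
  MatSetValues(A, 1, &i, nrEntries, entries, v, INSERT_VALUES);
  VecSetValue(rhs, i, rhsval, INSERT_VALUES);
}
/* nothing uses A or rhs before assembly completes */
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
VecAssemblyBegin(rhs);
VecAssemblyEnd(rhs);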

When I run on one CPU, everything works nicely. A 3D mesh of a
10-element-long bar, with each element having volume 1, produces the
following matrix:
 -2.  1.  0.  0.  0.  0.  0.  0.  0.  1. 
 1.  -2.  1.  0.  0.  0.  0.  0.  0.  0. 
 0.  1.  -2.  1.  0.  0.  0.  0.  0.  0. 
 0.  0.  1.  -2.  1.  0.  0.  0.  0.  0. 
 0.  0.  0.  1.  -3.  0.  0.  0.  0.  0. 
 0.  0.  0.  0.  0.  -3.  1.  0.  0.  0. 
 0.  0.  0.  0.  0.  1.  -2.  1.  0.  0. 
 0.  0.  0.  0.  0.  0.  1.  -2.  1.  0. 
 0.  0.  0.  0.  0.  0.  0.  1.  -2.  1. 
 1.  0.  0.  0.  0.  0.  0.  0.  1.  -2. 

rhs
0
0
0
0
0
-2
0
0
0
0
The CPU sets the matrix and rhs like this (global matrix row:
column/value ... column/value | rhs-value):
Row 0 cols:0/-2	9/1 1/1		|	0
Row 1 cols:1/-2	0/1 2/1		|	0
Row 2 cols:2/-2	1/1 3/1		|	0
Row 3 cols:3/-2	2/1 4/1		|	0
Row 4 cols:4/-3	3/1	|	0
Row 5 cols:5/-3	6/1	|	-2
Row 6 cols:6/-2	5/1 7/1		|	0
Row 7 cols:7/-2	6/1 8/1		|	0
Row 8 cols:8/-2	7/1 9/1		|	0
Row 9 cols:9/-2	0/1 8/1		|	0
Because of the meshing, the central rows of the matrix belong to the
outermost elements, on which wall boundary conditions 0 and 1 are set
(Laplace equation).

On 2 CPUs the matrix is different, although the global-to-local element
renumbering is de facto the identity (CPU 0: rows 0-4, CPU 1: rows
5-9):
 1.  1.  0.  0.  0.  0.  0.  0.  0.  0. 
 1.  -2.  1.  0.  0.  0.  0.  0.  0.  0. 
 0.  1.  -2.  1.  0.  0.  0.  0.  0.  0. 
 0.  0.  1.  -2.  1.  0.  0.  0.  0.  0. 
 0.  0.  0.  1.  -3.  0.  0.  0.  0.  0. 
 0.  0.  0.  0.  0.  -3.  1.  0.  0.  0. 
 0.  0.  0.  0.  0.  1.  -2.  1.  0.  0. 
 0.  0.  0.  0.  0.  0.  1.  -2.  1.  0. 
 0.  0.  0.  0.  0.  0.  0.  1.  -2.  1. 
 1.  0.  0.  0.  0.  0.  0.  0.  1.  -2. 
Process [0]
0
0
0
0
0
Process [1]
-2
0
0
0
0
Here CPU 0 sets:
Row 0 cols:0/-2	9/1 1/1	|0
Row 1 cols:1/-2	0/1 2/1	|0
Row 2 cols:2/-2	1/1 3/1	|0
Row 3 cols:3/-2	2/1 4/1	|0
Row 4 cols:4/-3	3/1 |0

and CPU 1:
Row 5 cols:5/-3	6/1|-2
Row 6 cols:6/-2	5/1 7/1	|0
Row 7 cols:7/-2	6/1 8/1	|0
Row 8 cols:8/-2	7/1 9/1	|0
Row 9 cols:9/-2	0/1 8/1	|0
I have triple-checked that the ...SetValues calls receive exactly the
same values as on one CPU, that nothing is set twice, and that every
CPU sets its correct columns. The same holds for more sophisticated
renumberings.

Attached are the outputs when run with -info.

My current guess is that I create the matrix incorrectly, or that I
must not mix the setting of Vec and Mat values before their respective
Mat/VecAssemblyBegin/End calls.
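
For reference, one thing I can check is the actual row ownership
reported by PETSc; a minimal sketch (added here purely for
illustration, not part of my code):

int      rank;
PetscInt rstart, rend;

MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
/* values set into rows outside [rstart,rend) are stashed locally and
   communicated to their owner during MatAssemblyBegin/End */
MatGetOwnershipRange(A, &rstart, &rend);
PetscPrintf(PETSC_COMM_SELF, "[%d] owns rows %d to %d\n",
            rank, rstart, rend - 1);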

If anyone has any idea where the problem lies, it would be extremely
nice if you could help me out here.

Thank you very much, even for the slightest help
 Bernhard Kubicek

------
Physics Doctorate Student, Techn. University of Vienna, Austria
Freelancer, arsenal research, Vienna, Austria





-------------- next part: -info output of the 2-CPU run --------------
[1] PetscInitialize(): PETSc successfully started: number of processors = 2
[1] PetscGetHostName(): Rejecting domainname, likely is NIS node99.(none)
[1] PetscInitialize(): Running on machine: node99
Trying to read Gmsh .msh file "stab.gmsh"
Gmsh2 file format recognised
[0] PetscInitialize(): PETSc successfully started: number of processors = 2
[0] PetscGetHostName(): Rejecting domainname, likely is NIS node99.(none)
[0] PetscInitialize(): Running on machine: node99
Read 44 nodes.
alltogether number of elements including faces, ignored:61
Trying to read Gmsh .msh file "stab.gmsh"
Gmsh2 file format recognised
Read 44 nodes.
alltogether number of elements including faces, ignored:61
[1] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823
[1] PetscCommDuplicate():   returning tag 1073741823
[1] PetscCommDuplicate(): Duplicating a communicator 92 143 max tags = 1073741823
[1] PetscCommDuplicate():   returning tag 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741822
[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[1] PetscCommDuplicate():   returning tag 1073741820
[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741819
[0] PetscCommDuplicate():   returning tag 1073741814
[1] PetscCommDuplicate():   returning tag 1073741809
[1] MatStashScatterBegin_Private(): No of messages: 0 
[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
[1] MatStashScatterBegin_Private(): No of messages: 0 
[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 237 unneeded,13 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741821
[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter
[1] PetscCommDuplicate():   returning tag 1073741802
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741820
[1] PetscCommDuplicate():   returning tag 1073741801
[1] PetscCommDuplicate():   returning tag 1073741796
[1] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to Seq
[0] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 249 unneeded,1 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 0 unneeded,13 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741819
[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter
[1] PetscCommDuplicate():   returning tag 1073741793
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741818
[1] PetscCommDuplicate():   returning tag 1073741792
[1] PetscCommDuplicate():   returning tag 1073741787
[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterLocalOptimizeCopy_Private(): Local scatter is a copy, optimizing for it
[0] VecScatterCreate(): General case: MPI to Seq
[1] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 0 unneeded,1 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] Mat_CheckCompressedRow(): Skip check. m: 5, n: 1,M: 5, N: 1,nrows: 1, ii: 0x89153a0, type: seqaij
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] Mat_CheckCompressedRow(): Skip check. m: 5, n: 1,M: 5, N: 1,nrows: 1, ii: 0x8917588, type: seqaij
[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.
[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.
[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741784
[1] PetscCommDuplicate():   returning tag 1073741783
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741817
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741816
[1] MatStashScatterBegin_Private(): No of messages: 1 
[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
[1] MatAssemblyBegin_MPIAIJ(): Stash has 14 entries, uses 0 mallocs.
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 133 unneeded,27 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 10
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 0 X 0; storage space: 0 unneeded,0 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[1] Mat_CheckInode(): Found 0 nodes of 0. Limit used: 5. Using Inode routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741815
[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter
[0] PetscCommDuplicate():   returning tag 1073741779
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741814
[1] PetscCommDuplicate():   returning tag 1073741814
[0] PetscCommDuplicate():   returning tag 1073741778
[1] PetscCommDuplicate():   returning tag 1073741773
[1] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to Seq
[1] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 0 X 0; storage space: 0 unneeded,0 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 10)/(num_localrows 10) > 0.6. Use CompressedRow routines.
[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 0) > 0.6. Use CompressedRow routines.
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741813
[1] PetscCommDuplicate():   returning tag 1073741813
 1.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -3.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  -3.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00 
 1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00 
[1] PetscCommDuplicate():   returning tag 1073741770
Process [0]
0
0
0
0
0
Process [1]
-2
0
0
0
0
[0] PetscCommDuplicate():   returning tag 1073741769
[1] PetscCommDuplicate():   returning tag 1073741769
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741768
[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741767
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[1] PetscCommDuplicate():   returning tag 1073741766
[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[1] PetscCommDuplicate():   returning tag 1073741765
[0] PetscCommDuplicate():   returning tag 1073741764
[1] PetscCommDuplicate():   returning tag 1073741763
[1] PetscCommDuplicate():   returning tag 1073741762
[0] PetscCommDuplicate():   returning tag 1073741761
[1] PetscCommDuplicate():   returning tag 1073741756
[0] PetscCommDuplicate():   returning tag 1073741751
[0] PetscCommDuplicate():   returning tag 1073741746
[0] PetscCommDuplicate():   returning tag 1073741741
[1] PetscCommDuplicate():   returning tag 1073741736
[0] PCSetUp(): Setting up new PC
[1] PetscCommDuplicate():   returning tag 1073741731
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741812
[1] MatIncreaseOverlap_MPIAIJ_Receive(): Allocated 1 bytes, required 3 bytes, no of mallocs = 0
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741811
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741810
[1] PetscCommDuplicate():   returning tag 1073741809
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741808
[1] PetscCommDuplicate():   returning tag 1073741722
[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterLocalOptimizeCopy_Private(): Local scatter is a copy, optimizing for it
[1] VecScatterLocalOptimizeCopy_Private(): Local scatter is a copy, optimizing for it
[0] VecScatterCreate(): General case: MPI to Seq
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741807
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741806
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741805
[1] PetscCommDuplicate():   returning tag 1073741805
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 6 X 6; storage space: 0 unneeded,16 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[1] Mat_CheckInode(): Found 6 nodes out of 6 rows. Not using Inode routines
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741804
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741803
[1] PetscCommDuplicate():   returning tag 1073741803
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[0] PetscCommDuplicate():   returning tag 1073741802
[1] PetscCommDuplicate():   returning tag 1073741802
[1] MatILUFactorSymbolic_SeqAIJ(): Reallocs 0 Fill ratio:given 50 needed 0.9375
[1] MatILUFactorSymbolic_SeqAIJ(): Run with -[sub_]pc_factor_fill 0.9375 or use 
[0] MatILUFactorSymbolic_SeqAIJ(): PCFactorSetFill([sub]pc,0.928571);
[0] MatILUFactorSymbolic_SeqAIJ(): for best performance.
[1] MatILUFactorSymbolic_SeqAIJ(): for best performance.
[0] PetscCommDuplicate():   returning tag 1073741801
[1] PetscCommDuplicate():   returning tag 1073741801
[1] Mat_CheckInode(): Found 6 nodes out of 6 rows. Not using Inode routines
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Detected zero pivot in LU factorization
see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#ZeroPivot!
[1]PETSC ERROR: Zero pivot row 5 value 0 tolerance 0 * rowsum 0!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 2.3.3, Patch 8, Fri Nov 16 17:03:40 CST 2007 HG revision: 414581156e67e55c761739b0deb119f7590d0f4b
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: /media/sdb1/bernhard/svnl/bogen/Cpp/newmesh/bin/./main on a omp_deb_m named node99 by bkubicek Tue Feb 19 11:04:15 2008
[1]PETSC ERROR: Libraries linked from /home/bkubicek/750/Software/petsc-2.3.3-p8/lib/omp_deb_mpi_cxx
[1]PETSC ERROR: Configure run at Thu Jan 31 10:02:09 2008
[1]PETSC ERROR: Configure options --with-clanguage=c++ --with-x=0 --with-debugging=1 --with-shared=0 --with-default-arch=0 --with-mpi=1 COPTFLAGS=' -O2 -march=pentium4 -mtune=pentium4 ' FOPTFLAGS='-I -O2 -march=pentium4 -mtune=pentium4 '
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 529 in src/mat/impls/aij/seq/aijfact.c
[1]PETSC ERROR: MatLUFactorNumeric() line 2227 in src/mat/interface/matrix.c
[1]PETSC ERROR: PCSetUp_ILU() line 564 in src/ksp/pc/impls/factor/ilu/ilu.c
[1]PETSC ERROR: PCSetUp() line 787 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUp() line 234 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: PCSetUpOnBlocks_ASM() line 224 in src/ksp/pc/impls/asm/asm.c
[1]PETSC ERROR: PCSetUpOnBlocks() line 820 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUpOnBlocks() line 158 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: KSPSolve() line 348 in src/ksp/ksp/interface/itfunc.c
[1] PCSetUp(): Setting up new PC
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741800
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741799
[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
[1] PetscCommDuplicate():   returning tag 1073741798
[1] MatILUFactorSymbolic_SeqAIJ(): Reallocs 0 Fill ratio:given 50 needed 0.9375
[1] MatILUFactorSymbolic_SeqAIJ(): Run with -[sub_]pc_factor_fill 0.9375 or use 
[1] MatILUFactorSymbolic_SeqAIJ(): PCFactorSetFill([sub]pc,0.9375);
[1] MatILUFactorSymbolic_SeqAIJ(): for best performance.
[1] PetscCommDuplicate():   returning tag 1073741797
[1] Mat_CheckInode(): Found 6 nodes out of 6 rows. Not using Inode routines
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Detected zero pivot in LU factorization
see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#ZeroPivot!
[1]PETSC ERROR: Zero pivot row 5 value 0 tolerance 0 * rowsum 0!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 2.3.3, Patch 8, Fri Nov 16 17:03:40 CST 2007 HG revision: 414581156e67e55c761739b0deb119f7590d0f4b
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: /media/sdb1/bernhard/svnl/bogen/Cpp/newmesh/bin/./main on a omp_deb_m named node99 by bkubicek Tue Feb 19 11:04:15 2008
[1]PETSC ERROR: Libraries linked from /home/bkubicek/750/Software/petsc-2.3.3-p8/lib/omp_deb_mpi_cxx
[1]PETSC ERROR: Configure run at Thu Jan 31 10:02:09 2008
[1]PETSC ERROR: Configure options --with-clanguage=c++ --with-x=0 --with-debugging=1 --with-shared=0 --with-default-arch=0 --with-mpi=1 COPTFLAGS=' -O2 -march=pentium4 -mtune=pentium4 ' FOPTFLAGS='-I -O2 -march=pentium4 -mtune=pentium4 '
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 529 in src/mat/impls/aij/seq/aijfact.c
[1]PETSC ERROR: MatLUFactorNumeric() line 2227 in src/mat/interface/matrix.c
[1]PETSC ERROR: PCSetUp_ILU() line 564 in src/ksp/pc/impls/factor/ilu/ilu.c
[1]PETSC ERROR: PCSetUp() line 787 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUp() line 234 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: PCSetUpOnBlocks_ASM() line 224 in src/ksp/pc/impls/asm/asm.c
[1]PETSC ERROR: PCSetUpOnBlocks() line 820 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUpOnBlocks() line 158 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: KSPSolve() line 348 in src/ksp/ksp/interface/itfunc.c
PETSC_ERROR: Line 250 File: matrix.cpp
Child process exited unexpectedly 0
Aborted (core dumped)
-------------- next part: -info output of the 1-CPU run --------------
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS node99.(none)
[0] PetscInitialize(): Running on machine: node99
Trying to read Gmsh .msh file "stab.gmsh"
Gmsh2 file format recognised
Read 44 nodes.
alltogether number of elements including faces, ignored:61
[0] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823
[0] PetscCommDuplicate():   returning tag 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741822
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741821
[0] PetscCommDuplicate():   returning tag 1073741820
[0] PetscCommDuplicate():   returning tag 1073741819
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 472 unneeded,28 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 0 unneeded,28 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741818
 -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00 
 1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -3.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  -3.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00  0.00000e+00 
 0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00  1.00000e+00 
 1.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  -2.00000e+00 
[0] PetscCommDuplicate():   returning tag 1073741817
0
0
0
0
0
-2
0
0
0
0
[0] PetscCommDuplicate():   returning tag 1073741816
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741815
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741814
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741813
[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
[0] PetscCommDuplicate():   returning tag 1073741812
[0] PetscCommDuplicate():   returning tag 1073741811
[0] PetscCommDuplicate():   returning tag 1073741810
[0] PetscCommDuplicate():   returning tag 1073741809
[0] PetscCommDuplicate():   returning tag 1073741808
[0] PetscCommDuplicate():   returning tag 1073741807
[0] PetscCommDuplicate():   returning tag 1073741806
[0] PetscCommDuplicate():   returning tag 1073741805
[0] PetscCommDuplicate():   returning tag 1073741804
[0] PetscCommDuplicate():   returning tag 1073741803
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate():   returning tag 1073741802
[0] PetscCommDuplicate(): Duplicating a communicator 92 149 max tags = 1073741823
[0] PetscCommDuplicate():   returning tag 1073741823
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 92
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm 149
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 149
[0] Petsc_DelTag(): Deleting tag data in an MPI_Comm 149
[0] PetscCommDuplicate(): Duplicating a communicator 92 149 max tags = 1073741823
[0] PetscCommDuplicate():   returning tag 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741822
[0] PetscCommDuplicate():   returning tag 1073741821
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741820
[0] PetscCommDuplicate():   returning tag 1073741801
[0] VecScatterCreate(): Special case: sequential vector general to stride
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741819
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741818
[0] PetscCommDuplicate():   returning tag 1073741800
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 0 unneeded,28 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741817
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741816
[0] PetscCommDuplicate(): Using internal PETSc communicator 92 149
[0] PetscCommDuplicate():   returning tag 1073741815
[0] MatILUFactorSymbolic_SeqAIJ(): Reallocs 0 Fill ratio:given 50 needed 1
[0] MatILUFactorSymbolic_SeqAIJ(): Run with -[sub_]pc_factor_fill 1 or use 
[0] MatILUFactorSymbolic_SeqAIJ(): PCFactorSetFill([sub]pc,1);
[0] MatILUFactorSymbolic_SeqAIJ(): for best performance.
[0] PetscCommDuplicate():   returning tag 1073741799
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines
[0] PetscCommDuplicate():   returning tag 1073741798
[0] KSPDefaultConverged(): user has provided nonzero initial guess, computing 2-norm of preconditioned RHS
  0 KSP Residual norm 3.146479381785e+02 
  1 KSP Residual norm 5.527758587503e-14 
  2 KSP Residual norm 2.957120339776e-14 
[0] PetscCommDuplicate():   returning tag 1073741797
[0] PetscCommDuplicate():   returning tag 1073741796
[0] PetscCommDuplicate():   returning tag 1073741795
[0] PetscCommDuplicate():   returning tag 1073741794
[0] PetscCommDuplicate():   returning tag 1073741793
[0] PetscCommDuplicate():   returning tag 1073741792
[0] PetscCommDuplicate():   returning tag 1073741791
[0] PetscCommDuplicate():   returning tag 1073741790
[0] PetscCommDuplicate():   returning tag 1073741789
[0] PetscCommDuplicate():   returning tag 1073741788
  3 KSP Residual norm 2.311953842658e-14 
  4 KSP Residual norm 1.949598161624e-14 
  5 KSP Residual norm 1.718379006549e-14 
  6 KSP Residual norm 1.553657802016e-14 
  7 KSP Residual norm 1.428735337435e-14 
  8 KSP Residual norm 1.329791896231e-14 
  9 KSP Residual norm 1.248915022855e-14 
 10 KSP Residual norm 1.181200860607e-14 
 11 KSP Residual norm 1.123427218264e-14 
 12 KSP Residual norm 1.073378041180e-14 
[0] PetscCommDuplicate():   returning tag 1073741787
[0] PetscCommDuplicate():   returning tag 1073741786
[0] PetscCommDuplicate():   returning tag 1073741785
[0] PetscCommDuplicate():   returning tag 1073741784
[0] PetscCommDuplicate():   returning tag 1073741783
[0] PetscCommDuplicate():   returning tag 1073741782
[0] PetscCommDuplicate():   returning tag 1073741781
[0] PetscCommDuplicate():   returning tag 1073741780
[0] PetscCommDuplicate():   returning tag 1073741779
[0] PetscCommDuplicate():   returning tag 1073741778
 13 KSP Residual norm 1.029472464285e-14 
 14 KSP Residual norm 9.905483966479e-15 
 15 KSP Residual norm 9.557299004528e-15 
 16 KSP Residual norm 9.243425362697e-15 
 17 KSP Residual norm 8.958574339974e-15 
 18 KSP Residual norm 8.698532377314e-15 
 19 KSP Residual norm 8.459895437148e-15 
 20 KSP Residual norm 8.239879423284e-15 
 21 KSP Residual norm 8.036182185186e-15 
 22 KSP Residual norm 7.846881299150e-15 
[0] PetscCommDuplicate():   returning tag 1073741777
[0] PetscCommDuplicate():   returning tag 1073741776
[0] PetscCommDuplicate():   returning tag 1073741775
[0] PetscCommDuplicate():   returning tag 1073741774
[0] PetscCommDuplicate():   returning tag 1073741773
[0] PetscCommDuplicate():   returning tag 1073741772
[0] PetscCommDuplicate():   returning tag 1073741771
[0] PetscCommDuplicate():   returning tag 1073741770
[0] PetscCommDuplicate():   returning tag 1073741769
[0] PetscCommDuplicate():   returning tag 1073741768
 23 KSP Residual norm 7.670357156941e-15 
 24 KSP Residual norm 7.505234275465e-15 
 25 KSP Residual norm 7.350335936204e-15 
 26 KSP Residual norm 7.204648718167e-15 
 27 KSP Residual norm 7.067294471298e-15 
 28 KSP Residual norm 6.937507953310e-15 
 29 KSP Residual norm 6.814618825309e-15 
 30 KSP Residual norm 6.698037036492e-15 
 31 KSP Residual norm 6.587240868887e-15 
 32 KSP Residual norm 6.481767088291e-15 
[0] PetscCommDuplicate():   returning tag 1073741767
[0] PetscCommDuplicate():   returning tag 1073741766
[0] PetscCommDuplicate():   returning tag 1073741765
[0] PetscCommDuplicate():   returning tag 1073741764
 33 KSP Residual norm 6.381202776463e-15 
 34 KSP Residual norm 6.285178515610e-15 
 35 KSP Residual norm 8.829047469026e-14 
[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 2.7398e-30 is less than absolute tolerance 1e-15 at iteration 36
 36 KSP Residual norm 2.739801485312e-30 
KSP Object:
  type: gmres
    GMRES: restart=35, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000
  tolerances:  relative=1e-15, absolute=1e-15, divergence=10000
  left preconditioning
PC Object:
  type: asm
    Additive Schwarz: total subdomain blocks = 1, amount of overlap = 1
    Additive Schwarz: restriction/interpolation type - RESTRICT
    Local solve is same for all blocks, in the following KSP and PC objects:
    KSP Object:(sub_)
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
    PC Object:(sub_)
      type: ilu
        ILU: 15 levels of fill
        ILU: factor fill ratio allocated 50
        ILU: tolerance for zero pivot 1e-12
             out-of-place factorization
             matrix ordering: rcm
        ILU: factor fill ratio needed 1
             Factored matrix follows
            Matrix Object:
              type=seqaij, rows=10, cols=10
              total: nonzeros=28, allocated nonzeros=28
                not using I-node routines
      linear system matrix = precond matrix:
      Matrix Object:
        type=seqaij, rows=10, cols=10
        total: nonzeros=28, allocated nonzeros=28
          not using I-node routines
[0] PetscCommDuplicate():   returning tag 1073741763
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=500
      not using I-node routines
[0] PetscCommDuplicate():   returning tag 1073741762
[0] PetscCommDuplicate():   returning tag 1073741761
[0] KSPDefaultConverged(): user has provided nonzero initial guess, computing 2-norm of preconditioned RHS
[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 5.27447e-16 is less than absolute tolerance 1e-15 at iteration 0
  0 KSP Residual norm 5.274472300304e-16 
KSP Object:
  type: gmres
    GMRES: restart=35, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000
  tolerances:  relative=1e-15, absolute=1e-15, divergence=10000
  left preconditioning
PC Object:
  type: asm
    Additive Schwarz: total subdomain blocks = 1, amount of overlap = 1
    Additive Schwarz: restriction/interpolation type - RESTRICT
    Local solve is same for all blocks, in the following KSP and PC objects:
    KSP Object:(sub_)
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
    PC Object:(sub_)
      type: ilu
        ILU: 15 levels of fill
        ILU: factor fill ratio allocated 50
        ILU: tolerance for zero pivot 1e-12
             out-of-place factorization
             matrix ordering: rcm
        ILU: factor fill ratio needed 1
             Factored matrix follows
            Matrix Object:
              type=seqaij, rows=10, cols=10
              total: nonzeros=28, allocated nonzeros=28
                not using I-node routines
      linear system matrix = precond matrix:
      Matrix Object:
        type=seqaij, rows=10, cols=10
        total: nonzeros=28, allocated nonzeros=28
          not using I-node routines
[0] PetscCommDuplicate():   returning tag 1073741760
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=500
      not using I-node routines
[0] PetscCommDuplicate():   returning tag 1073741759
Bnds: 2 NBnds:58
[0] PetscCommDuplicate():   returning tag 1073741758
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm 141
[0] PetscCommDuplicate():   returning tag 1073741757
OptionTable: -info
OptionTable: -ksp_atol 1.e-15
OptionTable: -ksp_gmres_restart 35
OptionTable: -ksp_max_it 1000
OptionTable: -ksp_monitor
OptionTable: -ksp_rtol 1.e-15
OptionTable: -ksp_view
OptionTable: -options_left
OptionTable: -pc_type asm
OptionTable: -sub_pc_factor_fill 50
OptionTable: -sub_pc_factor_levels 15
OptionTable: -sub_pc_factor_mat_ordering_type rcm
OptionTable: -sub_pc_type ilu
There are no unused options.

