<div>Hi,</div>
<div> </div>
<div>I'm trying to run my CFD code in parallel using PETSc. Besides the linear equations handed to PETSc, other parts of the code have also been parallelized with MPI.</div>
<div> </div>
<div>However, I find that the parallel version of the code running on 4 processors is even slower than the sequential version.</div>
<div> </div>
<div>In order to find out why, I ran with the -info option to print out the details. Two linear systems are being solved, one for the momentum equations and one for the Poisson equation; the momentum system is twice the size of the Poisson one. The output is shown below:
</div>
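<div> </div>
<div>For reference, the run is launched roughly like this (the executable name is just a placeholder; mpiexec would be the equivalent launcher on other MPI installs):</div>
<pre>
mpirun -np 4 ./cfd_solver -info
</pre>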
<div> </div>
<div>[0] User provided function(): (Fortran):PETSc successfully started: procs 4<br>[1] User provided function(): (Fortran):PETSc successfully started: procs 4<br>[3] User provided function(): (Fortran):PETSc successfully started: procs 4
<br>[2] User provided function(): (Fortran):PETSc successfully started: procs 4<br>[0] PetscGetHostName(): Rejecting domainname, likely is NIS atlas2-c12.(none)<br>[0] User provided function(): Running on machine: atlas2-c12
<br>[1] PetscGetHostName(): Rejecting domainname, likely is NIS atlas2-c12.(none)<br>[1] User provided function(): Running on machine: atlas2-c12<br>[3] PetscGetHostName(): Rejecting domainname, likely is NIS atlas2-c08.(none)
<br>[3] User provided function(): Running on machine: atlas2-c08<br>[2] PetscGetHostName(): Rejecting domainname, likely is NIS atlas2-c08.(none)<br>[2] User provided function(): Running on machine: atlas2-c08<br>[0] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823
<br>[1] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823<br>[2] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823<br>[3] PetscCommDuplicate(): Duplicating a communicator 91 141 max tags = 1073741823
<br>[0] PetscCommDuplicate(): Duplicating a communicator 92 143 max tags = 1073741823<br>[2] PetscCommDuplicate(): Duplicating a communicator 92 143 max tags = 1073741823<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Duplicating a communicator 92 143 max tags = 1073741823
<br>[3] PetscCommDuplicate(): Duplicating a communicator 92 143 max tags = 1073741823<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br> 0 3200<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br> 3200 6400<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br> 6400 9600<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br> 9600 12800<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br> 3200 6400
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br> 9600 12800<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 91 141
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 91 141<br> 0 3200<br> 6400 9600<br>[1] MatStashScatterBegin_Private(): No of messages: 0<br>[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
<br>[3] MatStashScatterBegin_Private(): No of messages: 0<br>[3] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 3200; storage space: 4064 unneeded,53536 used
<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 3200; storage space: 4064 unneeded,53536 used<br>[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 18
<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 18<br>[3] Mat_CheckInode(): Found 1600 nodes of 3200. Limit used: 5. Using Inode routines
<br>[1] Mat_CheckInode(): Found 1600 nodes of 3200. Limit used: 5. Using Inode routines<br>[0] MatStashScatterBegin_Private(): No of messages: 0<br>[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[2] MatStashScatterBegin_Private(): No of messages: 0
<br>[2] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 3200; storage space: 4064 unneeded,53536 used<br>[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
<br>[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 18<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 3200; storage space: 3120 unneeded,54480 used<br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 18<br>[2] Mat_CheckInode(): Found 1600 nodes of 3200. Limit used: 5. Using Inode routines<br>[0] Mat_CheckInode(): Found 1600 nodes of 3200. Limit used: 5. Using Inode routines
<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[1] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 640; storage space: 53776 unneeded,3824 used<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6
<br>[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 2560)/(num_localrows 3200) > 0.6. Use CompressedRow routines.<br>[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[0] VecScatterCreate(): General case: MPI to Seq
<br>[0] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 320; storage space: 55688 unneeded,1912 used<br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6<br>[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 2880)/(num_localrows 3200) > 0.6. Use CompressedRow routines.<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6
<br>[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 2880)/(num_localrows 3200) > 0.6. Use CompressedRow routines.<br>[3] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[3] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
<br>[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 320; storage space: 55688 unneeded,1912 used<br>[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6
<br>[3] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 2880)/(num_localrows 3200) > 0.6. Use CompressedRow routines.<br>[2] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[2] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
<br>[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 3200 X 640; storage space: 53776 unneeded,3824 used<br>[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6
<br>[2] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 2560)/(num_localrows 3200) > 0.6. Use CompressedRow routines.<br>[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[2] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[3] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[3] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[3] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[3] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[0] PCSetUp(): Setting up new PC<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PCSetUp(): Setting up new PC
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PCSetUp(): Setting up new PC<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PCSetUp(): Setting up new PC<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PCSetUp(): Setting up new PC<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] KSPDefaultConverged(): user has provided nonzero initial guess, computing 2-norm of preconditioned RHS<br>[0] KSPDefaultConverged(): Linear solver has converged. Residual norm
1.00217e-05 is less than relative tolerance 1e-05 times initial right hand side norm 6.98447 at iteration 5<br>[0] MatStashScatterBegin_Private(): No of messages: 0<br>[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 1600; storage space: 774 unneeded,13626 used<br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9
<br>[0] Mat_CheckInode(): Found 1600 nodes out of 1600 rows. Not using Inode routines<br>[1] MatStashScatterBegin_Private(): No of messages: 0<br>[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 1600; storage space: 1016 unneeded,13384 used
<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>[1] Mat_CheckInode(): Found 1600 nodes out of 1600 rows. Not using Inode routines
<br>[2] MatStashScatterBegin_Private(): No of messages: 0<br>[2] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 1600; storage space: 1016 unneeded,13384 used
<br>[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>[2] Mat_CheckInode(): Found 1600 nodes out of 1600 rows. Not using Inode routines
<br>[3] MatStashScatterBegin_Private(): No of messages: 0<br>[3] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 1600; storage space: 1016 unneeded,13384 used
<br>[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>[3] Mat_CheckInode(): Found 1600 nodes out of 1600 rows. Not using Inode routines
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[2] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[2] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
<br>[2] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 320; storage space: 13444 unneeded,956 used<br>[2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] VecScatterCreate(): General case: MPI to Seq
<br>[2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[2] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 1280)/(num_localrows 1600) > 0.6. Use CompressedRow routines.<br>[0] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES
<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 160; storage space: 13922 unneeded,478 used<br>[0] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 160; storage space: 13922 unneeded,478 used
<br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 1440)/(num_localrows 1600) >
0.6. Use CompressedRow routines.<br>[3] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[3] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES<br>[3] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 160; storage space: 13922 unneeded,478 used
<br>[3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[3] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 1440)/(num_localrows 1600) >
0.6. Use CompressedRow routines.<br>[1] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[1] MatSetOption_Inode(): Not using Inode routines due to MatSetOption(MAT_DO_NOT_USE_INODES<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 1600 X 320; storage space: 13444 unneeded,956 used
<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 1280)/(num_localrows 1600) >
0.6. Use CompressedRow routines.<br>[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[3] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[3] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.
<br>[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[2] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[1] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[3] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs.<br>[3] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.
<br>[1] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143 <br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PCSetUp(): Setting up new PC<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[3] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PCSetUp(): Setting up new PC<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PCSetUp(): Setting up new PC<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[2] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PCSetUp(): Setting up new PC<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PCSetUp(): Setting up new PC<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143
<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PCSetUp(): Setting up new PC
<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 92 143<br>[0] KSPDefaultConverged(): Linear solver has converged. Residual norm
8.84097e-05 is less than relative tolerance 1e-05 times initial right hand side norm 8.96753 at iteration 212<br> 1 1.000000000000000E-002 1.15678640520876<br> 0.375502846664950<br> </div>
<div> </div>
<div>I also see several messages mentioning "Seq" (for example MatAssemblyEnd_SeqAIJ). Am I running in sequential or parallel mode? And given the "storage space: ... unneeded" lines, have I preallocated too much space?</div>
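<div> </div>
<div>To make the preallocation question concrete, here is a minimal sketch of the kind of preallocated MPIAIJ creation I have in mind (in C for brevity, although my code is Fortran; the sizes and names are only illustrative, loosely taken from the -info output above):</div>
<pre>
#include "petscmat.h"

int main(int argc, char **argv)
{
  Mat      A;
  PetscInt M    = 12800;  /* global size of the momentum system (illustrative) */
  PetscInt d_nz = 18;     /* "Maximum nonzeros in any row" of the diagonal block */
  PetscInt o_nz = 6;      /* "Maximum nonzeros in any row" of the off-diagonal block */

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

  /* Passing PETSC_NULL for d_nnz/o_nnz means the single estimates d_nz/o_nz are
     used for every row; anything allocated but not filled in later shows up in
     -info as "storage space: ... unneeded". Exact per-row counts avoid that waste. */
  MatCreateMPIAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, M, M,
                  d_nz, PETSC_NULL, o_nz, PETSC_NULL, &A);

  /* ... MatSetValues() / MatAssemblyBegin() / MatAssemblyEnd() ... */

  MatDestroy(A);
  PetscFinalize();
  return 0;
}
</pre>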
<div> </div>
<div>Lastly, if Ax = b, then A_sta and A_end from <a href="file:///D:/PhD/CODES/petsc-2.3.2-p6/docs/manualpages/Mat/MatGetOwnershipRange.html#MatGetOwnershipRange">MatGetOwnershipRange</a> and b_sta and b_end from <a href="file:///D:/PhD/CODES/petsc-2.3.2-p6/docs/manualpages/Vec/VecGetOwnershipRange.html#VecGetOwnershipRange">VecGetOwnershipRange</a> should always be the same values, right?</div>
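<div> </div>
<div>In other words, for a sketch like the one below (again in C for brevity; the names and sizes are just placeholders, and both objects are left to PETSc's default layout), I would expect a_sta == b_sta and a_end == b_end on every process:</div>
<pre>
#include "petscmat.h"
#include "petscvec.h"

int main(int argc, char **argv)
{
  Mat      A;
  Vec      b;
  PetscInt N = 12800;   /* illustrative global size */
  PetscInt a_sta, a_end, b_sta, b_end;
  int      rank;

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Let PETSc choose the default row distribution for both objects. */
  MatCreateMPIAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N,
                  18, PETSC_NULL, 6, PETSC_NULL, &A);
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &b);

  MatGetOwnershipRange(A, &a_sta, &a_end);  /* first and one-past-last local row of A */
  VecGetOwnershipRange(b, &b_sta, &b_end);  /* first and one-past-last local entry of b */

  PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] A: %d to %d, b: %d to %d\n",
                          rank, a_sta, a_end, b_sta, b_end);
  PetscSynchronizedFlush(PETSC_COMM_WORLD);

  VecDestroy(b);
  MatDestroy(A);
  PetscFinalize();
  return 0;
}
</pre>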
<div> </div>
<div>Thank you.</div>
<div> </div>