[petsc-users] Regarding printing to standard output, and a possible mistake in the code comments in PetscBinaryWrite.m

Gaurish Telang gaurish108 at gmail.com
Mon Jan 17 17:20:19 CST 2011


This is what I get on running mpiexec -n 2 ./ex23 -info

Also, using mpirun in place of mpiexec with the -info option gives me
exactly the same output you see below.

As for the MPI implementation I am using, I have both Open MPI and MPICH
installed on my laptop.

While installing PETSc, some external packages were required. In the
external packages folder I can see the following software:

fblaslapack-3.1.1  mpich2-1.0.8  ParMetis-dev-p3  SuperLU_DIST_2.4-hg-v2

Possibly it is this mpich2 that should be used?
Please let me know what I should do; I am quite new to PETSc.

gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
mpiexec -n 2 ./ex23 -info
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS
gaurish108-laptop.(none)
[0] PetscInitialize(): Running on machine: gaurish108-laptop
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
max tags = 2147483647
[0] PetscCommDuplicate():   returning tag 2147483647
[0] PetscCommDuplicate():   returning tag 2147483646
[0] PetscCommDuplicate():   returning tag 2147483645
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS
gaurish108-laptop.(none)
[0] PetscInitialize(): Running on machine: gaurish108-laptop
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
max tags = 2147483647
[0] PetscCommDuplicate():   returning tag 2147483647
[0] PetscCommDuplicate():   returning tag 2147483646
[0] PetscCommDuplicate():   returning tag 2147483645
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483644
[0] MatSetUpPreallocation(): Warning not preallocating matrix storage
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22
unneeded,28 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode
routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483643
[0] PetscCommDuplicate():   returning tag 2147483642
[0] PetscCommDuplicate():   returning tag 2147483641
[0] PetscCommDuplicate():   returning tag 2147483640
[0] PetscCommDuplicate():   returning tag 2147483639
[0] PetscCommDuplicate():   returning tag 2147483638
[0] PetscCommDuplicate():   returning tag 2147483637
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate():   returning tag 2147483636
[0] PetscCommDuplicate():   returning tag 2147483635
[0] PetscCommDuplicate():   returning tag 2147483634
[0] PetscCommDuplicate():   returning tag 2147483633
[0] PetscCommDuplicate():   returning tag 2147483632
[0] PetscCommDuplicate():   returning tag 2147483631
[0] PetscCommDuplicate():   returning tag 2147483630
[0] PetscCommDuplicate():   returning tag 2147483629
[0] PetscCommDuplicate():   returning tag 2147483628
[0] PetscCommDuplicate():   returning tag 2147483627
[0] PetscCommDuplicate():   returning tag 2147483626
[0] KSPDefaultConverged(): Linear solver has converged. Residual norm
4.50879e-16 is less than relative tolerance 1e-07 times initial right hand
side norm 0.707107 at iteration 5
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483625
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:
  type: jacobi
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=50
      not using I-node routines
Norm of error < 1.e-12, Iterations 5
[0] PetscFinalize(): PetscFinalize() called
[0] PetscCommDuplicate():   returning tag 2147483624
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm
1140850688
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm
-2080374784
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483644
[0] MatSetUpPreallocation(): Warning not preallocating matrix storage
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22
unneeded,28 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode
routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483643
[0] PetscCommDuplicate():   returning tag 2147483642
[0] PetscCommDuplicate():   returning tag 2147483641
[0] PetscCommDuplicate():   returning tag 2147483640
[0] PetscCommDuplicate():   returning tag 2147483639
[0] PetscCommDuplicate():   returning tag 2147483638
[0] PetscCommDuplicate():   returning tag 2147483637
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate():   returning tag 2147483636
[0] PetscCommDuplicate():   returning tag 2147483635
[0] PetscCommDuplicate():   returning tag 2147483634
[0] PetscCommDuplicate():   returning tag 2147483633
[0] PetscCommDuplicate():   returning tag 2147483632
[0] PetscCommDuplicate():   returning tag 2147483631
[0] PetscCommDuplicate():   returning tag 2147483630
[0] PetscCommDuplicate():   returning tag 2147483629
[0] PetscCommDuplicate():   returning tag 2147483628
[0] PetscCommDuplicate():   returning tag 2147483627
[0] PetscCommDuplicate():   returning tag 2147483626
[0] KSPDefaultConverged(): Linear solver has converged. Residual norm
4.50879e-16 is less than relative tolerance 1e-07 times initial right hand
side norm 0.707107 at iteration 5
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
-2080374784
[0] PetscCommDuplicate():   returning tag 2147483625
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:
  type: jacobi
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=50
      not using I-node routines
Norm of error < 1.e-12, Iterations 5
[0] PetscFinalize(): PetscFinalize() called
[0] PetscCommDuplicate():   returning tag 2147483624
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm
1140850688
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm
-2080374784
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$


On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang <gaurish108 at gmail.com>wrote:

> Hi.
>
> I had two questions
>
> (1)
>
> I was curious to know why the following happens with the PETSc standard
> output. Having created an executable 'test', when I run it with
> mpiexec -n 2 ./test
> the same output is printed to the terminal twice. If I use 3 processes,
> the same output is printed three times.
>
> In short, the number of processes equals the number of times the PETSc
> output is printed. Could this be a mistake in my PETSc installation?
>
> For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c.
> Building the executable ex23 and running it on one and then on two
> processes gives the following terminal output:
>
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> mpiexec -n 1 ./ex23
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> mpiexec -n 2 ./ex23
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>
>
>
>
> (2)
>
> Also, I was told yesterday on the PETSc users mailing list that the MATLAB
> file PetscBinaryWrite.m converts a sparse matrix in MATLAB into PETSc binary
> format.
>     The comments near the top of the code (quoted below) say that it only
> works for square sparse matrices, but it seems to work quite well for
> rectangular sparse MATLAB matrices as well.
> I have tested this in conjunction with PetscBinaryRead.m, which reads a
> PETSc binary file back into MATLAB as a sparse matrix; a sketch of the kind
> of round trip I mean follows below.
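>
> For reference, here is a minimal sketch of that round trip (the file name
> 'rect.bin', the matrix size, and the density are arbitrary choices for
> illustration only):
>
>     % build a rectangular (10 x 7) sparse test matrix
>     A = sprand(10, 7, 0.3);
>     % write it out in PETSc binary format
>     PetscBinaryWrite('rect.bin', A);
>     % read it back into MATLAB as a sparse matrix
>     B = PetscBinaryRead('rect.bin');
>     % the difference should be zero (up to round-off)
>     norm(full(A - B))
>
> With a test of this kind the rectangular matrix appears to come back
> intact, which is why the "square only" comment puzzled me.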
>
> Is there something I might have missed, or some error that I might be
> making?
>
> Comments in PetscBinaryWrite.m
> "-================================================
> %  Writes in PETSc binary file sparse matrices and vectors
> %  if the array is multidimensional and dense it is saved
> %  as a one dimensional array
> %
> %  Only works for square sparse matrices
> %:
> ...