[petsc-users] Regarding printing to standard output, and a possible mistake in the code comments in PetscBinaryWrite.m

Gaurish Telang gaurish108 at gmail.com
Mon Jan 17 16:46:20 CST 2011


Hi.

I have two questions.

(1)

I was curious to know why the following happens with PETSc's standard
output. Having created an executable 'test', when I run it with

  mpiexec -n 2 ./test

the same output is printed to the terminal twice. If I use 3 processes,
the same output is printed three times.

In short, the number of processes equals the number of times the PETSc
output is printed. Could this be a mistake in my PETSc installation?

For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c.
After building the executable ex23 and running it first with one process
and then with two, I get the following terminal output:

gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 1 ./ex23
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:
  type: jacobi
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=50
      not using I-node routines
Norm of error < 1.e-12, Iterations 5
gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:
  type: jacobi
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=50
      not using I-node routines
Norm of error < 1.e-12, Iterations 5
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:
  type: jacobi
  linear system matrix = precond matrix:
  Matrix Object:
    type=seqaij, rows=10, cols=10
    total: nonzeros=28, allocated nonzeros=50
      not using I-node routines
Norm of error < 1.e-12, Iterations 5
gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
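For reference, this is the minimal check I would use to see which printing
routine runs on which ranks (my own sketch against petsc-3.1, not taken from
the tutorials; the file name printonce.c is just mine). As far as I
understand, PetscPrintf on PETSC_COMM_WORLD prints from rank 0 only, so its
line should appear exactly once per run regardless of -n, while
PetscSynchronizedPrintf should print once per rank:

  /* printonce.c: check which printing routine runs on which ranks.
     Build and run against the same PETSc/MPI installation as ex23. */
  static char help[] = "Prints once from rank 0, then once per rank.\n";

  #include "petscsys.h"

  int main(int argc,char **args)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank,size;

    ierr = PetscInitialize(&argc,&args,(char *)0,help);CHKERRQ(ierr);
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
    ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);

    /* prints from rank 0 only: should appear exactly once per run */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"communicator size = %d\n",size);CHKERRQ(ierr);

    /* prints once per rank, in rank order */
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"hello from rank %d\n",rank);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr);

    ierr = PetscFinalize();CHKERRQ(ierr);
    return 0;
  }

With mpiexec -n 2 I would expect one "communicator size = 2" line and two
"hello from rank" lines, which is not what I see with ex23 above.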




(2)

Also, I was told yesterday on the petsc-users mailing list that the MATLAB
m-file PetscBinaryWrite.m converts a sparse MATLAB matrix into PETSc binary
format. The comments near the top of the code (quoted below) say that it
only works for square sparse matrices, yet it seems to work quite well for
rectangular sparse MATLAB matrices too. I have tested this in conjunction
with PetscBinaryRead.m, which reads a PETSc binary file back into MATLAB as
a sparse matrix.
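Concretely, the round trip I ran looks like this (a sketch of my test; it
assumes PetscBinaryWrite.m and PetscBinaryRead.m from bin/matlab are on the
MATLAB path, and rect.bin is just a scratch file name):

  % rectangular sparse round trip through the PETSc binary format
  A = sprand(10,7,0.3);             % 10x7 rectangular sparse matrix
  PetscBinaryWrite('rect.bin',A);   % write it as a PETSc binary file
  B = PetscBinaryRead('rect.bin');  % read it back as a MATLAB sparse matrix
  disp(full(max(max(abs(A-B)))))    % 0 if the round trip is exact

In every rectangular case I tried, the matrix read back matched the
original, which is why the "square only" comment surprised me.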

Is there something I might have missed, or some error I might be making?

The relevant comments in PetscBinaryWrite.m:

  %  Writes in PETSc binary file sparse matrices and vectors
  %  if the array is multidimensional and dense it is saved
  %  as a one dimensional array
  %
  %  Only works for square sparse matrices
  ...