[petsc-users] Regarding printing to standard output, and a possible mistake in the code comments in PetscBinaryWrite.m

Gaurish Telang gaurish108 at gmail.com
Mon Jan 17 18:47:30 CST 2011


So hydra starts automatically when I use mpiexec, right? Meaning I don't
have to manually enter "hydra &" at the terminal?

Thanks,

gaurish

On Mon, Jan 17, 2011 at 7:36 PM, Gaurish Telang <gaurish108 at gmail.com> wrote:

> Thank you, that seems to have worked! Just to confirm, I have posted the
> output at the end of this message. I hope this is how the generic output
> should look.
>
> I still have a few questions, though.
>
> (1)
>
> So all I need to do is specify the location of the correct mpiexec
> executable, which is $PETSC_DIR/$PETSC_ARCH/bin/mpiexec, while running the
> program, right?
>
> The contents of my $PETSC_DIR/$PETSC_ARCH/bin are
>
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/bin$
> ls
> mpicc  mpich2version  mpiexec  mpif77  mpif90  parkill
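>
> In other words, something like this (just a sketch, assuming PETSC_DIR and
> PETSC_ARCH are exported as in the sessions below) is what I have in mind:
>
>   export PATH=$PETSC_DIR/$PETSC_ARCH/bin:$PATH   # put PETSc's own MPI tools first on the PATH
>   which mpiexec                                  # should now resolve to .../linux-gnu-c-debug/bin/mpiexec
>   mpiexec -n 2 ./ex23 -info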
>
>
> (2)
> Do I need to make any changes in the makefiles of the PETSc programs that I
> have written, and hence recompile my codes using the "new" mpiexec?
>
> I mean, since  $PETSC_DIR/$PETSC_ARCH/bin/   also contains mpicc (as seen
> above), I want to be sure that the correct mpicc is being used during
> execution.
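>
> As a rough check (a sketch only, assuming these are the usual MPICH-style
> wrapper scripts), I was planning to do:
>
>   $PETSC_DIR/$PETSC_ARCH/bin/mpicc -show   # the MPICH wrappers print the underlying compile command
>   make ex23                                # rebuild, so the executable is linked against the same MPI as the launcher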
>
> (3) Should I run the mpd daemon before using mpiexec? The MPICH2 that
> I had installed prior to my PETSc required me to type "mpd &"
> before program execution.
>
> But it seems that for my PETSc mpiexec I don't need mpd. Should I type it in
> anyway? I am not sure whether this affects program performance.
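>
> The check I have in mind (again just a sketch) is to run a trivial command
> without starting any daemon first:
>
>   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 hostname
>   # if this fails with an mpd-related error, I would start the daemon first with:  mpd &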
>
>
> Sincere thanks,
>
> Gaurish
>
>
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info
> [0] PetscInitialize(): PETSc successfully started: number of processors = 2
> [1] PetscInitialize(): PETSc successfully started: number of processors = 2
> [1] PetscGetHostName(): Rejecting domainname, likely is NIS
> gaurish108-laptop.(none)
> [1] PetscInitialize(): Running on machine: gaurish108-laptop
>
> [0] PetscGetHostName(): Rejecting domainname, likely is NIS
> gaurish108-laptop.(none)
> [0] PetscInitialize(): Running on machine: gaurish108-laptop
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
> max tags = 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
> max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
>
> [0] PetscCommDuplicate():   returning tag 2147483642
> [1] PetscCommDuplicate():   returning tag 2147483642
>
> [0] PetscCommDuplicate():   returning tag 2147483637
> [1] PetscCommDuplicate():   returning tag 2147483637
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483632
> [1] PetscCommDuplicate():   returning tag 2147483632
>
> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783
> max tags = 2147483647
>
> [0] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783
> max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483646
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] MatStashScatterBegin_Private(): No of messages: 0
> [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12
> unneeded,13 used
>
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
> [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12
> unneeded,13 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
> [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines
> [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483645
> [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483645
> [1] PetscCommDuplicate():   returning tag 2147483628
> [0] PetscCommDuplicate():   returning tag 2147483628
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483644
> [0] PetscCommDuplicate():   returning tag 2147483627
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483644
> [1] PetscCommDuplicate():   returning tag 2147483627
> [1] PetscCommDuplicate():   returning tag 2147483622
> [0] PetscCommDuplicate():   returning tag 2147483622
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9
> unneeded,1 used
>
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
> [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows
> 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9
> unneeded,1 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows
> 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483618
> [0] PetscCommDuplicate():   returning tag 2147483618
> [0] PetscCommDuplicate():   returning tag 2147483617
> [1] PetscCommDuplicate():   returning tag 2147483617
> [1] PetscCommDuplicate():   returning tag 2147483616
> [0] PetscCommDuplicate():   returning tag 2147483616
> [0] PetscCommDuplicate():   returning tag 2147483611
> [1] PetscCommDuplicate():   returning tag 2147483611
> [1] PetscCommDuplicate():   returning tag 2147483606
> [0] PetscCommDuplicate():   returning tag 2147483606
> [0] PetscCommDuplicate():   returning tag 2147483601
> [1] PetscCommDuplicate():   returning tag 2147483601
> [1] PetscCommDuplicate():   returning tag 2147483596
> [0] PetscCommDuplicate():   returning tag 2147483596
>
> [0] PCSetUp(): Setting up new PC
> [1] PetscCommDuplicate():   returning tag 2147483591
> [0] PetscCommDuplicate():   returning tag 2147483591
> [0] PetscCommDuplicate():   returning tag 2147483586
> [1] PetscCommDuplicate():   returning tag 2147483586
> [0] PetscCommDuplicate():   returning tag 2147483581
> [1] PetscCommDuplicate():   returning tag 2147483581
> [0] PetscCommDuplicate():   returning tag 2147483576
> [1] PetscCommDuplicate():   returning tag 2147483576
>
> [0] PetscCommDuplicate():   returning tag 2147483571
> [1] PetscCommDuplicate():   returning tag 2147483571
> [1] PetscCommDuplicate():   returning tag 2147483566
> [0] PetscCommDuplicate():   returning tag 2147483566
>
> [0] PetscCommDuplicate():   returning tag 2147483561
> [1] PetscCommDuplicate():   returning tag 2147483561
> [0] PetscCommDuplicate():   returning tag 2147483556
>
> [0] PetscCommDuplicate():   returning tag 2147483551
> [0] PetscCommDuplicate():   returning tag 2147483546
> [0] PetscCommDuplicate():   returning tag 2147483541
> [0] KSPDefaultConverged(): Linear solver has converged. Residual norm
> 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand
> side norm 0.707107 at iteration 5
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483536
>
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=mpiaij, rows=10, cols=10
>     tot[1] PetscCommDuplicate():   returning tag 2147483556
> [1] PetscCommDuplicate():   returning tag 2147483551
> [1] PetscCommDuplicate():   returning tag 2147483546
> [1] PetscCommDuplicate():   returning tag 2147483541
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483536
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850689
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374783
> al: nonzeros=28, allocated nonzeros=70
>       not using I-node (on process 0) routines
>
> Norm of error < 1.e-12, Iterations 5
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850689
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374783
> [0] PetscFinalize(): PetscFinalize() called
> [1] PetscFinalize(): PetscFinalize() called
> [1] PetscCommDuplicate():   returning tag 2147483535
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850688
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483535
>
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850688
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374784
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374784
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
>
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> clear
>
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> $PETSC_DIR/$PETSC_ARCH/bin/mpich2version -n 2 ./ex23 -info
> Unrecognized argument -n
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info
> [0] PetscInitialize(): PETSc successfully started: number of processors = 2
> [1] PetscInitialize(): PETSc successfully started: number of processors = 2
>
> [0] PetscGetHostName(): Rejecting domainname, likely is NIS
> gaurish108-laptop.(none)
> [0] PetscInitialize(): Running on machine: gaurish108-laptop
> [1] PetscGetHostName(): Rejecting domainname, likely is NIS
> gaurish108-laptop.(none)
> [1] PetscInitialize(): Running on machine: gaurish108-laptop
>
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
> max tags = 2147483647
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784
> max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate():   returning tag 2147483647
>
> [0] PetscCommDuplicate():   returning tag 2147483642
> [0] PetscCommDuplicate():   returning tag 2147483637
> [1] PetscCommDuplicate():   returning tag 2147483642
> [1] PetscCommDuplicate():   returning tag 2147483637
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483632
> [0] PetscCommDuplicate():   returning tag 2147483632
>
> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783
> max tags = 2147483647
>
> [0] PetscCommDuplicate():   returning tag 2147483647
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483646
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783
> max tags = 2147483647
> [1] PetscCommDuplicate():   returning tag 2147483647
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483646
> [0] MatStashScatterBegin_Private(): No of messages: 0
> [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12
> unneeded,13 used
>
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
> [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483645
> [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter
>
> [0] PetscCommDuplicate():   returning tag 2147483628
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
>
> [0] PetscCommDuplicate():   returning tag 2147483644
> [0] PetscCommDuplicate():   returning tag 2147483627
> [0] PetscCommDuplicate():   returning tag 2147483622
> [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12
> unneeded,13 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
> [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483645
> [1] PetscCommDuplicate():   returning tag 2147483628
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
> -2080374783
> [1] PetscCommDuplicate():   returning tag 2147483644
> [1] PetscCommDuplicate():   returning tag 2147483627
> [1] PetscCommDuplicate():   returning tag 2147483622
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9
> unneeded,1 used
>
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
> [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows
> 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9
> unneeded,1 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows
> 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate():   returning tag 2147483618
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483618
> [0] PetscCommDuplicate():   returning tag 2147483617
> [0] PetscCommDuplicate():   returning tag 2147483616
> [1] PetscCommDuplicate():   returning tag 2147483617
> [1] PetscCommDuplicate():   returning tag 2147483616
> [1] PetscCommDuplicate():   returning tag 2147483611
> [1] PetscCommDuplicate():   returning tag 2147483606
> [0] PetscCommDuplicate():   returning tag 2147483611
> [0] PetscCommDuplicate():   returning tag 2147483606
> [0] PetscCommDuplicate():   returning tag 2147483601
> [0] PetscCommDuplicate():   returning tag 2147483596
>
> [0] PCSetUp(): Setting up new PC
> [1] PetscCommDuplicate():   returning tag 2147483601
> [1] PetscCommDuplicate():   returning tag 2147483596
> [1] PetscCommDuplicate():   returning tag 2147483591
> [0] PetscCommDuplicate():   returning tag 2147483591
> [0] PetscCommDuplicate():   returning tag 2147483586
> [0] PetscCommDuplicate():   returning tag 2147483581
> [0] PetscCommDuplicate():   returning tag 2147483576
> [1] PetscCommDuplicate():   returning tag 2147483586
> [1] PetscCommDuplicate():   returning tag 2147483581
> [1] PetscCommDuplicate():   returning tag 2147483576
> [1] PetscCommDuplicate():   returning tag 2147483571
> [0] PetscCommDuplicate():   returning tag 2147483571
>
> [0] PetscCommDuplicate():   returning tag 2147483566
> [0] PetscCommDuplicate():   returning tag 2147483561
> [1] PetscCommDuplicate():   returning tag 2147483566
> [1] PetscCommDuplicate():   returning tag 2147483561
> [1] PetscCommDuplicate():   returning tag 2147483556
>
> [0] PetscCommDuplicate():   returning tag 2147483556
> [0] PetscCommDuplicate():   returning tag 2147483551
> [1] PetscCommDuplicate():   returning tag 2147483551
> [1] PetscCommDuplicate():   returning tag 2147483546
>
> [0] PetscCommDuplicate():   returning tag 2147483546
> [0] PetscCommDuplicate():   returning tag 2147483541
> [1] PetscCommDuplicate():   returning tag 2147483541
> [0] KSPDefaultConverged(): Linear solver has converged. Residual norm
> 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand
> side norm 0.707107 at iteration 5
>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
> -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483536
>
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=mpiaij, rows=10, cols=10
> [1] PetscCommDuplicate():   returning tag 2147483536
>     total: nonzeros=28, allocated nonzeros=70
>       not using I-node (on process 0) routines
>
> Norm of error < 1.e-12, Iterations 5
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850689
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374783
> [0] PetscFinalize(): PetscFinalize() called
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850689
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374783
> [1] PetscFinalize(): PetscFinalize() called
> [1] PetscCommDuplicate():   returning tag 2147483535
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850688
> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374784
> [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] PetscCommDuplicate():   returning tag 2147483535
>
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm 1140850688
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
> MPI_Comm -2080374784
> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>
> On Mon, Jan 17, 2011 at 6:20 PM, Gaurish Telang <gaurish108 at gmail.com> wrote:
>
>> This is what I get on running mpiexec -n 2 ./ex23 -info
>>
>> Also, using mpirun in place of mpiexec with the -info option, I get
>> exactly the same output you see below.
>>
>> As for the MPI implementation I am using: I have both OpenMPI and MPICH
>> installed on my laptop.
>>
>> While installing PETSc, some external packages were required. In the
>> external packages folder I can see the following software:
>>
>> fblaslapack-3.1.1  mpich2-1.0.8  ParMetis-dev-p3  SuperLU_DIST_2.4-hg-v2
>>
>> Possibly it is this MPICH2 that should be used?
>> Please let me know what I should do. I am quite new to PETSc.
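>>
>> A simple way to see which installation each command resolves to (nothing
>> PETSc-specific, just the shell) would be:
>>
>>   which mpiexec mpirun mpicc
>>   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 hostname   # the PETSc-built launcher, for comparison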
>>
>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>> mpiexec -n 2 ./ex23 -info
>> [0] PetscInitialize(): PETSc successfully started: number of processors =
>> 1
>> [0] PetscGetHostName(): Rejecting domainname, likely is NIS
>> gaurish108-laptop.(none)
>> [0] PetscInitialize(): Running on machine: gaurish108-laptop
>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688
>> -2080374784 max tags = 2147483647
>> [0] PetscCommDuplicate():   returning tag 2147483647
>> [0] PetscCommDuplicate():   returning tag 2147483646
>> [0] PetscCommDuplicate():   returning tag 2147483645
>> [0] PetscInitialize(): PETSc successfully started: number of processors =
>> 1
>> [0] PetscGetHostName(): Rejecting domainname, likely is NIS
>> gaurish108-laptop.(none)
>> [0] PetscInitialize(): Running on machine: gaurish108-laptop
>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688
>> -2080374784 max tags = 2147483647
>> [0] PetscCommDuplicate():   returning tag 2147483647
>> [0] PetscCommDuplicate():   returning tag 2147483646
>> [0] PetscCommDuplicate():   returning tag 2147483645
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483644
>> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
>> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22
>> unneeded,28 used
>> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
>> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
>> [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode
>> routines
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483643
>> [0] PetscCommDuplicate():   returning tag 2147483642
>> [0] PetscCommDuplicate():   returning tag 2147483641
>> [0] PetscCommDuplicate():   returning tag 2147483640
>> [0] PetscCommDuplicate():   returning tag 2147483639
>> [0] PetscCommDuplicate():   returning tag 2147483638
>> [0] PetscCommDuplicate():   returning tag 2147483637
>> [0] PCSetUp(): Setting up new PC
>> [0] PetscCommDuplicate():   returning tag 2147483636
>> [0] PetscCommDuplicate():   returning tag 2147483635
>> [0] PetscCommDuplicate():   returning tag 2147483634
>> [0] PetscCommDuplicate():   returning tag 2147483633
>> [0] PetscCommDuplicate():   returning tag 2147483632
>> [0] PetscCommDuplicate():   returning tag 2147483631
>> [0] PetscCommDuplicate():   returning tag 2147483630
>> [0] PetscCommDuplicate():   returning tag 2147483629
>> [0] PetscCommDuplicate():   returning tag 2147483628
>> [0] PetscCommDuplicate():   returning tag 2147483627
>> [0] PetscCommDuplicate():   returning tag 2147483626
>> [0] KSPDefaultConverged(): Linear solver has converged. Residual norm
>> 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand
>> side norm 0.707107 at iteration 5
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483625
>>
>> KSP Object:
>>   type: gmres
>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>>     GMRES: happy breakdown tolerance 1e-30
>>   maximum iterations=10000, initial guess is zero
>>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>>   left preconditioning
>>   using PRECONDITIONED norm type for convergence test
>> PC Object:
>>   type: jacobi
>>   linear system matrix = precond matrix:
>>   Matrix Object:
>>     type=seqaij, rows=10, cols=10
>>     total: nonzeros=28, allocated nonzeros=50
>>       not using I-node routines
>> Norm of error < 1.e-12, Iterations 5
>>  [0] PetscFinalize(): PetscFinalize() called
>> [0] PetscCommDuplicate():   returning tag 2147483624
>> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
>> MPI_Comm 1140850688
>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
>> MPI_Comm -2080374784
>> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483644
>> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage
>> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22
>> unneeded,28 used
>> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
>> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
>> [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode
>> routines
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483643
>> [0] PetscCommDuplicate():   returning tag 2147483642
>> [0] PetscCommDuplicate():   returning tag 2147483641
>> [0] PetscCommDuplicate():   returning tag 2147483640
>> [0] PetscCommDuplicate():   returning tag 2147483639
>> [0] PetscCommDuplicate():   returning tag 2147483638
>> [0] PetscCommDuplicate():   returning tag 2147483637
>> [0] PCSetUp(): Setting up new PC
>> [0] PetscCommDuplicate():   returning tag 2147483636
>> [0] PetscCommDuplicate():   returning tag 2147483635
>> [0] PetscCommDuplicate():   returning tag 2147483634
>> [0] PetscCommDuplicate():   returning tag 2147483633
>> [0] PetscCommDuplicate():   returning tag 2147483632
>> [0] PetscCommDuplicate():   returning tag 2147483631
>> [0] PetscCommDuplicate():   returning tag 2147483630
>> [0] PetscCommDuplicate():   returning tag 2147483629
>> [0] PetscCommDuplicate():   returning tag 2147483628
>> [0] PetscCommDuplicate():   returning tag 2147483627
>> [0] PetscCommDuplicate():   returning tag 2147483626
>> [0] KSPDefaultConverged(): Linear solver has converged. Residual norm
>> 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand
>> side norm 0.707107 at iteration 5
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688
>> -2080374784
>> [0] PetscCommDuplicate():   returning tag 2147483625
>>
>> KSP Object:
>>   type: gmres
>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>>     GMRES: happy breakdown tolerance 1e-30
>>   maximum iterations=10000, initial guess is zero
>>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>>   left preconditioning
>>   using PRECONDITIONED norm type for convergence test
>> PC Object:
>>   type: jacobi
>>   linear system matrix = precond matrix:
>>   Matrix Object:
>>     type=seqaij, rows=10, cols=10
>>     total: nonzeros=28, allocated nonzeros=50
>>       not using I-node routines
>> Norm of error < 1.e-12, Iterations 5
>>  [0] PetscFinalize(): PetscFinalize() called
>> [0] PetscCommDuplicate():   returning tag 2147483624
>> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
>> MPI_Comm 1140850688
>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784
>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user
>> MPI_Comm -2080374784
>> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784
>>
>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>>
>>
>> On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang <gaurish108 at gmail.com> wrote:
>>
>>> Hi.
>>>
>>> I had two questions
>>>
>>> (1)
>>>
>>> I was curious to know why the following happens with the PETSc standard
>>> output. Having created the executable 'test', when I try to run it with
>>> mpiexec -n 2 ./test,
>>> the same output is printed to the terminal twice. If I use 3 processors,
>>> then the same output is printed three times.
>>>
>>> In short, the PETSc output is printed once per processor. Could this be a
>>> mistake with my PETSc installation?
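>>>
>>> A quick way to see what each copy thinks (just a sketch of the check I
>>> have in mind) is to grep the -info output for the processor count:
>>>
>>>   mpiexec -n 2 ./ex23 -info 2>&1 | grep "number of processors"
>>>
>>> If each copy reports "number of processors = 1", then every process is
>>> apparently running as its own one-process job.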
>>>
>>> For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c.
>>> After creating the executable ex23, running it with one and then with two
>>> processors gives the following terminal output:
>>>
>>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>>> mpiexec -n 1 ./ex23
>>> KSP Object:
>>>   type: gmres
>>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>     GMRES: happy breakdown tolerance 1e-30
>>>   maximum iterations=10000, initial guess is zero
>>>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>>>   left preconditioning
>>>   using PRECONDITIONED norm type for convergence test
>>> PC Object:
>>>   type: jacobi
>>>   linear system matrix = precond matrix:
>>>   Matrix Object:
>>>     type=seqaij, rows=10, cols=10
>>>     total: nonzeros=28, allocated nonzeros=50
>>>       not using I-node routines
>>> Norm of error < 1.e-12, Iterations 5
>>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>>> mpiexec -n 2 ./ex23
>>> KSP Object:
>>>   type: gmres
>>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>     GMRES: happy breakdown tolerance 1e-30
>>>   maximum iterations=10000, initial guess is zero
>>>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>>>   left preconditioning
>>>   using PRECONDITIONED norm type for convergence test
>>> PC Object:
>>>   type: jacobi
>>>   linear system matrix = precond matrix:
>>>   Matrix Object:
>>>     type=seqaij, rows=10, cols=10
>>>     total: nonzeros=28, allocated nonzeros=50
>>>       not using I-node routines
>>> Norm of error < 1.e-12, Iterations 5
>>> KSP Object:
>>>   type: gmres
>>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>     GMRES: happy breakdown tolerance 1e-30
>>>   maximum iterations=10000, initial guess is zero
>>>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>>>   left preconditioning
>>>   using PRECONDITIONED norm type for convergence test
>>> PC Object:
>>>   type: jacobi
>>>   linear system matrix = precond matrix:
>>>   Matrix Object:
>>>     type=seqaij, rows=10, cols=10
>>>     total: nonzeros=28, allocated nonzeros=50
>>>       not using I-node routines
>>> Norm of error < 1.e-12, Iterations 5
>>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>>>
>>>
>>>
>>>
>>> (2)
>>>
>>> Also, I was told yesterday on the PETSc users mailing list that the MATLAB
>>> m-file PetscBinaryWrite.m converts a sparse matrix in MATLAB into PETSc
>>> binary format.
>>>     The comments near the top of the code say that it only works for square
>>> sparse matrices, but it seems to be working quite well for rectangular
>>> sparse MATLAB matrices also.
>>> I have tested this in conjunction with PetscBinaryRead.m, which
>>> reads a PETSc binary file into MATLAB as a sparse matrix.
>>>
>>> Is there something I might have missed, or some error that I might be
>>> making?
>>>
>>> Comments in PetscBinaryWrite.m
>>> "-================================================
>>> %  Writes in PETSc binary file sparse matrices and vectors
>>> %  if the array is multidimensional and dense it is saved
>>> %  as a one dimensional array
>>> %
>>> %  Only works for square sparse matrices
>>> %:
>>> ..
>>> ..
>>> ..
>>> ..
>>> ..
>>> ..
>>> .
>>> .
>>> .
>>>
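>>>
>>> For what it is worth, the kind of round-trip test I ran was along these
>>> lines (a sketch, assuming PetscBinaryWrite.m and PetscBinaryRead.m are on
>>> the MATLAB path and take a file name as their first argument):
>>>
>>>   matlab -nodisplay -r "A = sprand(4,7,0.5); PetscBinaryWrite('A.dat',A); B = PetscBinaryRead('A.dat'); disp(norm(full(A - B))); exit"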
>>>
>>>
>>>
>>>
>>
>>
>