So hydra starts automatically when I use mpiexec, right? Meaning I don't have to manually type "hydra &" at the terminal? <br><br>Thanks,<br><br>gaurish <br><br><div class="gmail_quote">On Mon, Jan 17, 2011 at 7:36 PM, Gaurish Telang <span dir="ltr"><<a href="mailto:gaurish108@gmail.com">gaurish108@gmail.com</a>></span> wrote:<br>
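(For readers of the archive: a minimal sketch of one way to probe this question, assuming a POSIX shell. The fallback paths below are placeholders, not values from this thread, and the `-info` behavior is an assumption about MPICH2's Hydra mpiexec.)

```shell
# Hedged sketch: a Hydra-based mpiexec needs no daemon started by hand
# ("mpd &" or "hydra &"); it launches its own proxies. One way to probe
# a given mpiexec is its -info flag, which Hydra understands.
# The default paths here are placeholders, not values from this thread.
MPIEXEC="${PETSC_DIR:-/path/to/petsc}/${PETSC_ARCH:-linux-gnu-c-debug}/bin/mpiexec"

if "$MPIEXEC" -info >/dev/null 2>&1; then
    echo "mpiexec answered -info (Hydra-style): no daemon needed"
else
    echo "mpiexec missing or did not answer -info (possibly MPD-based)"
fi
```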
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">Thank you, that seems to have worked! Just to confirm, I have posted the output at the end of this message. I hope this is how the generic output should look.<br>
<br>I still have a few questions, though.<br><br>
(1)<br><br>So all I need to do is specify the location of the correct mpiexec executable, which is $PETSC_DIR/$PETSC_ARCH/bin/mpiexec, <br>while running the program, right? <br><br>The contents of my $PETSC_DIR/$PETSC_ARCH/bin are <br>
<br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/bin$ ls<br>mpicc mpich2version mpiexec mpif77 mpif90 parkill<br><br><br>(2)<br>Do I need to make any changes in the makefiles of the PETSc programs I have written, and hence recompile my codes using the "new" mpiexec? <br>
<br>I mean, since $PETSC_DIR/$PETSC_ARCH/bin/ also contains mpicc (as seen above), I want to be sure that the correct mpicc is being used during compilation. <br><br>(3) Should I run the mpd daemon before using mpiexec? The MPICH2 that I had installed prior to PETSc required me to type "mpd &"<br>
before program execution.<br><br>But it seems that for my PETSc mpiexec I don't need mpd. Should I type it in anyway? I am not sure whether this affects program performance. <br><br><br>Sincere thanks,<br><br>Gaurish<br><br><br>
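(For future readers, regarding (1) and (2): a minimal sketch, assuming a POSIX shell, of checking which mpiexec and mpicc the shell actually resolves, so the PETSc-built wrappers are the ones used. The fallback paths are placeholders, not values from this thread.)

```shell
# Sketch: put the PETSc-built MPI wrappers first on PATH so that a bare
# "mpiexec"/"mpicc" resolves to $PETSC_DIR/$PETSC_ARCH/bin rather than
# a system OpenMPI/MPICH. The fallback values are placeholders.
export PATH="${PETSC_DIR:-/path/to/petsc}/${PETSC_ARCH:-linux-gnu-c-debug}/bin:$PATH"

# 'command -v' prints the path the shell would actually execute:
resolved_mpiexec=$(command -v mpiexec || true)
resolved_mpicc=$(command -v mpicc || true)
echo "mpiexec -> ${resolved_mpiexec:-<not found>}"
echo "mpicc   -> ${resolved_mpicc:-<not found>}"
```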
gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info<br>[0] PetscInitialize(): PETSc successfully started: number of processors = 2<br>
[1] PetscInitialize(): PETSc successfully started: number of processors = 2<br>[1] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[1] PetscInitialize(): Running on machine: gaurish108-laptop<div class="im">
<br>
[0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[0] PetscInitialize(): Running on machine: gaurish108-laptop<br>[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>
</div>
[1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>[1] PetscCommDuplicate(): returning tag 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483647<div class="im">
<br>[0] PetscCommDuplicate(): returning tag 2147483642<br></div>
[1] PetscCommDuplicate(): returning tag 2147483642<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483637<br></div>[1] PetscCommDuplicate(): returning tag 2147483637<div class="im"><br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
</div>
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483632<br>[1] PetscCommDuplicate(): returning tag 2147483632<div class="im"><br>[0] MatSetUpPreallocation(): Warning not preallocating matrix storage<br>
</div>
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483647<br></div>[1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647<br>
[1] PetscCommDuplicate(): returning tag 2147483647<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483646<br></div>
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>
[1] PetscCommDuplicate(): returning tag 2147483646<br>[0] MatStashScatterBegin_Private(): No of messages: 0 <br>[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used<div class="im">
<br>
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br></div>[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used<br>
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines<br>
[0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483645<br>
</div>
[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>[1] PetscCommDuplicate(): returning tag 2147483645<br>[1] PetscCommDuplicate(): returning tag 2147483628<br>
[0] PetscCommDuplicate(): returning tag 2147483628<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483644<br></div>
[0] PetscCommDuplicate(): returning tag 2147483627<br>
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>[1] PetscCommDuplicate(): returning tag 2147483644<br>[1] PetscCommDuplicate(): returning tag 2147483627<br>[1] PetscCommDuplicate(): returning tag 2147483622<br>
[0] PetscCommDuplicate(): returning tag 2147483622<br>[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[0] VecScatterCreate(): General case: MPI to Seq<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used<div class="im">
<br>
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br></div>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1<br>[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.<div class="im">
<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br></div>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1<br>[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
[1] PetscCommDuplicate(): returning tag 2147483618<br>[0] PetscCommDuplicate(): returning tag 2147483618<br>[0] PetscCommDuplicate(): returning tag 2147483617<br>[1] PetscCommDuplicate(): returning tag 2147483617<br>
[1] PetscCommDuplicate(): returning tag 2147483616<br>[0] PetscCommDuplicate(): returning tag 2147483616<br>[0] PetscCommDuplicate(): returning tag 2147483611<br>[1] PetscCommDuplicate(): returning tag 2147483611<br>
[1] PetscCommDuplicate(): returning tag 2147483606<br>[0] PetscCommDuplicate(): returning tag 2147483606<br>[0] PetscCommDuplicate(): returning tag 2147483601<br>[1] PetscCommDuplicate(): returning tag 2147483601<br>
[1] PetscCommDuplicate(): returning tag 2147483596<br>[0] PetscCommDuplicate(): returning tag 2147483596<div class="im"><br>[0] PCSetUp(): Setting up new PC<br></div>[1] PetscCommDuplicate(): returning tag 2147483591<br>
[0] PetscCommDuplicate(): returning tag 2147483591<br>
[0] PetscCommDuplicate(): returning tag 2147483586<br>[1] PetscCommDuplicate(): returning tag 2147483586<br>[0] PetscCommDuplicate(): returning tag 2147483581<br>[1] PetscCommDuplicate(): returning tag 2147483581<br>
[0] PetscCommDuplicate(): returning tag 2147483576<br>[1] PetscCommDuplicate(): returning tag 2147483576<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483571<br></div>[1] PetscCommDuplicate(): returning tag 2147483571<br>
[1] PetscCommDuplicate(): returning tag 2147483566<br>[0] PetscCommDuplicate(): returning tag 2147483566<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483561<br></div>[1] PetscCommDuplicate(): returning tag 2147483561<br>
[0] PetscCommDuplicate(): returning tag 2147483556<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483551<br></div><div class="im">[0] PetscCommDuplicate(): returning tag 2147483546<br></div>[0] PetscCommDuplicate(): returning tag 2147483541<br>
[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5<div class="im"><br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
</div>
[0] PetscCommDuplicate(): returning tag 2147483536<div class="im"><br>KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> GMRES: happy breakdown tolerance 1e-30<br>
maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>PC Object:<br> type: jacobi<br>
linear system matrix = precond matrix:<br> Matrix Object:<br></div> type=mpiaij, rows=10, cols=10<br> tot[1] PetscCommDuplicate(): returning tag 2147483556<br>[1] PetscCommDuplicate(): returning tag 2147483551<br>
[1] PetscCommDuplicate(): returning tag 2147483546<br>
[1] PetscCommDuplicate(): returning tag 2147483541<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[1] PetscCommDuplicate(): returning tag 2147483536<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689<br>
[1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783<br>[1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783<br>
al: nonzeros=28, allocated nonzeros=70<br> not using I-node (on process 0) routines<div class="im"><br>Norm of error < 1.e-12, Iterations 5<br></div>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689<br>
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783<br>
[0] PetscFinalize(): PetscFinalize() called<br>[1] PetscFinalize(): PetscFinalize() called<br>[1] PetscCommDuplicate(): returning tag 2147483535<br>[1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>
[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>[1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483535<div class="im">
<br>[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br></div>[1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>
[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>[1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<div class="im"><br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ <br>
</div><div class="im">
</div>
gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ clear<br><br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpich2version -n 2 ./ex23 -info<br>
Unrecognized argument -n<br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info<br>[0] PetscInitialize(): PETSc successfully started: number of processors = 2<br>
[1] PetscInitialize(): PETSc successfully started: number of processors = 2<div class="im"><br>[0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[0] PetscInitialize(): Running on machine: gaurish108-laptop<br>
</div>
[1] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[1] PetscInitialize(): Running on machine: gaurish108-laptop<div class="im"><br>[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>
</div>
[1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>[1] PetscCommDuplicate(): returning tag 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483647<div class="im">
<br>[0] PetscCommDuplicate(): returning tag 2147483642<br></div><div class="im">
[0] PetscCommDuplicate(): returning tag 2147483637<br></div>[1] PetscCommDuplicate(): returning tag 2147483642<br>[1] PetscCommDuplicate(): returning tag 2147483637<div class="im"><br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
</div>
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[1] PetscCommDuplicate(): returning tag 2147483632<br>[0] PetscCommDuplicate(): returning tag 2147483632<div class="im"><br>[0] MatSetUpPreallocation(): Warning not preallocating matrix storage<br>
</div>
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483647<br></div>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im">
<br>
[0] PetscCommDuplicate(): returning tag 2147483646<br></div>[1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647<br>[1] PetscCommDuplicate(): returning tag 2147483647<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>
[1] PetscCommDuplicate(): returning tag 2147483646<br>[0] MatStashScatterBegin_Private(): No of messages: 0 <br>[0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used<div class="im">
<br>
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br></div>[0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483645<br></div>[0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter<div class="im">
<br>[0] PetscCommDuplicate(): returning tag 2147483628<br></div>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483644<br></div><div class="im">[0] PetscCommDuplicate(): returning tag 2147483627<br>
</div>[0] PetscCommDuplicate(): returning tag 2147483622<br>
[1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>
[1] PetscCommDuplicate(): returning tag 2147483645<br>[1] PetscCommDuplicate(): returning tag 2147483628<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783<br>[1] PetscCommDuplicate(): returning tag 2147483644<br>
[1] PetscCommDuplicate(): returning tag 2147483627<br>[1] PetscCommDuplicate(): returning tag 2147483622<br>[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>[0] VecScatterCreate(): General case: MPI to Seq<br>
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used<div class="im"><br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br></div>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1<br>
[0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.<br>[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used<br>[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1<br>[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines.<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
[1] PetscCommDuplicate(): returning tag 2147483618<div class="im"><br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br></div>[0] PetscCommDuplicate(): returning tag 2147483618<br>
[0] PetscCommDuplicate(): returning tag 2147483617<br>
[0] PetscCommDuplicate(): returning tag 2147483616<br>[1] PetscCommDuplicate(): returning tag 2147483617<br>[1] PetscCommDuplicate(): returning tag 2147483616<br>[1] PetscCommDuplicate(): returning tag 2147483611<br>
[1] PetscCommDuplicate(): returning tag 2147483606<br>[0] PetscCommDuplicate(): returning tag 2147483611<br>[0] PetscCommDuplicate(): returning tag 2147483606<br>[0] PetscCommDuplicate(): returning tag 2147483601<br>
[0] PetscCommDuplicate(): returning tag 2147483596<div class="im"><br>[0] PCSetUp(): Setting up new PC<br></div>[1] PetscCommDuplicate(): returning tag 2147483601<br>[1] PetscCommDuplicate(): returning tag 2147483596<br>
[1] PetscCommDuplicate(): returning tag 2147483591<br>
[0] PetscCommDuplicate(): returning tag 2147483591<br>[0] PetscCommDuplicate(): returning tag 2147483586<br>[0] PetscCommDuplicate(): returning tag 2147483581<br>[0] PetscCommDuplicate(): returning tag 2147483576<br>
[1] PetscCommDuplicate(): returning tag 2147483586<br>[1] PetscCommDuplicate(): returning tag 2147483581<br>[1] PetscCommDuplicate(): returning tag 2147483576<br>[1] PetscCommDuplicate(): returning tag 2147483571<br>
[0] PetscCommDuplicate(): returning tag 2147483571<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483566<br></div><div class="im">[0] PetscCommDuplicate(): returning tag 2147483561<br></div>[1] PetscCommDuplicate(): returning tag 2147483566<br>
[1] PetscCommDuplicate(): returning tag 2147483561<br>[1] PetscCommDuplicate(): returning tag 2147483556<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483556<br></div><div class="im">[0] PetscCommDuplicate(): returning tag 2147483551<br>
</div>
[1] PetscCommDuplicate(): returning tag 2147483551<br>[1] PetscCommDuplicate(): returning tag 2147483546<div class="im"><br>[0] PetscCommDuplicate(): returning tag 2147483546<br></div>[0] PetscCommDuplicate(): returning tag 2147483541<br>
[1] PetscCommDuplicate(): returning tag 2147483541<br>[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5<div class="im">
<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br></div>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483536<div class="im">
<br>
KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br>
Matrix Object:<br></div> type=mpiaij, rows=10, cols=10<br>[1] PetscCommDuplicate(): returning tag 2147483536<br> total: nonzeros=28, allocated nonzeros=70<br> not using I-node (on process 0) routines<div class="im">
<br>Norm of error < 1.e-12, Iterations 5<br></div>
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689<br>[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783<br>
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783<br>[0] PetscFinalize(): PetscFinalize() called<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689<br>
[1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783<br>[1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783<br>
[1] PetscFinalize(): PetscFinalize() called<br>[1] PetscCommDuplicate(): returning tag 2147483535<br>[1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>
[1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>[1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>
[1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483535<div class="im"><br>[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>
[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ <br>
<br></div><div><div></div><div class="h5"><div class="gmail_quote">On Mon, Jan 17, 2011 at 6:20 PM, Gaurish Telang <span dir="ltr"><<a href="mailto:gaurish108@gmail.com" target="_blank">gaurish108@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">This is what I get on running mpiexec -n 2 ./ex23 -info<br><br>Also, using mpirun in place of mpiexec with the -info option, I get exactly the same output you see below. <br>
<br>As for the MPI implementation I am using: I have both OpenMPI and MPICH installed on my laptop. <br><br>While installing PETSc, some external packages were required. In the external packages folder I can see the following software:<br>
<br>fblaslapack-3.1.1 mpich2-1.0.8 ParMetis-dev-p3 SuperLU_DIST_2.4-hg-v2<br><br>Possibly it is this mpich2 that should be used? <br>Please let me know what I should do. I am quite new to PETSc. <br><br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23 -info<br>
[0] PetscInitialize(): PETSc successfully started: number of processors = 1<br>[0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[0] PetscInitialize(): Running on machine: gaurish108-laptop<br>
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483646<br>[0] PetscCommDuplicate(): returning tag 2147483645<br>
[0] PetscInitialize(): PETSc successfully started: number of processors = 1<br>[0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none)<br>[0] PetscInitialize(): Running on machine: gaurish108-laptop<br>
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483647<br>[0] PetscCommDuplicate(): returning tag 2147483646<br>[0] PetscCommDuplicate(): returning tag 2147483645<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483644<br>[0] MatSetUpPreallocation(): Warning not preallocating matrix storage<br>[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 unneeded,28 used<br>
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483643<br>[0] PetscCommDuplicate(): returning tag 2147483642<br>[0] PetscCommDuplicate(): returning tag 2147483641<br>
[0] PetscCommDuplicate(): returning tag 2147483640<br>[0] PetscCommDuplicate(): returning tag 2147483639<br>[0] PetscCommDuplicate(): returning tag 2147483638<br>[0] PetscCommDuplicate(): returning tag 2147483637<br>
[0] PCSetUp(): Setting up new PC<br>[0] PetscCommDuplicate(): returning tag 2147483636<br>[0] PetscCommDuplicate(): returning tag 2147483635<br>[0] PetscCommDuplicate(): returning tag 2147483634<br>[0] PetscCommDuplicate(): returning tag 2147483633<br>
[0] PetscCommDuplicate(): returning tag 2147483632<br>[0] PetscCommDuplicate(): returning tag 2147483631<br>[0] PetscCommDuplicate(): returning tag 2147483630<br>[0] PetscCommDuplicate(): returning tag 2147483629<br>
[0] PetscCommDuplicate(): returning tag 2147483628<br>[0] PetscCommDuplicate(): returning tag 2147483627<br>[0] PetscCommDuplicate(): returning tag 2147483626<br>[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483625<div><br>KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>
PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br> Matrix Object:<br> type=seqaij, rows=10, cols=10<br> total: nonzeros=28, allocated nonzeros=50<br> not using I-node routines<br>Norm of error < 1.e-12, Iterations 5<br>
</div>
[0] PetscFinalize(): PetscFinalize() called<br>[0] PetscCommDuplicate(): returning tag 2147483624<br>[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483644<br>[0] MatSetUpPreallocation(): Warning not preallocating matrix storage<br>
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 unneeded,28 used<br>[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0<br>[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>
[0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode routines<br>[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483643<br>
[0] PetscCommDuplicate(): returning tag 2147483642<br>[0] PetscCommDuplicate(): returning tag 2147483641<br>[0] PetscCommDuplicate(): returning tag 2147483640<br>[0] PetscCommDuplicate(): returning tag 2147483639<br>
[0] PetscCommDuplicate(): returning tag 2147483638<br>[0] PetscCommDuplicate(): returning tag 2147483637<br>[0] PCSetUp(): Setting up new PC<br>[0] PetscCommDuplicate(): returning tag 2147483636<br>[0] PetscCommDuplicate(): returning tag 2147483635<br>
[0] PetscCommDuplicate(): returning tag 2147483634<br>[0] PetscCommDuplicate(): returning tag 2147483633<br>[0] PetscCommDuplicate(): returning tag 2147483632<br>[0] PetscCommDuplicate(): returning tag 2147483631<br>
[0] PetscCommDuplicate(): returning tag 2147483630<br>[0] PetscCommDuplicate(): returning tag 2147483629<br>[0] PetscCommDuplicate(): returning tag 2147483628<br>[0] PetscCommDuplicate(): returning tag 2147483627<br>
[0] PetscCommDuplicate(): returning tag 2147483626<br>[0] KSPDefaultConverged(): Linear solver has converged. Residual norm 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>[0] PetscCommDuplicate(): returning tag 2147483625<div><br>KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>
PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br> Matrix Object:<br> type=seqaij, rows=10, cols=10<br> total: nonzeros=28, allocated nonzeros=50<br> not using I-node routines<br>Norm of error < 1.e-12, Iterations 5<br>
</div>
[0] PetscFinalize(): PetscFinalize() called<br>[0] PetscCommDuplicate(): returning tag 2147483624<br>[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688<br>
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784<br>[0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784<br>[0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784<br>
[0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784<div><br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ <br><br></div><div>
<div></div><div><div class="gmail_quote">On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang <span dir="ltr"><<a href="mailto:gaurish108@gmail.com" target="_blank">gaurish108@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">Hi,<br><br>I have two questions.<br><br>(1)<br><br>I was curious why the following happens with the PETSc standard output. Having created the executable 'test', when I run it with mpiexec -n 2 ./test,<br>
the same output is printed to the terminal twice. If I use 3 processors, the same output is printed three times.<br><br>In short, the number of processors equals the number of times the PETSc output is printed. Could this be a mistake in my PETSc installation?<br>
<br>For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c. Creating the executable ex23 and running it first on one and then on two processors gives the following terminal output:<br><br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 1 ./ex23<br>
KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br>
Matrix Object:<br> type=seqaij, rows=10, cols=10<br> total: nonzeros=28, allocated nonzeros=50<br> not using I-node routines<br>Norm of error < 1.e-12, Iterations 5<br>gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23<br>
KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br>
Matrix Object:<br> type=seqaij, rows=10, cols=10<br> total: nonzeros=28, allocated nonzeros=50<br> not using I-node routines<br>Norm of error < 1.e-12, Iterations 5<br>KSP Object:<br> type: gmres<br> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-07, absolute=1e-50, divergence=10000<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>
PC Object:<br> type: jacobi<br> linear system matrix = precond matrix:<br> Matrix Object:<br> type=seqaij, rows=10, cols=10<br> total: nonzeros=28, allocated nonzeros=50<br> not using I-node routines<br>Norm of error < 1.e-12, Iterations 5<br>
gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ <br><br><br><br>(2)<br><br>Also, I was told yesterday on the PETSc users mailing list that the MATLAB m-file PetscBinaryWrite.m converts a sparse MATLAB matrix into PETSc binary format.<br>
The comments in the code near the heading say that it works only for square sparse matrices, but it seems to work quite well for rectangular sparse MATLAB matrices as well.<br>I have tested this in conjunction with PetscBinaryRead.m, which reads a PETSc binary file back into MATLAB as a sparse matrix.<br>
<br>Is there something I might have missed, or some error I might be making?<br><br>Comments in PetscBinaryWrite.m:<br>%================================================<br>% Writes in PETSc binary file sparse matrices and vectors<br>% if the array is multidimensional and dense it is saved<br>% as a one dimensional array<br>%<br>% Only works for square sparse matrices<br>...<br><br><br> <br>
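For what it's worth, the on-disk layout itself does not seem to require a square matrix, which would explain why rectangular matrices round-trip fine. Below is a minimal Python sketch of writing a rectangular sparse matrix in what I understand to be PETSc's binary AIJ layout (big-endian int32 header and indices, big-endian float64 values, class ID 1211216 from petscmat.h); the function name write_petsc_aij and the file name rect.petsc are just made up for illustration, and this is my reading of the format, not an authoritative spec:

```python
import struct

MAT_FILE_CLASSID = 1211216  # PETSc's matrix file marker (from petscmat.h)

def write_petsc_aij(path, nrows, ncols, triples):
    """Write a sparse matrix in (what I believe is) PETSc binary AIJ format.

    triples: list of (row, col, value), 0-based indices.
    Layout: int32 header [classid, nrows, ncols, nnz], then nonzeros
    per row, then column indices, then float64 values -- all big-endian.
    """
    entries = sorted(triples)            # row-major order, as AIJ expects
    nnz = len(entries)
    row_counts = [0] * nrows
    for r, _, _ in entries:
        row_counts[r] += 1
    with open(path, "wb") as f:
        f.write(struct.pack(">4i", MAT_FILE_CLASSID, nrows, ncols, nnz))
        f.write(struct.pack(">%di" % nrows, *row_counts))
        f.write(struct.pack(">%di" % nnz, *[c for _, c, _ in entries]))
        f.write(struct.pack(">%dd" % nnz, *[v for _, _, v in entries]))

# A 2x3 (rectangular) matrix: note nothing above forces nrows == ncols.
write_petsc_aij("rect.petsc", 2, 3, [(0, 0, 1.0), (0, 2, 2.5), (1, 1, -3.0)])
```

Since the header stores rows and columns as two independent integers, the "square only" comment in PetscBinaryWrite.m may simply be out of date.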
</blockquote></div><br>
</div></div></blockquote></div><br>
</div></div></blockquote></div><br>