Hi Barry,

It works now. Thank you so much for the help.

Thanks,
George

On Fri, Aug 31, 2012 at 7:57 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:

Ok, another bug. Put the attached file in src/mat/impls/aij/mpi, run make in that directory, then relink the program and run again.
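Concretely, the steps are roughly as follows (a sketch only: set PETSC_DIR and PETSC_ARCH to match this build, and the relink step depends on how xt2_eos4 is built):

    cd $PETSC_DIR/src/mat/impls/aij/mpi   # put the attached mpiaij.c here, replacing the old one
    make                                  # recompile and update the PETSc library
    # then relink xt2_eos4 against the updated library and rerun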
<br>
Barry<br>
<br>
[see attached file: mpiaij.c]<br>
On Aug 31, 2012, at 5:26 PM, George Pau <gpau@lbl.gov> wrote:
<br>
> hi Barry,<br>
><br>
> The hmpi option is read in properly now. The error is now different when I use -hmpi_spawn_size 3 with mpiexec -n 1. My print statements suggest this is now happening in KSPSolve.
><br>
> George<br>
><br>
><br>
> [0] petscinitialize_(): (Fortran):PETSc successfully started: procs 1<br>
> [0] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> [0] petscinitialize_(): Running on machine: gilbert<br>
> [0] petscinitialize_(): (Fortran):PETSc successfully started: procs 2<br>
> [0] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> [0] petscinitialize_(): Running on machine: gilbert<br>
> [1] petscinitialize_(): (Fortran):PETSc successfully started: procs 2<br>
> [1] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> [1] petscinitialize_(): Running on machine: gilbert<br>
> [0] PetscHMPISpawn(): PETSc HMPI successfully spawned: number of nodes = 1 node size = 3<br>
><br>
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374781 max tags = 2147483647<br>
> [0] MatSetUp(): Warning not preallocating matrix storage<br>
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 360 X 360; storage space: 3978 unneeded,3222 used<br>
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 360<br>
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>
> [0] Mat_CheckInode(): Found 120 nodes of 360. Limit used: 5. Using Inode routines<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374781<br>
> start ksp<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374781<br>
> [0] PCSetUp(): Setting up new PC<br>
> [0] PetscCommDuplicate(): Duplicating a communicator -2080374782 -2080374780 max tags = 2147483647<br>
> [0] PetscCommDuplicate(): Duplicating a communicator -1006632960 -1006632959 max tags = 2147483647<br>
> [1] PetscCommDuplicate(): Duplicating a communicator -2080374779 -2080374778 max tags = 2147483647<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator -2080374782 -2080374780<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator -1006632960 -1006632959<br>
> [1] PetscCommDuplicate(): Using internal PETSc communicator -2080374779 -2080374778<br>
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374779 max tags = 2147483647<br>
> [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -1006632958 max tags = 2147483647<br>
> [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374777 max tags = 2147483647<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374779<br>
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -1006632958<br>
> [0] VecScatterCreate(): Special case: processor zero gets entire parallel vector, rest get none<br>
> [0] Petsc_DelComm(): Removing reference to PETSc communicator imbedded in a user MPI_Comm m -2080374779<br>
> [0] Petsc_DelComm(): User MPI_Comm m 1140850689 is being freed, removing reference from inner PETSc comm to this outer comm<br>
> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374779<br>
> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374779<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator -2080374782 -2080374780<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator -1006632960 -1006632959<br>
> [1] PetscCommDuplicate(): Using internal PETSc communicator -2080374779 -2080374778<br>
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -1006632958<br>
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>
><br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------<br>
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below<br>
> [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------<br>
> [0]PETSC ERROR: likely location of problem given in stack below<br>
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------<br>
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
> [1]PETSC ERROR: INSTEAD the line number of the start of the function<br>
> [1]PETSC ERROR: is given.<br>
> [1]PETSC ERROR: [1] MatDistribute_MPIAIJ line 192 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/mat/impls/aij/mpi/mpiaij.c<br>
> [1]PETSC ERROR: [1] PCSetUp_HMPI_MP line 90 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/impls/openmp/hpc.c<br>
> [1]PETSC ERROR: [1] PetscHMPIHandle line 253 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/sys/objects/mpinit.c<br>
> [1]PETSC ERROR: [1] PetscHMPISpawn line 71 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/sys/objects/mpinit.c<br>
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
> [1]PETSC ERROR: Signal received!<br>
> [1]PETSC ERROR: ------------------------------------------------------------------------<br>
> [1]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012<br>
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
> [1]PETSC ERROR: See docs/index.html for manual pages.<br>
> [1]PETSC ERROR: ------------------------------------------------------------------------<br>
> [1]PETSC ERROR: ../../esd-tough2/xt2_eos4 on a arch-linu named gilbert by gpau Fri Aug 31 15:20:27 2012<br>
> [1]PETSC ERROR: [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
> [0]PETSC ERROR: INSTEAD the line number of the start of the function<br>
> [0]PETSC ERROR: is given.<br>
> Libraries linked from /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/lib<br>
> [1]PETSC ERROR: Configure run at Fri Aug 31 15:16:04 2012<br>
> [1]PETSC ERROR: Configure options --with-debugging=1 --with-mpi-dir=/usr/lib/mpich2 --download-hypre=1 --prefix=/home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib<br>
> [1]PETSC ERROR: ------------------------------------------------------------------------<br>
> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1<br>
> [0]PETSC ERROR: [0] MatDistribute_MPIAIJ line 192 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/mat/impls/aij/mpi/mpiaij.c<br>
> [0]PETSC ERROR: [0] PCSetUp_HMPI_MP line 90 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/impls/openmp/hpc.c<br>
> [0]PETSC ERROR: [0] PetscHMPIHandle line 253 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/sys/objects/mpinit.c<br>
> [0]PETSC ERROR: [0] PetscHMPISpawn line 71 /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/sys/objects/mpinit.c<br>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
> [0]PETSC ERROR: Signal received!<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012<br>
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
> [0]PETSC ERROR: See docs/index.html for manual pages.<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: ../../esd-tough2/xt2_eos4 on a arch-linu named gilbert by gpau Fri Aug 31 15:20:27 2012<br>
> [0]PETSC ERROR: Libraries linked from /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/lib<br>
> [0]PETSC ERROR: Configure run at Fri Aug 31 15:16:04 2012<br>
> [0]PETSC ERROR: Configure options --with-debugging=1 --with-mpi-dir=/usr/lib/mpich2 --download-hypre=1 --prefix=/home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> Fatal error in MPI_Allreduce: Other MPI error, error stack:<br>
> MPI_Allreduce(855)........: MPI_Allreduce(sbuf=0x7fff6315ead0, rbuf=0x7fff6315eae0, count=2, MPI_INT, MPI_MAX, comm=0x84000004) failed<br>
> MPIR_Allreduce_impl(712)..:<br>
> MPIR_Allreduce_intra(534).:<br>
> dequeue_and_set_error(596): Communication error with rank 0<br>
><br>
><br>
> On Fri, Aug 31, 2012 at 2:09 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
> [see attached file: zstart.c]<br>
><br>
> On Aug 31, 2012, at 4:07 PM, George Pau <gpau@lbl.gov> wrote:
><br>
> > Hi Barry,<br>
> ><br>
> > You forgot the file ...<br>
> ><br>
> > George<br>
> ><br>
> > On Fri, Aug 31, 2012 at 2:04 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
> ><br>
> > Yikes, it is totally my fault. The handling of these merge and spawn options is done only in PetscInitialize() for C, not for Fortran, hence the arguments just got ignored.
> ><br>
> > Please find attached a file zstart.c; put it in the directory src/sys/ftn-custom and run make in that directory (with appropriate PETSC_DIR and PETSC_ARCH set).
> ><br>
> > Then link and run the example again.<br>
> ><br>
> ><br>
> > Barry<br>
> ><br>
> ><br>
> > On Aug 31, 2012, at 3:30 PM, George Pau <gpau@lbl.gov> wrote:
> ><br>
> > > Sorry, it was a cut-and-paste error. I tried running the code with all the options on the command line:
> > ><br>
> > > mpiexec.mpich2 -n 1 xt2_eos4 -hmpi_spawn_size 3 -pc_type hmpi -ksp_type preonly -hmpi_ksp_type cg -hmpi_pc_type hypre -hmpi_pc_hypre boomeramg<br>
> > ><br>
> > > mpiexec.mpich2 -n 2 xt2_eos4 -hmpi_merge_size 2 -pc_type hmpi -ksp_type preonly -hmpi_ksp_type cg -hmpi_pc_type hypre -hmpi_pc_hypre boomeramg<br>
> > ><br>
> > > but I get the exact same outputs.<br>
> > ><br>
> > > George<br>
> > ><br>
> > ><br>
> > ><br>
> > > On Fri, Aug 31, 2012 at 1:18 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
> > ><br>
> > > On Aug 31, 2012, at 3:09 PM, George Pau <gpau@lbl.gov> wrote:
> > ><br>
> > > > Hi Barry,<br>
> > > ><br>
> > > > For the hmpi_spawn_size, the options in my .petscrc are<br>
> > > > -info<br>
> > > > -pc_view<br>
> > > > pc_type hmpi<br>
> > ><br>
> > > How come there is no - in front of this one?<br>
> > ><br>
> > > > -ksp_type preonly<br>
> > > > -ksp_view<br>
> > > > -hmpi_pc_monitor<br>
> > > > -hmpi_ksp_monitor<br>
> > > > -hmpi_ksp_type cg<br>
> > > > -hmpi_pc_type hypre<br>
> > > > -hmpi_pc_hypre_type boomeramg<br>
> > > > -hmpi_spawn_size 3<br>
> > > ><br>
> > > > mpiexec.mpich2 -n 1 myprogram<br>
> > > ><br>
> > > > [0] petscinitialize_(): (Fortran):PETSc successfully started: procs 1<br>
> > > > [0] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> > > > [0] petscinitialize_(): Running on machine: gilbert<br>
> > > ><br>
> > > > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647<br>
> > > > [0] MatSetUp(): Warning not preallocating matrix storage<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 360 X 360; storage space: 3978 unneeded,3222 used<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 360<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>
> > > > [0] Mat_CheckInode(): Found 120 nodes of 360. Limit used: 5. Using Inode routines<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784<br>
> > > ><br>
> > > > Fatal error in PMPI_Bcast: Invalid communicator, error stack:<br>
> > > > PMPI_Bcast(1478): MPI_Bcast(buf=0x7fff30dacecc, count=1, MPI_INT, root=0, comm=0x0) failed<br>
> > > > PMPI_Bcast(1418): Invalid communicator<br>
> > > ><br>
> > > > I inserted some print statements between the KSP calls and found that the error occurs in
> > > ><br>
> > > > call KSPSetFromOptions(ksp, pierr)<br>
> > > ><br>
> > > > 2. If I change hmpi_spawn_size 3 to hmpi_merge_size 2 and launch my job by<br>
> > ><br>
> > > How come there is no - in front of hmpi_merge_size 2?<br>
> > ><br>
> > ><br>
> > > Can you try putting all the arguments as command line arguments instead of in a file? It shouldn't matter but it seems like some of the arguments are being ignored.<br>
> > ><br>
> > > Barry<br>
> > ><br>
> > ><br>
> > > ><br>
> > > > mpiexec.mpich2 -n 2 myprogram<br>
> > > ><br>
> > > > [0] petscinitialize_(): (Fortran):PETSc successfully started: procs 2<br>
> > > > [0] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> > > > [0] petscinitialize_(): Running on machine: gilbert<br>
> > > > [1] petscinitialize_(): (Fortran):PETSc successfully started: procs 2<br>
> > > > [1] PetscGetHostName(): Rejecting domainname, likely is NIS gilbert.(none)<br>
> > > > [1] petscinitialize_(): Running on machine: gilbert<br>
> > > ><br>
> > > > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374780 max tags = 2147483647<br>
> > > > [0] MatSetUp(): Warning not preallocating matrix storage<br>
> > > > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374782 max tags = 2147483647<br>
> > > > [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374777 max tags = 2147483647<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>
> > > > [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374780 max tags = 2147483647<br>
> > > > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780<br>
> > > > [0] MatStashScatterBegin_Private(): No of messages: 1<br>
> > > > [0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 12896<br>
> > > > [0] MatAssemblyBegin_MPIAIJ(): Stash has 1611 entries, uses 0 mallocs.<br>
> > > > [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 180 X 180; storage space: 1998 unneeded,1602 used<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 180<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>
> > > > [0] Mat_CheckInode(): Found 60 nodes of 180. Limit used: 5. Using Inode routines<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 180 X 180; storage space: 1998 unneeded,1602 used<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 180<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 9<br>
> > > > [1] Mat_CheckInode(): Found 60 nodes of 180. Limit used: 5. Using Inode routines<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>
> > > > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>
> > > > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780<br>
> > > > [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter<br>
> > > > [0] VecScatterCreate(): General case: MPI to Seq<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 180 X 3; storage space: 396 unneeded,9 used<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 3<br>
> > > > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>
> > > > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374782<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 180 X 3; storage space: 396 unneeded,9 used<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 3<br>
> > > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374780<br>
> > > > [0] VecAssemblyBegin_MPI(): Stash has 180 entries, uses 1 mallocs.<br>
> > > > [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs.<br>
> > > > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374782<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374780<br>
> > > > [0] PCSetUp(): Setting up new PC<br>
> > > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374780<br>
> > > ><br>
> > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
> > > > [0]PETSC ERROR: Nonconforming object sizes!<br>
> > > > [0]PETSC ERROR: HMPI preconditioner only works for sequential solves!<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012<br>
> > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
> > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
> > > > [0]PETSC ERROR: See docs/index.html for manual pages.<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: ../../esd-tough2/xt2_eos4 on a arch-linu named gilbert by gpau Fri Aug 31 13:00:31 2012<br>
> > > > [0]PETSC ERROR: Libraries linked from /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/lib<br>
> > > > [0]PETSC ERROR: Configure run at Thu Aug 30 15:27:17 2012<br>
> > > > [0]PETSC ERROR: Configure options --with-debugging=0 --with-mpi-dir=/usr/lib/mpich2 --download-hypre=1 --prefix=/home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: PCCreate_HMPI() line 283 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/impls/openmp/hpc.c<br>
> > > > [0]PETSC ERROR: PCSetType() line 83 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/interface/pcset.c<br>
> > > > [0]PETSC ERROR: PCSetFromOptions() line 188 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/interface/pcset.c<br>
> > > > [0]PETSC ERROR: KSPSetFromOptions() line 287 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/ksp/interface/itcl.c<br>
> > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
> > > > [0]PETSC ERROR: No support for this operation for this object type!<br>
> > > > [0]PETSC ERROR: PC does not have apply!<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012<br>
> > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
> > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
> > > > [0]PETSC ERROR: See docs/index.html for manual pages.<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: ../../esd-tough2/xt2_eos4 on a arch-linu named gilbert by gpau Fri Aug 31 13:00:31 2012<br>
> > > > [0]PETSC ERROR: Libraries linked from /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/lib<br>
> > > > [0]PETSC ERROR: Configure run at Thu Aug 30 15:27:17 2012<br>
> > > > [0]PETSC ERROR: Configure options --with-debugging=0 --with-mpi-dir=/usr/lib/mpich2 --download-hypre=1 --prefix=/home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib<br>
> > > > [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> > > > [0]PETSC ERROR: PCApply() line 382 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/pc/interface/precon.c<br>
> > > > [0]PETSC ERROR: KSPInitialResidual() line 64 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/ksp/interface/itres.c<br>
> > > > [0]PETSC ERROR: KSPSolve_GMRES() line 230 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/ksp/impls/gmres/gmres.c<br>
> > > > [0]PETSC ERROR: KSPSolve() line 446 in /home/gpau/tough_codes/esd-tough2/build/Linux-x86_64-Debug-MPI-EOS4/toughlib/tpls/petsc/petsc-3.3-p3-source/src/ksp/ksp/interface/itfunc.c<br>
> > > ><br>
> > > > I note that the error appears to occur at the same point.<br>
> > > ><br>
> > > > George<br>
> > > ><br>
> > > ><br>
> > > > On Fri, Aug 31, 2012 at 11:31 AM, Barry Smith <bsmith@mcs.anl.gov> wrote:
> > > ><br>
> > > > On Aug 31, 2012, at 1:27 PM, George Pau <gpau@lbl.gov> wrote:
> > > ><br>
> > > > > Hi Barry,<br>
> > > > ><br>
> > > > > 1. It is the exact same error related to MPI_ERR_COMM and MPI_Bcast.<br>
> > > ><br>
> > > > That should not happen. Please run and send all the output including the exact command line used<br>
> > > ><br>
> > > ><br>
> > > > > I am currently using the MPICH2 distribution provided by Ubuntu, but if the MPICH version that PETSc downloads with --download-mpich works, I can use that.
> > > > > 2. If I use hmpi_merge_size, I will need to launch mpiexec with more than 1 CPU. But PETSc will complain that the pc_type hmpi can only be used in serial.
> > > ><br>
> > > > That should not happen. Run with 2 MPI processes and -hmpi_merge_size 2 and send the complete error message.<br>
> > > ><br>
> > > ><br>
> > > > Barry<br>
> > > ><br>
> > > > ><br>
> > > > > George<br>
> > > > ><br>
> > > > ><br>
> > > > > On Aug 31, 2012, at 11:17 AM, Barry Smith wrote:<br>
> > > > ><br>
> > > > >><br>
> > > > >> On Aug 30, 2012, at 10:02 PM, George Pau <gpau@lbl.gov> wrote:
> > > > >><br>
> > > > >>> Hi Barry,<br>
> > > > >>><br>
> > > > >>> I tried with the addition of<br>
> > > > >>><br>
> > > > >>> -hmpi_spawn_size 3<br>
> > > > >>><br>
> > > > >>> but I am still getting the same error though.<br>
> > > > >><br>
> > > > >> The EXACT same error? Or some other error?<br>
> > > > >><br>
> > > > >> What happens if you run with the -hmpi_merge_size <size> option instead?<br>
> > > > >><br>
> > > > >> Barry<br>
> > > > >><br>
> > > > > >> 1) I am getting a crash with the spawn version that I suspect is due to bugs related to spawn in the MPICH version I am using.
> > > > > >>
> > > > > >> 2) I am getting errors with the merge version due to Apple's ASLR, which they make hard to turn off.
> > > > >><br>
> > > > >><br>
> > > > >>> I am using mpich2. Any other options to try?<br>
> > > > >>><br>
> > > > >>> George<br>
> > > > >>><br>
> > > > >>><br>
> > > > >>> On Aug 30, 2012, at 7:28 PM, Barry Smith wrote:<br>
> > > > >>><br>
> > > > >>>><br>
> > > > >>>> On Aug 30, 2012, at 7:24 PM, George Pau <gpau@lbl.gov> wrote:
> > > > >>>><br>
> > > > >>>>> Hi,<br>
> > > > >>>>><br>
> > > > >>>>> I have some issues using -pc_type hmpi. I used the same settings found at
> > > > >>>>><br>
> > > > >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCHMPI.html
> > > > >>>>><br>
> > > > >>>>> i.e.<br>
> > > > >>>>> -pc_type hmpi<br>
> > > > >>>>> -ksp_type preonly<br>
> > > > >>>>> -hmpi_ksp_type cg<br>
> > > > >>>>> -hmpi_pc_type hypre<br>
> > > > >>>>> -hmpi_pc_hypre_type boomeramg<br>
> > > > >>>>><br>
> > > > >>>>> My command is<br>
> > > > >>>>><br>
> > > > >>>>> mpiexec -n 1 myprogram<br>
> > > > >>>><br>
> > > > >>>> Sorry, the documentation doesn't make this clearer. You need to start PETSc with special options to get the "worker" processes initialized. From the manual page for PCHMPI:
> > > > >>>><br>
> > > > >>>> See PetscHMPIMerge() and PetscHMPISpawn() for two ways to start up MPI for use with this preconditioner<br>
> > > > >>>><br>
> > > > >>>> This will tell you what options to start PETSc up with.
> > > > >>>><br>
> > > > >>>> I will fix the PC so that it prints a far more useful error message.<br>
> > > > >>>><br>
> > > > >>>><br>
> > > > >>>><br>
> > > > >>>> Barry<br>
> > > > >>>><br>
> > > > >>>><br>
> > > > >>>>><br>
> > > > >>>>> But, I get<br>
> > > > >>>>><br>
> > > > >>>>> [gilbert:4041] *** An error occurred in MPI_Bcast<br>
> > > > >>>>> [gilbert:4041] *** on communicator MPI_COMM_WORLD<br>
> > > > >>>>> [gilbert:4041] *** MPI_ERR_COMM: invalid communicator<br>
> > > > >>>>> [gilbert:4041] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)<br>
> > > > >>>>><br>
> > > > >>>>> with openmpi. I get a similar error with mpich2:
> > > > >>>>><br>
> > > > >>>>> Fatal error in PMPI_Bcast: Invalid communicator, error stack:<br>
> > > > >>>>> PMPI_Bcast(1478): MPI_Bcast(buf=0x7fffb683479c, count=1, MPI_INT, root=0, comm=0x0) failed<br>
> > > > >>>>> PMPI_Bcast(1418): Invalid communicator<br>
> > > > >>>>><br>
> > > > >>>>> I couldn't figure out what is wrong. My PETSc is version 3.3-p3 and the configuration is --with-debugging=0 --with-mpi-dir=/usr/lib/openmpi --download-hypre=1, and I am on an Ubuntu machine.
> > > > >>>>><br>
> > > > >>>>> Note that with the default pc_type and ksp_type, everything is fine. It was also tested with multiple processors. I am wondering whether there are some options that I am not specifying correctly?
> > > > >>>>><br>
> > > > >>>>> --<br>
> > > > >>>>> George Pau<br>
> > > > >>>>> Earth Sciences Division<br>
> > > > >>>>> Lawrence Berkeley National Laboratory<br>
> > > > >>>>> One Cyclotron, MS 74-120<br>
> > > > >>>>> Berkeley, CA 94720<br>
> > > > >>>>><br>
> > > > >>>>> (510) 486-7196<br>
> > > > >>>>> gpau@lbl.gov
> > > > >>>>> http://esd.lbl.gov/about/staff/georgepau/
> > > > >>>>><br>
> > > > >>>><br>
> > > > >>><br>
> > > > >><br>
> > > > ><br>
> > > ><br>
> > > ><br>
> > > ><br>
> > > ><br>
> > > > --<br>
> > > > George Pau<br>
> > > > Earth Sciences Division<br>
> > > > Lawrence Berkeley National Laboratory<br>
> > > > One Cyclotron, MS 74-120<br>
> > > > Berkeley, CA 94720<br>
> > > ><br>
> > > > (510) 486-7196
> > > > gpau@lbl.gov
> > > > http://esd.lbl.gov/about/staff/georgepau/
> > > ><br>
> > ><br>
> > ><br>
> > ><br>
> > ><br>
> > > --<br>
> > > George Pau<br>
> > > Earth Sciences Division<br>
> > > Lawrence Berkeley National Laboratory<br>
> > > One Cyclotron, MS 74-120<br>
> > > Berkeley, CA 94720<br>
> > ><br>
> > > (510) 486-7196
> > > gpau@lbl.gov
> > > http://esd.lbl.gov/about/staff/georgepau/
> > ><br>
> ><br>
> ><br>
> ><br>
> ><br>
> > --<br>
> > George Pau<br>
> > Earth Sciences Division<br>
> > Lawrence Berkeley National Laboratory<br>
> > One Cyclotron, MS 74-120<br>
> > Berkeley, CA 94720<br>
> ><br>
> > (510) 486-7196
> > gpau@lbl.gov
> > http://esd.lbl.gov/about/staff/georgepau/
> ><br>
><br>
><br>
><br>
> --<br>
> George Pau<br>
> Earth Sciences Division<br>
> Lawrence Berkeley National Laboratory<br>
> One Cyclotron, MS 74-120<br>
> Berkeley, CA 94720<br>
><br>
> (510) 486-7196
> gpau@lbl.gov
> http://esd.lbl.gov/about/staff/georgepau/
><br>

-- 
George Pau
Earth Sciences Division
Lawrence Berkeley National Laboratory
One Cyclotron, MS 74-120
Berkeley, CA 94720

(510) 486-7196
gpau@lbl.gov
http://esd.lbl.gov/about/staff/georgepau/