[petsc-users] Petsc cannot be initialized on vesta in some --mode options

Roc Wang pengxwang at hotmail.com
Wed Jan 22 12:39:36 CST 2014



> From: jed at jedbrown.org
> To: pengxwang at hotmail.com; petsc-users at mcs.anl.gov
> CC: jeff.science at gmail.com
> Subject: RE: [petsc-users] Petsc cannot be initialized on vesta in some --mode options
> Date: Wed, 22 Jan 2014 11:24:21 -0700
> 
> Roc Wang <pengxwang at hotmail.com> writes:
> > The configure.log is attached.
> > I also tried running the program with command like:
> > qsub -n <number of nodes> -t 10 --mode <ranks per node> --env "F00=a:BAR=b:PAMID_COLLECTIVES=0" ./x.r 
> >
> > For 1024 as the total number of ranks, the program was able to run in
> > c1, c16, c32, and c64 mode, but still petsc cannot be initialized in
> > c2, c4, and c8 mode. The runtime log files for c8 mode were
> > attached. Thanks.
> 
> Hmm, appears to be this line erroring.
> 
>       ierr     = MPI_Bcast(&len,1,MPIU_SIZE_T,0,PETSC_COMM_WORLD);CHKERRQ(ierr);
> 
> But at least some processes are stuck in an earlier
> PetscOptionsInsertFile (which contains MPI_Bcast).  This is like the
> problem we have seen before, but it used to go away with
> PAMID_COLLECTIVES=0.  Can you run with -skip_petscrc?  That will skip a
> couple MPI_Bcasts and might get it working again.
> 
> 
> Jeff, has the Vesta stack changed recently?  Any reason why
> PAMID_COLLECTIVES=0 would not be taking effect here?
> 
> I'll try to reproduce Roc's findings later today.

Yes, the program works with -skip_petscrc. I am now trying to run the full PETSc solver to see what happens.
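For reference, the option just goes on the executable's argument list; with the same qsub format as before (node and rank counts are placeholders), the run command looks something like:

  qsub -n <number of nodes> -t 10 --mode <ranks per node> --env "PAMID_COLLECTIVES=0" ./x.r -skip_petscrc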

I just recalled that there was an error caused by calling MPI_Bcast() when I ported the program from another machine to vesta. At that time I simply replaced the MPI_Bcast() call. I am not sure whether it is the same problem as the one we have here, but just for your information.
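By "replaced" I mean something along the lines of the sketch below (hypothetical, not the exact code I used back then): rank 0 sends the value to every other rank with point-to-point messages instead of calling the collective broadcast.

#include <mpi.h>

/* Hypothetical sketch only: broadcast a single int from `root` using
 * point-to-point messages instead of the MPI_Bcast collective. */
static int BcastIntViaP2P(int *val, int root, MPI_Comm comm)
{
  int rank, size, p, ierr;
  ierr = MPI_Comm_rank(comm, &rank); if (ierr) return ierr;
  ierr = MPI_Comm_size(comm, &size); if (ierr) return ierr;
  if (rank == root) {
    /* Root sends the value to every other rank. */
    for (p = 0; p < size; p++) {
      if (p == root) continue;
      ierr = MPI_Send(val, 1, MPI_INT, p, 0, comm); if (ierr) return ierr;
    }
  } else {
    /* Non-root ranks receive the value from root. */
    ierr = MPI_Recv(val, 1, MPI_INT, root, 0, comm, MPI_STATUS_IGNORE);
    if (ierr) return ierr;
  }
  return 0;
}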

