PETSc communicator

Matt Funk mafunk at nmsu.edu
Tue Aug 22 15:46:04 CDT 2006


Hi Barry,

thanks for the clarification. I am running my code on a different (much 
slower) machine right now, and the initial results so far suggest that 
Matt Knepley's suspicion of a bad network could be correct. But I need 
to do a couple more runs. 

thanks
mat

On Tuesday 22 August 2006 13:34, Barry Smith wrote:
>   Mat,
>
>     This will not affect load balance or anything like that.
>
> When you pass a communicator like MPI_COMM_WORLD to PETSc,
> we don't actually use that communicator directly (because you
> might be using it yourself and there could be tag collisions,
> etc.). Instead we store our own communicator inside
> MPI_COMM_WORLD as an attribute; this message is just telling
> you that we are accessing that inner communicator.
>
>    Barry
>
> On Thu, 17 Aug 2006, Matt Funk wrote:
> > Hi,
> >
> > I was wondering what the message:
> > 'PetscCommDuplicate Using internal PETSc communicator 92 170'
> > means exactly. I still have load-balance issues with PETSc when running
> > on 1 vs. 2 procs.
> > However, when run on 2 vs. 4 procs the balance seems to be almost perfect.
> > The possibility of a bad network was suggested to me, but since
> > the 4 vs. 2 proc case is fine, that does not necessarily seem to be the cause.
> >
> > Maybe somebody can tell me what it means?
> >
> > thanks
> > mat
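For illustration, here is a minimal sketch of the attribute-caching idea Barry describes above: a library duplicates the user's communicator once and caches the duplicate as an MPI attribute on it, so the library's messages can never collide with the user's own tags on MPI_COMM_WORLD. This is not PETSc's actual source; the helper name get_inner_comm() is made up (PETSc's real routine is PetscCommDuplicate(), whose internals differ).

#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>

/* Keyval identifying the library's cached attribute on a user communicator. */
static int inner_keyval = MPI_KEYVAL_INVALID;

/* Return the library's inner communicator for user_comm, duplicating and
 * caching it on first use.  Hypothetical helper, for illustration only. */
static MPI_Comm get_inner_comm(MPI_Comm user_comm)
{
  MPI_Comm *inner;
  int       found;

  if (inner_keyval == MPI_KEYVAL_INVALID)
    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, MPI_COMM_NULL_DELETE_FN,
                           &inner_keyval, NULL);

  MPI_Comm_get_attr(user_comm, inner_keyval, &inner, &found);
  if (!found) {
    /* First call on this communicator: duplicate it and cache the duplicate
     * as an attribute (cleanup on communicator destruction omitted here). */
    inner = malloc(sizeof(MPI_Comm));
    MPI_Comm_dup(user_comm, inner);
    MPI_Comm_set_attr(user_comm, inner_keyval, inner);
  }
  return *inner;
}

int main(int argc, char **argv)
{
  MPI_Comm a, b;
  int      result;

  MPI_Init(&argc, &argv);
  a = get_inner_comm(MPI_COMM_WORLD);
  b = get_inner_comm(MPI_COMM_WORLD);   /* second call finds the cached duplicate */
  MPI_Comm_compare(a, b, &result);
  if (result == MPI_IDENT)
    printf("both calls returned the same inner communicator\n");
  MPI_Finalize();
  return 0;
}

On the second call MPI_Comm_get_attr finds the cached duplicate, so only one extra communicator is ever created per user communicator; the diagnostic message in the original question is simply reporting that this cached inner communicator is being reused.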



