[petsc-users] Using PetscBarrier or MPI_barrier in Fortran 90

Matthew Knepley knepley at gmail.com
Fri Aug 10 08:31:14 CDT 2012


On Fri, Aug 10, 2012 at 7:57 AM, Koki Imada <koki.imada at york.ac.uk> wrote:

> Hello,
>
> I'm fairly new to PETSc/MPI programming, and I'm stuck with the use of
> barriers. I have a code of the following form:
>
> 0. Initialise PETSc...
> 1. Processor 1 (or the "root") reports to user (using "print" statement).
> 2. "Root" does some work, while all other processors do nothing.
> 3. "Root" allocates array "A".
> 4. All processors do some work on array "A".
> 5. "Root" deallocates array "A".
> 6. PETSc is finalised.
>
> In order for this to work, I would like to put barriers between steps 3&4,
> and steps 4&5, and also between steps 2&3 (optional).
>
> Simply putting "call MPI_Barrier(MPI_COMM_WORLD, ierr)" between the
> steps doesn't do anything. I assume this fails because of the lack of
> communication between the processors (or for some other reason), but I have
> never been able to find the "correct" way to implement the barrier...
>
> If somebody could show me how to correctly implement the barrier (either
> MPI or PETSc) between the steps as described above, I will be most grateful.
>

You really need to read
http://www.mcs.anl.gov/research/projects/mpi/usingmpi/. Different processes
do not have access to the same memory.
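A minimal MPI-only Fortran 90 sketch of that point (hypothetical names and
sizes, not from the thread; in a PETSc code, PetscInitialize/PetscFinalize
would take the place of MPI_Init/MPI_Finalize): every rank allocates its own
copy of A, the root's data has to be sent explicitly, e.g. with MPI_Bcast,
and MPI_Barrier only makes the ranks wait for each other, it never moves data.

  program share_array
    use mpi
    implicit none
    integer, parameter :: n = 100
    integer :: ierr, rank
    double precision, allocatable :: A(:)

    call MPI_Init(ierr)
    call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

    ! Each rank has its own address space, so each rank needs its own A.
    allocate(A(n))

    if (rank == 0) then
      A = 1.0d0          ! root computes/fills its copy of A
    end if

    ! Explicitly copy root's data into every other rank's copy of A.
    call MPI_Bcast(A, n, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)

    ! The correct Fortran barrier call; it is only needed when the ranks
    ! must wait for each other, and it transfers no data.
    call MPI_Barrier(MPI_COMM_WORLD, ierr)

    ! ... all ranks now work on their own copy of A ...

    deallocate(A)
    call MPI_Finalize(ierr)
  end program share_array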

    Matt


> Best Regards,
>
> Koki
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener