[petsc-users] computations on split mpi comm
Barry Smith
bsmith at mcs.anl.gov
Thu Mar 10 19:10:16 CST 2016
> On Mar 10, 2016, at 6:00 PM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>
> Hi,
>
> My interest is in running two separate KSP contexts on two subsets of the global mpi communicator context. Is there an example that demonstrates this?
No, but it is very simple. Create the two communicators with MPI_Comm_split() (or some other mechanism) and then create the matrix, vector, and solver objects for each solve on its own subcommunicator.
>
> My intention is for the two subsets to have an overlap. For example, on a 4 processor global communicator (0, 1, 2, 3), one subset could be {0} and the second {0, 1, 2, 3}, or perhaps {0, 1} and {1, 2, 3}.
This can only work if the two solves are not run at the same time, since KSPSolve() and friends are blocking: you could not start the second solve on process 0 until the first one has completed.
If you truly want to run them "at the same time" you would need multiple threads on each process that belongs to both communicators (that is, two threads, each running with its own subcommunicator). Trying to do this is IMHO completely insane; it is better to use additional MPI processes and have no overlapping communicators.
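A minimal sketch of the approach described above (not from the original thread). Note that MPI_Comm_split() assigns each rank exactly one color, so it cannot produce the *overlapping* subsets {0, 1} and {1, 2, 3}; those must be built from explicit groups with MPI_Group_incl() and MPI_Comm_create(). The sketch uses modern-PETSc PetscCall() error checking; the rank lists and the second-solve placement are illustrative assumptions, and the two KSPSolve() calls must run one after the other on the shared rank 1:

```c
/* Sketch: two overlapping subcommunicators {0,1} and {1,2,3} of
 * PETSC_COMM_WORLD, each with its own KSP, solved sequentially.
 * Assumes PETSc is configured and the program is launched on >= 4 ranks. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  MPI_Group world_grp, grpA, grpB;
  MPI_Comm  commA, commB;
  int       ranksA[] = {0, 1};        /* illustrative subset {0,1}   */
  int       ranksB[] = {1, 2, 3};     /* illustrative subset {1,2,3} */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_group(PETSC_COMM_WORLD, &world_grp));

  /* MPI_Comm_split() gives each rank one color, so overlapping subsets
   * are built from explicit groups instead. MPI_Comm_create() is
   * collective over the parent communicator, so every rank calls it;
   * ranks outside the group receive MPI_COMM_NULL. */
  PetscCallMPI(MPI_Group_incl(world_grp, 2, ranksA, &grpA));
  PetscCallMPI(MPI_Group_incl(world_grp, 3, ranksB, &grpB));
  PetscCallMPI(MPI_Comm_create(PETSC_COMM_WORLD, grpA, &commA));
  PetscCallMPI(MPI_Comm_create(PETSC_COMM_WORLD, grpB, &commB));

  if (commA != MPI_COMM_NULL) {       /* first solve, on {0,1} only */
    KSP ksp;
    PetscCall(KSPCreate(commA, &ksp));
    /* ... create Mat/Vec on commA, KSPSetOperators(), KSPSolve() ... */
    PetscCall(KSPDestroy(&ksp));
  }
  /* Because KSPSolve() blocks, the second solve can only start on the
   * shared rank 1 after the first has completed. */
  if (commB != MPI_COMM_NULL) {       /* second solve, on {1,2,3} only */
    KSP ksp;
    PetscCall(KSPCreate(commB, &ksp));
    /* ... create Mat/Vec on commB, KSPSetOperators(), KSPSolve() ... */
    PetscCall(KSPDestroy(&ksp));
  }

  if (commA != MPI_COMM_NULL) PetscCallMPI(MPI_Comm_free(&commA));
  if (commB != MPI_COMM_NULL) PetscCallMPI(MPI_Comm_free(&commB));
  PetscCallMPI(MPI_Group_free(&grpA));
  PetscCallMPI(MPI_Group_free(&grpB));
  PetscCallMPI(MPI_Group_free(&world_grp));
  PetscCall(PetscFinalize());
  return 0;
}
```

With non-overlapping subsets, a single MPI_Comm_split() call (e.g. color 0 for ranks 0-1, color 1 for ranks 2-3) replaces the group construction above, and the two solves can then truly run concurrently.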
Barry
>
> Any guidance would be appreciated.
>
> Thanks,
> Manav
>
>