[petsc-users] handling multi physics applications on multiple MPI_Comm

Matthew Knepley knepley at gmail.com
Mon Jul 25 15:21:24 CDT 2016


On Mon, Jul 25, 2016 at 1:13 PM, Manav Bhatia <bhatiamanav at gmail.com> wrote:

> Hi,
>
>     I have a multi physics application with discipline1 defined on comm1
> and discipline2 on comm2.
>
>     My intent is to use the nested matrix for the KSP solver where each
> diagonal block is provided by the disciplines, and the off-diagonal blocks
> are defined as shell-matrices with matrix vector products.
>
>     I am a bit unclear about how to deal with the case of different sets of
> processors on comm1 and comm2. I have the following questions and would
> appreciate some guidance:
>
> — Would it make sense to define a comm_global as a union of comm1 and
> comm2 for the MatCreateNest?
>
> — The diagonal blocks are available on comm1 and comm2 only. Should
> MatAssemblyBegin/End for these diagonal blocks be called on comm1 and comm2
> separately?
>
> — What comm should be used for the off-diagonal shell matrices?
>
> — Likewise, when calling VecGetSubVector and VecRestoreSubVector to get
> sub-vectors corresponding to discipline1 (or 2), on what comm should these
> function calls be made?
>
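
A minimal sketch of the 2x2 layout described above: assembled diagonal blocks
and MATSHELL off-diagonal blocks, with everything created on a single
communicator for simplicity. The local sizes and the coupling callbacks
CouplingMult12/CouplingMult21 are hypothetical placeholders.

#include <petscmat.h>

/* Hypothetical user callbacks implementing the coupling mat-vec products */
extern PetscErrorCode CouplingMult12(Mat A, Vec x, Vec y);
extern PetscErrorCode CouplingMult21(Mat A, Vec x, Vec y);

PetscErrorCode BuildNest(Mat *A)
{
  PetscErrorCode ierr;
  PetscInt       n1 = 100, n2 = 80;            /* hypothetical local sizes */
  Mat            A11, A22, A12, A21, blocks[4];

  /* Diagonal blocks: ordinary assembled matrices, one per discipline */
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, n1, n1, PETSC_DETERMINE, PETSC_DETERMINE,
                      10, NULL, 10, NULL, &A11);CHKERRQ(ierr);
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, n2, n2, PETSC_DETERMINE, PETSC_DETERMINE,
                      10, NULL, 10, NULL, &A22);CHKERRQ(ierr);
  /* ... MatSetValues() on A11/A22, then MatAssemblyBegin/End ... */

  /* Off-diagonal coupling blocks: shell matrices providing only MatMult */
  ierr = MatCreateShell(PETSC_COMM_WORLD, n1, n2, PETSC_DETERMINE, PETSC_DETERMINE,
                        NULL, &A12);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A12, MATOP_MULT, (void (*)(void))CouplingMult12);CHKERRQ(ierr);
  ierr = MatCreateShell(PETSC_COMM_WORLD, n2, n1, PETSC_DETERMINE, PETSC_DETERMINE,
                        NULL, &A21);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A21, MATOP_MULT, (void (*)(void))CouplingMult21);CHKERRQ(ierr);

  /* 2x2 nested operator; blocks are passed in row-major order */
  blocks[0] = A11; blocks[1] = A12; blocks[2] = A21; blocks[3] = A22;
  ierr = MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, A);CHKERRQ(ierr);
  return 0;
}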

I would first ask if you have a convincing reason for doing this, because
it sounds like the genesis of a million programming errors.

All the linear algebra objects would have to live on a global comm that
contains any subcomms you want to use. I don't think it would make sense
to define submatrices on subcomms. You can certainly have your assembly
code run on a subcomm, but again this is a tricky business and I find it
hard to understand the gain.
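
A minimal sketch of that arrangement, assuming a hypothetical two-way split
of the ranks: the diagonal block for discipline 1 is created on the global
comm, ranks of the other discipline simply own zero rows of it, and every
rank takes part in the collective MatAssemblyBegin/End.

#include <petscmat.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank, size, color;
  MPI_Comm       subcomm;     /* discipline-local comm, for assembly code only */
  PetscInt       nlocal;
  Mat            A11;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* hypothetical split: the first half of the ranks handle discipline 1 */
  color = (rank < size/2) ? 0 : 1;
  MPI_Comm_split(PETSC_COMM_WORLD, color, rank, &subcomm);

  /* The discipline-1 diagonal block lives on the GLOBAL comm; discipline-2
     ranks own zero rows/columns of it. */
  nlocal = (color == 0) ? 100 : 0;   /* hypothetical local size */
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE,
                      10, NULL, 10, NULL, &A11);CHKERRQ(ierr);

  if (color == 0) {
    /* ... discipline-1 assembly code inserts values, possibly using subcomm
       internally; discipline-2 ranks insert nothing ... */
  }
  /* Assembly is collective on the matrix's comm, so every rank of
     PETSC_COMM_WORLD must call it, even those that added no entries. */
  ierr = MatAssemblyBegin(A11, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A11, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A11);CHKERRQ(ierr);
  MPI_Comm_free(&subcomm);
  ierr = PetscFinalize();
  return ierr;
}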

   Matt


> Thanks,
> Manav
>
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener