[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

Kong, Fande fande.kong at inl.gov
Tue Apr 3 15:34:34 CDT 2018


On Tue, Apr 3, 2018 at 2:12 PM, Rob Falgout hypre Tracker <
hypre-support at llnl.gov> wrote:

>
> Rob Falgout <rfalgout at llnl.gov> added the comment:
>
> Hi Barry,
>
> Can you explain the scenario where multiple hypre solves would be working
> in parallel with the same communicator?  Thanks!
>

Hi Rob,

For instance, if an application has multiple fields and each field is
solved by its own HYPRE solver, then we end up with multiple hypre solves.
These solves are not run simultaneously; they run one by one. If we assign
the same communicator to all of them, is there any potential for a tag
conflict?

Fande,



>
> -Rob
>
> ____________________________________________
> hypre Issue Tracker <hypre-support at llnl.gov>
> <http://cascb1.llnl.gov/hypre/issue1595>
> ____________________________________________
>