[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

Jed Brown jed at jedbrown.org
Tue Apr 3 17:37:49 CDT 2018


This is likely in the context of a preconditioner for a composite solve,
so the many BoomerAMG solvers will persist.  They all get set up and
then applied in sequence to the many scalar systems at each iteration of
an outer method.
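
For concreteness, a setup like this can be requested through PETSc's
fieldsplit preconditioner, roughly as follows (a sketch only, not
MOOSE's actual option set; the multiplicative split and the two fields
are placeholders):

  -ksp_type gmres
  -pc_type fieldsplit
  -pc_fieldsplit_type multiplicative
  -fieldsplit_0_ksp_type preonly
  -fieldsplit_0_pc_type hypre
  -fieldsplit_0_pc_hypre_type boomeramg
  -fieldsplit_1_ksp_type preonly
  -fieldsplit_1_pc_type hypre
  -fieldsplit_1_pc_hypre_type boomeramg

Each split keeps its own persistent BoomerAMG hierarchy, so the number
of live hypre solvers (and of communicators, if each one dups its own)
grows with the number of fields.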

If hypre doesn't want to dup its communicators internally, then PETSc
will dup a communicator for hypre, attach it to the outer communicator
as an attribute, and look it up each time a new hypre object is
created.  That is easy for PETSc (we have done the same thing for
PETSc objects since the early days), but a headache for other
libraries.
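
A minimal sketch of that attach-and-look-up pattern using plain MPI
attribute caching (not PETSc's actual code; the names get_hypre_comm
and hypre_comm_keyval are made up for illustration):

  #include <mpi.h>
  #include <stdlib.h>

  static int hypre_comm_keyval = MPI_KEYVAL_INVALID;

  /* Free the cached duplicate when the outer communicator is destroyed. */
  static int free_cached_comm(MPI_Comm comm, int keyval, void *attr, void *extra)
  {
    MPI_Comm *cached = (MPI_Comm *)attr;
    MPI_Comm_free(cached);
    free(cached);
    return MPI_SUCCESS;
  }

  /* Duplicate the outer communicator once, cache the duplicate as an
     attribute, and hand the same duplicate to every hypre object that
     is later created on that outer communicator. */
  MPI_Comm get_hypre_comm(MPI_Comm outer)
  {
    MPI_Comm *cached;
    int       found;

    if (hypre_comm_keyval == MPI_KEYVAL_INVALID)
      MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, free_cached_comm,
                             &hypre_comm_keyval, NULL);

    MPI_Comm_get_attr(outer, hypre_comm_keyval, &cached, &found);
    if (!found) {
      cached = (MPI_Comm *)malloc(sizeof(MPI_Comm));
      MPI_Comm_dup(outer, cached);
      MPI_Comm_set_attr(outer, hypre_comm_keyval, cached);
    }
    return *cached;
  }

With something like this, creating any number of hypre solvers on the
same outer communicator costs one duplicated communicator total rather
than one per solver.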

"Li, Ruipeng" <li50 at llnl.gov> writes:

> Hi, Fande,
>
> "For instance, if an application has multiple fields and each field will be solved by a HYPRE solver,  then we end up with  multiple hypre solves. The multiple hypre solves are not run simultaneously, instead, it will be run one-by-one. If we assign the same communicator to the multiple hypre solves, any potential tag conflict?"
>
> In this scenario, is it just the solve phase of BoomerAMG that runs one after another, or the setup of AMG as well? Since BoomerAMG takes the communicator from the A matrix, if the entire hypre solve (setup + solve) runs sequentially, there should be no conflicts.
>
> -Ruipeng
>
> . -- .- .. .-.. / ..-. .-. --- -- / .-. ..- .. .--. . -. --. / .-.. ..
>  Ruipeng Li
>  Center for Applied Scientific Computing
>  Lawrence Livermore National Laboratory
>  P.O. Box 808, L-561
>  Livermore, CA 94551
>  phone - (925) 422-6037,  email - li50 at llnl.gov
>
>
> ________________________________________
> From: Kong, Fande <fande.kong at inl.gov>
> Sent: Tuesday, April 3, 2018 1:34 PM
> To: hypre-support
> Cc: Barry Smith; Derek Gaston; Li, Ruipeng; Osei-Kuffuor, Daniel; For users of the development version of PETSc; Schroder, Jacob B.; tzanio at llnl.gov; umyang at llnl.gov; Wang, Lu
> Subject: Re: [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose
>
>
>
> On Tue, Apr 3, 2018 at 2:12 PM, Rob Falgout hypre Tracker <hypre-support at llnl.gov> wrote:
>
> Rob Falgout <rfalgout at llnl.gov> added the comment:
>
> Hi Barry,
>
> Can you explain the scenario where multiple hypre solves would be working in parallel with the same communicator?  Thanks!
>
> Hi Rob,
>
> For instance, if an application has multiple fields and each field will be solved by a HYPRE solver, then we end up with multiple hypre solves. The multiple hypre solves are not run simultaneously; instead, they are run one-by-one. If we assign the same communicator to the multiple hypre solves, is there any potential tag conflict?
>
> Fande,
>
>
>
> -Rob
>
> ____________________________________________
> hypre Issue Tracker <hypre-support at llnl.gov>
> <http://cascb1.llnl.gov/hypre/issue1595>
> ____________________________________________

