[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

Rob Falgout hypre Tracker hypre-support at llnl.gov
Tue Apr 3 14:46:02 CDT 2018


Rob Falgout <rfalgout at llnl.gov> added the comment:

Hi Barry,

It looks like the only time we call MPI_Comm_create is to build a communicator for the coarsest-grid solve using Gaussian elimination.  There are probably alternatives that do not require creating a sub-communicator.  Ulrike or someone else more familiar with the code should comment.

I don't see a need to do an MPI_Comm_dup() before calling hypre.

Hope this helps.

-Rob

----------
status: unread -> chatting

____________________________________________
hypre Issue Tracker <hypre-support at llnl.gov>
<http://cascb1.llnl.gov/hypre/issue1595>
____________________________________________
