[petsc-dev] Bad scaling of GAMG in FieldSplit

Mark Adams mfadams at lbl.gov
Fri Jul 27 10:15:27 CDT 2018


On Fri, Jul 27, 2018 at 11:12 AM Jed Brown <jed at jedbrown.org> wrote:

> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>
> > Everything is fine with GAMG, I think; please find the (trimmed)
> > -eps_view attached. The problem is that, correct me if I’m wrong, there is
> > no easy way to redistribute data efficiently from within PETSc when using
> > fieldsplit with an unbalanced number of unknowns per field. For the other
> > three fields, the solvers are still behaving somewhat properly. Now if I’d
> > like to optimize this some more, I’d probably need to switch from a
> > fieldsplit to a MatNest, with submatrices on different communicators, so
> > that I don’t have all processes handling the pressure space. But this is
> > apparently not allowed.
>
> What if pressure is still on a global communicator, but all the degrees
> of freedom are in a subset?  Then MatMult and the like have nothing to
> send or receive on the processes without any dofs.  Since there are no
> reductions in PCApply (there are in setup), it should complete
> immediately for all the processes that don't have any dofs, right?
>

Yes, idle processes should just fall through.
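
For illustration (this sketch is not part of the original exchange), here is
a minimal example of the layout Jed describes: a matrix that lives on
PETSC_COMM_WORLD but whose rows are all owned by a subset of the ranks
(rank 0 here). The size N and the diagonal operator are illustrative
assumptions, not anything from the thread.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PetscMPIInt rank;
  PetscInt    nloc, N = 100, i;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* All N "pressure" dofs on rank 0; every other rank owns zero rows. */
  nloc = (rank == 0) ? N : 0;

  /* Diagonal operator, purely for illustration. */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, nloc, nloc, N, N, 1, NULL, 0, NULL, &A));
  for (i = 0; i < nloc; i++) PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* Ranks that own no rows have nothing to send or receive in MatMult and
     nothing to do in PCApply (block Jacobi by default); they only take part
     in the Krylov method's global reductions, so they mostly fall through. */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}

Inside a PCFieldSplit pressure solve, where PCApply itself has no
reductions, the dof-less ranks would complete essentially immediately, as
noted above.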
