[petsc-dev] Bad scaling of GAMG in FieldSplit

Jed Brown jed at jedbrown.org
Fri Jul 27 10:33:57 CDT 2018


Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:

>> On 27 Jul 2018, at 5:12 PM, Jed Brown <jed at jedbrown.org> wrote:
>> 
>> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>> 
>>> Everything is fine with GAMG, I think; please find the (trimmed) -eps_view attached. The problem is that, correct me if I’m wrong, there is no easy way to redistribute data efficiently from within PETSc when using fieldsplit with an unbalanced number of unknowns per field. For the other three fields, the solvers are still behaving somewhat properly. Now if I’d like to optimize this some more, I’d probably need to switch from a fieldsplit to a MatNest, with submatrices on different communicators, so that I don’t have all processes handling the pressure space. But this is apparently not allowed.
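A minimal sketch of the kind of setup being described, assuming a PCFIELDSPLIT where each field is registered through an IS and every rank therefore owns a piece of every field, including the small pressure field (the function name and the IS arguments below are illustrative, not taken from Pierre's code):

  #include <petscksp.h>

  PetscErrorCode SetupFieldSplit(KSP ksp, IS isVelocity, IS isPressure)
  {
    PetscErrorCode ierr;
    PC             pc;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
    /* each field is described by an IS; if these index sets are spread over
       all ranks, every rank participates in the pressure solve as well */
    ierr = PCFieldSplitSetIS(pc, "velocity", isVelocity);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "pressure", isPressure);CHKERRQ(ierr);
    /* the pressure block can then be configured with the
       -fieldsplit_pressure_ options prefix, e.g. -fieldsplit_pressure_pc_type gamg */
    ierr = PCSetFromOptions(pc);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }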
>> 
>> What if pressure is still on a global communicator, but all of its
>> degrees of freedom are owned by a subset of the processes?  Then MatMult
>> and the like have nothing to send or receive on the processes without
>> any dofs.  Since there are no reductions in PCApply (there are in
>> setup), it should complete immediately for all the processes that don't
>> have any dofs, right?
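A rough sketch of this suggestion: keep the pressure field on the global communicator but hand all of its dofs to the first nOwners ranks; the remaining ranks get an empty IS, so the pressure block extracted from it has zero local rows there and MatMult/VecScatter have nothing to communicate on those ranks. The nOwners cutoff and the contiguous numbering are assumptions for illustration; in a real fieldsplit IS these would be the pressure dofs' positions in the global ordering.

  #include <petscis.h>

  PetscErrorCode CreateConcentratedPressureIS(MPI_Comm comm, PetscInt nPressure,
                                              PetscInt nOwners, IS *isPressure)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank;
    PetscInt       nLocal = 0, first = 0;

    PetscFunctionBeginUser;
    ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
    if (rank < nOwners) {
      /* standard block partition of nPressure dofs over the first nOwners ranks */
      nLocal = nPressure / nOwners + (rank < nPressure % nOwners);
      first  = rank * (nPressure / nOwners) + PetscMin(rank, nPressure % nOwners);
    }
    /* ranks >= nOwners pass nLocal = 0: they stay on the communicator but own
       no pressure rows, so there is nothing for them to send or receive */
    ierr = ISCreateStride(comm, nLocal, first, 1, isPressure);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }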
>
> My PC is PCKSP, with GMRES underneath, so there are reductions on the global communicator in PCApply.

Why PCKSP in the pressure solve?
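For reference, a hedged sketch of the alternative implied by that question, assuming subksp is the pressure split's KSP (e.g. obtained from PCFieldSplitGetSubKSP): replace the PCKSP/GMRES wrapper, whose inner dot products trigger MPI_Allreduce on the block's communicator in every PCApply, with GAMG applied directly as the preconditioner, whose apply involves no reductions (as noted above, those only occur in setup).

  #include <petscksp.h>

  PetscErrorCode UsePlainGAMGOnPressure(KSP subksp)
  {
    PetscErrorCode ierr;
    PC             pc;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(subksp, &pc);CHKERRQ(ierr);
    /* instead of PCKSP (which runs GMRES, hence global dot products, inside
       every PCApply), apply the AMG cycle directly as the preconditioner */
    ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);
    ierr = PCSetFromOptions(pc);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

On the command line the same switch would amount to changing -fieldsplit_<name>_pc_type from ksp to gamg, where <name> is whatever the pressure split is actually called in Pierre's options.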

