[petsc-dev] Bad scaling of GAMG in FieldSplit

Jed Brown jed at jedbrown.org
Fri Jul 27 10:48:36 CDT 2018


Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:

>> On 27 Jul 2018, at 5:33 PM, Jed Brown <jed at jedbrown.org> wrote:
>> 
>> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>> 
>>>> On 27 Jul 2018, at 5:12 PM, Jed Brown <jed at jedbrown.org> wrote:
>>>> 
>>>> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>>>> 
>>>>> Everything is fine with GAMG I think, please find the (trimmed) -eps_view attached. The problem is that, correct me if I’m wrong, there is no easy way to redistribute data efficiently from within PETSc when using fieldsplit with an unbalanced number of unknowns per field. For the other three fields, the solvers are still behaving somewhat properly. Now if I’d like to optimize this some more, I’d probably need to switch from a fieldsplit to a MatNest, with submatrices on different communicators, so that not all processes handle the pressure space. But this is apparently not allowed.
>>>> 
>>>> What if pressure is still on a global communicator, but all the degrees
>>>> of freedom are in a subset?  Then MatMult and the like have nothing to
>>>> send or receive on the processes without any dofs.  Since there are no
>>>> reductions in PCApply (there are in setup), it should complete
>>>> immediately for all the processes that don't have any dofs, right?
>>> 
>>> My PC is PCKSP, with GMRES underneath, so there are reductions on the global communicator in PCApply.
>> 
>> Why PCKSP in the pressure solve?
>
> I need to solve (on the pressure space) Ax = b with A^{-1} = B^{-1} + C^{-1}.
> I use an additive PCCOMPOSITE with two PCKSP. B is a mass matrix (PCJACOBI, no problem), C is a shifted Laplacian (PCGAMG, problem…).

You need C^{-1} to be applied accurately, not just a V-cycle?
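
For concreteness, I read that setup roughly as follows (a sketch, not
your actual code: the function name SetupPressurePC and the way the
mass matrix B and shifted Laplacian C are attached to the inner solves
are my assumptions):

  #include <petscksp.h>

  /* Sketch: additive composite A^{-1} ~ B^{-1} + C^{-1} on the
     pressure block; pc is the pressure field's PC. */
  PetscErrorCode SetupPressurePC(PC pc,Mat B,Mat C)
  {
    PC             sub,innerpc;
    KSP            inner;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = PCSetType(pc,PCCOMPOSITE);CHKERRQ(ierr);
    ierr = PCCompositeSetType(pc,PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
    ierr = PCCompositeAddPC(pc,PCKSP);CHKERRQ(ierr); /* wraps B^{-1} */
    ierr = PCCompositeAddPC(pc,PCKSP);CHKERRQ(ierr); /* wraps C^{-1} */

    /* first PCKSP: GMRES + Jacobi on the mass matrix */
    ierr = PCCompositeGetPC(pc,0,&sub);CHKERRQ(ierr);
    ierr = PCSetOperators(sub,B,B);CHKERRQ(ierr);
    ierr = PCKSPGetKSP(sub,&inner);CHKERRQ(ierr);
    ierr = KSPSetType(inner,KSPGMRES);CHKERRQ(ierr);
    ierr = KSPGetPC(inner,&innerpc);CHKERRQ(ierr);
    ierr = PCSetType(innerpc,PCJACOBI);CHKERRQ(ierr);

    /* second PCKSP: GMRES + GAMG on the shifted Laplacian */
    ierr = PCCompositeGetPC(pc,1,&sub);CHKERRQ(ierr);
    ierr = PCSetOperators(sub,C,C);CHKERRQ(ierr);
    ierr = PCKSPGetKSP(sub,&inner);CHKERRQ(ierr);
    ierr = KSPSetType(inner,KSPGMRES);CHKERRQ(ierr);
    ierr = KSPGetPC(inner,&innerpc);CHKERRQ(ierr);
    ierr = PCSetType(innerpc,PCGAMG);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Every GMRES iteration inside either PCKSP does reductions on the
communicator the pressure block lives on, which is the issue you
describe above.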

We could add support for subcommunicators when dof partitions are
possible, but it would make profiling more difficult to interpret and
may have unexpected consequences if not done in a tightly controlled
way.
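
The "global communicator, dofs on a subset" layout suggested earlier
is already expressible today: give zero local rows to the ranks
outside the subset, e.g. (a sketch; nsub and nlocal are placeholder
names):

  #include <petscmat.h>

  /* Sketch: the pressure matrix lives on the global communicator,
     but only the first nsub ranks own rows.  On the other ranks the
     local size is 0, so MatMult has nothing to send or receive. */
  PetscErrorCode CreateSubsetMat(MPI_Comm comm,PetscInt nsub,PetscInt nlocal,Mat *A)
  {
    PetscMPIInt    rank;
    PetscInt       m;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
    m    = (rank < nsub) ? nlocal : 0;  /* zero local rows off the subset */
    ierr = MatCreate(comm,A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A,m,m,PETSC_DETERMINE,PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = MatSetFromOptions(*A);CHKERRQ(ierr);
    ierr = MatSetUp(*A);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }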


Barry, should we change PCCompositeAddPC to use a KSP (or otherwise) so
that KSPSolve is called like all other composed solvers?

