[petsc-dev] Bad scaling of GAMG in FieldSplit

Pierre Jolivet pierre.jolivet at enseeiht.fr
Fri Jul 27 11:05:17 CDT 2018



> On 27 Jul 2018, at 5:48 PM, Jed Brown <jed at jedbrown.org> wrote:
> 
> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
> 
>>> On 27 Jul 2018, at 5:33 PM, Jed Brown <jed at jedbrown.org> wrote:
>>> 
>>> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>>> 
>>>>> On 27 Jul 2018, at 5:12 PM, Jed Brown <jed at jedbrown.org> wrote:
>>>>> 
>>>>> Pierre Jolivet <pierre.jolivet at enseeiht.fr> writes:
>>>>> 
>>>>>> Everything is fine with GAMG I think, please find the (trimmed) -eps_view attached. The problem is that, correct me if I’m wrong, there is no easy way to redistribute data efficiently from within PETSc when using fieldsplit with an unbalanced number of unknowns per field. For the other three fields, the solvers are still behaving reasonably well. Now, if I’d like to optimize this further, I’d probably need to switch from a fieldsplit to a MatNest, with submatrices on different communicators, so that not all processes handle the pressure space. But this is apparently not allowed.
>>>>> 
>>>>> What if pressure is still on a global communicator, but all the degrees
>>>>> of freedom are in a subset?  Then MatMult and the like have nothing to
>>>>> send or receive on the processes without any dofs.  Since there are no
>>>>> reductions in PCApply (there are in setup), it should complete
>>>>> immediately for all the processes that don't have any dofs, right?
>>>> 
>>>> My PC is PCKSP, with GMRES underneath, so there are reductions on the global communicator in PCApply.
>>> 
>>> Why PCKSP in the pressure solve?
>> 
>> I need to solve (on the pressure space) Ax = b with A^-1 = B^-1 + C^-1.
>> I do an additive PCCOMPOSITE, with two PCKSP. B is a mass matrix (PCJACOBI, no problem), C is a shifted Laplacian (PCGAMG, problem…).
> 
> You need C^{-1} to be applied accurately, not just a V-cycle?

Right, an rtol of 10^-3 for both solves; otherwise the EPS needs more iterations to converge.
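
For reference, a minimal sketch (in C, error checking omitted) of the setup described above, assuming pc is the PC attached to the pressure block and that the mass matrix B and the shifted Laplacian C are attached to the inner solves elsewhere (e.g. via KSPSetOperators):

/* additive composite: A^-1 ~ B^-1 + C^-1, each applied by an inner
   GMRES solve at rtol 1e-3 (as in the numbers above) */
PC  sub;      /* one of the two PCKSP sub-preconditioners          */
KSP inner;    /* GMRES solve hidden inside each PCKSP              */
PC  innerpc;  /* Jacobi (mass matrix) or GAMG (shifted Laplacian)  */

PCSetType(pc, PCCOMPOSITE);
PCCompositeSetType(pc, PC_COMPOSITE_ADDITIVE);
PCCompositeAddPC(pc, PCKSP);   /* for B^-1, the mass matrix        */
PCCompositeAddPC(pc, PCKSP);   /* for C^-1, the shifted Laplacian  */

/* inner solve for B: GMRES + Jacobi */
PCCompositeGetPC(pc, 0, &sub);
PCKSPGetKSP(sub, &inner);
KSPSetType(inner, KSPGMRES);
KSPSetTolerances(inner, 1e-3, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
KSPGetPC(inner, &innerpc);
PCSetType(innerpc, PCJACOBI);

/* inner solve for C: GMRES + GAMG */
PCCompositeGetPC(pc, 1, &sub);
PCKSPGetKSP(sub, &inner);
KSPSetType(inner, KSPGMRES);
KSPSetTolerances(inner, 1e-3, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
KSPGetPC(inner, &innerpc);
PCSetType(innerpc, PCGAMG);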

> We could add support for subcommunicators when dof partitions are
> possible, but it makes profiling more difficult to interpret and may have
> unexpected consequences if not done in a tightly controlled way.
> 
> 
> Barry, should we change PCCompositeAddPC to use a KSP (or otherwise) so
> that KSPSolve is called like all other composed solvers?

(I’m not sure my usage of PCComposite is the most common, so I’m perfectly fine with wrapping everything in PCKSPs.)
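
For completeness, a rough sketch of the layout Jed suggested earlier in the thread: keep the pressure matrix on the global communicator but give most ranks zero local rows, so MatMult has nothing to send or receive there. The names nowners and N are made up purely for illustration, and error checking is omitted:

/* only the first nowners ranks own pressure dofs; all other ranks get
   zero local rows/columns, so MatMult on P involves no communication there */
Mat         P;
PetscMPIInt rank;
PetscInt    nowners = 16;     /* hypothetical: ranks that actually own pressure dofs */
PetscInt    N       = 100000; /* hypothetical global number of pressure unknowns     */
PetscInt    nlocal;

MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
/* spread the N unknowns over the first nowners ranks, zero rows elsewhere */
nlocal = (rank < nowners) ? N / nowners + ((rank < N % nowners) ? 1 : 0) : 0;

MatCreate(PETSC_COMM_WORLD, &P);
MatSetSizes(P, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);
MatSetType(P, MATAIJ);
MatSetUp(P);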


