[petsc-dev] Bad scaling of GAMG in FieldSplit
Mark Adams
mfadams at lbl.gov
Fri Jul 27 10:21:35 CDT 2018
>
> Everything is fine with GAMG I think; please find the (trimmed) -eps_view
> attached.
>
This looks fine. The eigenvalue estimates are pretty low, but I don't know
what the true eigenvalues actually are.
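
(Not tested on your setup, but if you want to poke at those estimates: the
solver view already shows the Chebyshev bounds being used, and you can
override the transform applied to the estimated eigenvalues with something
like

   -fieldsplit_1_mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05

where the four values a,b,c,d turn the estimated emin/emax into the bounds
[a*emin + b*emax, c*emin + d*emax]. I am guessing "fieldsplit_1_" for your
pressure split prefix, so adjust that to whatever you actually use.)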
> The problem is that, correct me if I’m wrong, there is no easy way to
> redistribute data efficiently from within PETSc when using fieldsplit with
> an unbalanced number of unknowns per field.
>
Note that as you scale up, the coarse grids become more important with
respect to complexity, so the process reduction and potential repartitioning
will become more noticeable. At extreme scale you can spend the majority of
the time in the coarse grids.
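
If/when the coarse grids do start to dominate, the first knobs I would look
at are along these lines (again guessing "fieldsplit_1_" for the GAMG split
prefix, and the numbers are just placeholders):

   -fieldsplit_1_pc_gamg_process_eq_limit 200
   -fieldsplit_1_pc_gamg_coarse_eq_limit 1000
   -fieldsplit_1_pc_gamg_repartition true

process_eq_limit controls roughly how many equations per process GAMG keeps
before it reduces the number of active processes on the coarse levels,
coarse_eq_limit caps the size of the coarsest grid, and repartition
rebalances the coarse grids (more setup cost, better load balance).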
> For the other three fields, the solvers are still behaving more or less
> properly. Now if I’d like to optimize this some more, I’d probably need to
> switch from a fieldsplit to a MatNest, with submatrices on different
> communicators, so that I don’t have all processes handling the pressure
> space. But this is apparently not allowed.
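
One thing you could try without leaving fieldsplit is to wrap the pressure
solve in a PCTELESCOPE, so that only a subset of the processes participates
in the GAMG solve for that block. Untested, and I am guessing the prefixes
(split name, and "telescope_" for the inner solver) as well as the reduction
factor, but something along the lines of

   -fieldsplit_1_pc_type telescope
   -fieldsplit_1_pc_telescope_reduction_factor 8
   -fieldsplit_1_telescope_pc_type gamg

The ranks outside the reduced communicator do sit idle during that
sub-solve, which is the telescoping trade-off you discuss below, but it
keeps every process from having to touch the tiny pressure coarse grids.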
>
> Thanks,
> Pierre
>
>>> 2) have the sub_0_ and sub_1_ solvers work on two different
>>> nonoverlapping communicators of size PETSC_COMM_WORLD/2, do the solves
>>> concurrently, and then sum the solutions (only worth doing because of
>>> -pc_composite_type additive). I have no idea if this is easily doable
>>> with PETSc command line arguments
>>> >
>>> > 1) is the more flexible approach, as you have better control over the
>>> > system sizes after 'telescoping'.
>>>
>>> Right, but the advantage of 2) is that I wouldn't have one half or more
>>> of the processes idling, and I could overlap the solves of both sub-PCs
>>> in the PCCOMPOSITE.
>>>
>>> I’m attaching the -log_view for both runs (I trimmed some options).
>>>
>>> Thanks for your help,
>>> Pierre
>>>
>>>
>>> > Best regards,
>>> > Karli