[petsc-users] Strong and weak scaling, solver robustness [was Re: reference]
Jed Brown
jed at 59A2.org
Thu Oct 3 17:18:27 CDT 2013
Cc'ing petsc-users since this may be of interest there.
Alexander Grayver <agrayver at gfz-potsdam.de> writes:
> Jed,
>
> thanks.
>
> I actually realized that I don't quite understand the definition of
> scalable as "if its performance is independent of resolution and
> parallelism". Could you shortly comment on what is resolution here?
> I thought it is more conventional to define scalability in terms of
> strong and weak graphs?
Strong scaling is well-defined in the sense that you fix the problem and
ask how fast you can solve it by adding parallelism. It has two parts:
the implementation quality (which can include some math related to
structuring a given communication pattern) and algorithmic adaptations
to expose increasingly fine-grained parallelism. The methods
appropriate on a serial CPU are quite different from those appropriate
on a GPU or a supercomputer, so it would be unfair to compare the same
method in both places. We can ask that the algorithmic changes needed
to expose fine-grained parallelism in a fixed target problem (e.g.,
moving from a workstation to a supercomputer) not affect convergence too
much.
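As a point of reference, strong scaling is usually quantified with the
textbook speedup and parallel efficiency of a fixed problem (standard
definitions, not anything specific to the above):

  S(p) = T(1) / T(p),    E_strong(p) = S(p) / p = T(1) / (p * T(p))

where T(p) is the time to solution on p processes with the total problem
size held fixed. If the algorithm has to change as p grows, report
iteration counts alongside the times so that algorithmic and
implementation effects can be told apart.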
Weak scaling is trickier to define. Suppose we are solving a linear
PDE. If you take a fixed geometry and coefficient field and then refine
the hell out of the mesh, you are creating much easier (smoother)
problems for which the original discretization no longer makes sense.
That is, for a rough coefficient field you would probably use a
low-order compatible discretization, but as you go to high accuracy on
very smooth solutions, you _should_ have switched to a high-order
method. Similarly, multigrid becomes much easier because the coarse
levels no longer have to homogenize: the solution is smooth/resolved
throughout the grid hierarchy.
So instead, the only fair thing to do with weak scaling is to solve
problems people would actually care about at the higher resolution.
That either means looking at bigger domains filled with heterogeneity
at the same scale, or adding further fine structure. That fine
structure must not have "easy" structure such as scale separation, lest
you see anomalous convergence behavior. If the target problem is
nonlinear, you would like the nonlinearity to have the same stiffness
at all scales. Nonlinearities are rarely scale-free.
Consequently, for many applications, it's not possible to do a "fair"
weak scaling. You can solve a collection of problems of different
sizes, but you can't hope for the convergence behavior to be directly
comparable.
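For completeness, the conventional weak-scaling efficiency fixes the
work per process (N/p constant, N the global problem size) and compares
times:

  E_weak(p) = T(1) / T(p),    with N proportional to p

but by the argument above that number is only meaningful if the refined
problems are ones you would actually want to solve, and iteration
counts should be reported alongside the times, since growth in
iterations (algorithmic degradation) and communication overhead both
show up as lost efficiency.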
You can test a solver for robustness in the sense of taking a fixed
problem size and watching the convergence behavior as you manipulate
the material coefficients and parameters. Even with a "scalable"
method, you will almost never converge at a rate truly independent of
those parameters.
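As a concrete (hypothetical) illustration of such a robustness study,
the sketch below uses petsc4py to assemble a 1-D variable-coefficient
diffusion problem and records the CG/GAMG iteration count as the
coefficient contrast grows. The problem, tolerances, and solver choices
are illustrative assumptions, not something prescribed above.

import sys
import numpy as np
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

def assemble_1d_diffusion(n, contrast):
    # Tridiagonal finite-difference matrix for -(k u')' = f on (0,1) with
    # homogeneous Dirichlet BCs; k jumps from 1 to `contrast` at x = 0.5.
    h = 1.0 / (n + 1)
    xi = (np.arange(n + 1) + 0.5) * h          # interface coordinates
    k = np.where(xi < 0.5, 1.0, contrast)      # piecewise-constant coefficient
    A = PETSc.Mat().createAIJ([n, n], nnz=3)
    for i in range(n):
        kl, kr = k[i], k[i + 1]
        A.setValue(i, i, (kl + kr) / h**2)
        if i > 0:
            A.setValue(i, i - 1, -kl / h**2)
        if i < n - 1:
            A.setValue(i, i + 1, -kr / h**2)
    A.assemble()
    return A

n = 1000
for contrast in [1.0, 1e2, 1e4, 1e6]:
    A = assemble_1d_diffusion(n, contrast)
    b = A.createVecLeft(); b.set(1.0)
    x = A.createVecRight()
    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType('cg')
    ksp.getPC().setType('gamg')
    ksp.setTolerances(rtol=1e-8)
    ksp.setFromOptions()
    ksp.solve(b, x)
    print('contrast %.0e -> %d iterations' % (contrast, ksp.getIterationNumber()))

Keeping the problem size fixed isolates the dependence of the
convergence behavior on the coefficient contrast from any parallel
effects; repeating the same sweep at several resolutions shows whether
that dependence worsens under refinement.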
> On 02.10.2013 13:50, Jed Brown wrote:
>> Alexander Grayver <agrayver at gfz-potsdam.de> writes:
>>
>>> Hi Jed,
>>>
>>> I liked your definition of scalable, robust and efficient solvers in CIG
>>> Webinar.
>>> Have you got any paper or proceedings where you talk about these issues
>>> so that I could cite it?
>> Hmm, I have a conference position paper about new research in support of
>> those ideals.
>>
>> @inproceedings{brown2013dimension,
>> title={Vectorization, communication aggregation, and reuse in stochastic and temporal dimensions},
>> author={Jed Brown},
>> booktitle={Exascale Mathematics Workshop, Aug 21--22, Washington, DC},
>> year={2013},
>> organization={DOE Office of Advanced Scientific Computing Research}
>> }
>>
>> Bill Gropp has some relevant software papers.
>>
>> @article{gropp2005issues,
>> title={{Issues in accurate and reliable use of parallel computing in numerical programs}},
>> author={Gropp, W.D.},
>> journal={Accuracy and Reliability in Scientific Computing},
>> pages={253--263},
>> year={2005}
>> }
>>
>> @inproceedings{gropp1999exploiting,
>> title={{Exploiting existing software in libraries: Successes, Failures, and Reasons Why}},
>> author={Gropp, W.D.},
>> booktitle={Proceedings of the SIAM Workshop on Object Oriented Methods for Interoperable Scientific and Engineering Computing},
>> pages={21--29},
>> isbn={0898714451},
>> year={1999},
>> organization={Soc for Industrial \& Applied Math},
>> publisher={SIAM}
>> }
>>
>>
>
>
> --
> Regards,
> Alexander