[petsc-users] Weird behaviour of PCGAMG in coupled poroelasticity
Felipe Giacomelli
fe.wallner at gmail.com
Fri Nov 29 18:14:36 CST 2019
Hello,
I'm trying to solve Biot's poroelasticity (Cryer's sphere problem) through
a fully coupled scheme. Thus, the solution of a single linear system yields
both displacement and pressure fields,
| K     L     | | u |   | b_u |
| Q   A + H   | | p | = | b_p |.
The linear system is non-symmetric, since the discrete equations are obtained
with the Element-based Finite Volume Method (EbFVM). An unstructured
tetrahedral grid with about 10,000 nodal points is used (neither too coarse
nor too refined). Therefore, GMRES and GAMG are employed to solve it.
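For clarity, here is a minimal sketch of what such a GMRES + GAMG setup looks
like in PETSc (illustrative only, not the actual program code; A and b stand
for the already assembled coupled matrix and right-hand side):

#include <petscksp.h>

/* Minimal sketch of the GMRES + GAMG setup described above (illustrative
   only): A and b are the assembled coupled matrix and right-hand side. */
PetscErrorCode SolveCoupled(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES); CHKERRQ(ierr);   /* non-symmetric system  */
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);       /* algebraic multigrid   */
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);      /* allow runtime options */
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  return 0;
}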
Furthermore, the program is parallelised through a Domain Decomposition
Method, so each processor works only on its own subdomain.
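A rough sketch of what such a row-wise distribution looks like with
MatCreateAIJ (placeholder names, not the actual code):

#include <petscmat.h>

/* Illustrative sketch of the row distribution: each rank owns the rows
   associated with the nodes of its own subdomain. nLocalRows, d_nnz and
   o_nnz are placeholder names, not identifiers from the actual program. */
PetscErrorCode CreateCoupledMatrix(PetscInt nLocalRows, const PetscInt d_nnz[],
                                   const PetscInt o_nnz[], Mat *A)
{
  PetscErrorCode ierr;

  ierr = MatCreateAIJ(PETSC_COMM_WORLD,
                      nLocalRows, nLocalRows,            /* local rows/columns    */
                      PETSC_DETERMINE, PETSC_DETERMINE,  /* global sizes (summed) */
                      0, d_nnz, 0, o_nnz,                /* per-row preallocation */
                      A); CHKERRQ(ierr);
  return 0;
}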
So far, so good. For a given set of poroelastic properties (which are
constant throughout time and space), the speedup increases as more
processors are utilised:
coupling intensity: 7.51e-01
procs   solve time [s]
1       314.23
2       171.65
3       143.21
4       149.26   (> 143.21, but ok)
However, after making the problem MORE coupled (different poroelastic
properties), a strange behaviour is observed:
coupling intensity: 2.29e+01
procs   solve time [s]
1       28909.35
2       192.39
3       181.29
4       14463.63
Recall that GMRES and GAMG are used in every case. With 1 processor, KSP takes
about 4300 iterations to converge, whereas with 2 processors it takes only
around 30 iterations, which explains the large difference in solution times.
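For reference, a small sketch of how the iteration count and convergence
reason can be queried from the KSP (illustrative only):

#include <petscksp.h>

/* Illustrative helper: query how many iterations the KSP took and why it
   stopped, e.g. to compare the 1- and 2-processor runs above. */
PetscErrorCode ReportConvergence(KSP ksp)
{
  PetscInt           its;
  KSPConvergedReason reason;
  PetscErrorCode     ierr;

  ierr = KSPGetIterationNumber(ksp, &its); CHKERRQ(ierr);
  ierr = KSPGetConvergedReason(ksp, &reason); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "KSP stopped: %s after %d iterations\n",
                     KSPConvergedReasons[reason], (int)its); CHKERRQ(ierr);
  return 0;
}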
When the coupling is increased even MORE, everything goes as expected again:
coupling intensity: 4.63e+01
procs   solve time [s]
1       229.26
2       146.04
3       121.49
4       107.80
Because of this, I ask:
* What may be the source of this behaviour? Can it be predicted?
* How can I remedy this situation?
Lastly, are there better solver/preconditioner (KSP/PC) choices for coupled
poroelasticity?
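For reference, one possible alternative would be a block preconditioner via
PCFIELDSPLIT that separates the displacement and pressure unknowns; a rough
sketch (isU and isP are illustrative index sets, not from the actual program):

#include <petscksp.h>

/* Rough sketch of a fieldsplit preconditioner that separates the displacement
   (u) and pressure (p) unknowns. isU and isP are illustrative index sets
   marking the rows/columns of each field in the coupled matrix. */
PetscErrorCode SetupFieldSplit(KSP ksp, IS isU, IS isP)
{
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "u", isU); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "p", isP); CHKERRQ(ierr);
  /* e.g. -pc_fieldsplit_type schur and -fieldsplit_u_pc_type gamg can then
     be chosen at runtime */
  ierr = PCSetFromOptions(pc); CHKERRQ(ierr);
  return 0;
}

With such a split, a Schur-complement variant can be selected at runtime, and
GAMG can still be applied to the displacement block.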
Thank you,
Felipe