[petsc-users] Fieldsplit with LSC for constrained elasticity/poroelasticity?
Tabrez Ali
stali at geology.wisc.edu
Tue Oct 28 11:11:35 CDT 2014
Matt and Jed
But it does appear to work well in practice (as in, I get the correct
solution). Based on what I have tried, LSC (with GAMG on the 00 block) is
the only solver I am able to use on relatively large problems with
non-trivial geometries (e.g., see my original email).
So why is that?
I will experiment with your other suggestions.
Thanks,
Tabrez
On 10/28/2014 09:55 AM, Matthew Knepley wrote:
> On Tue, Oct 28, 2014 at 9:49 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>
> Matt
>
> The system is certainly not singular (checked using -pc_type svd
> -pc_svd_monitor). It also gives the correct solution (compared to
> the analytical solution for a trivial problem).
>
>
> Something is very wrong with the preconditioner then. Move back to the
> full Schur complement with a tight tolerance on the solver for S. This
> should converge in one iteration with a good true residual. Perhaps
> there is something wrong with LSC for your system; it is not guaranteed
> to work for all saddle points.
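>
> A possible option set for that test (a sketch; the rtol is just a
> starting point, and -fieldsplit_1_pc_type none takes LSC out of the
> picture until the full factorization converges in one outer iterate):
>
>   -pc_fieldsplit_schur_fact_type full
>   -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
>   -fieldsplit_0_pc_factor_mat_solver_package mumps
>   -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_rtol 1e-10
>   -fieldsplit_1_pc_type none
>   -ksp_monitor_true_residual -ksp_converged_reason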
>
> Matt
>
>
> Tabrez
>
>
> On 10/28/2014 08:44 AM, Matthew Knepley wrote:
>> On Tue, Oct 28, 2014 at 8:00 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>
>> Matt
>>
>> With MUMPS it is indeed the same, but not faster. E.g., see
>> 1.txt and 2.txt.
>>
>>
>> There is a big problem here. This system is never actually being
>> solved. Look at 1.txt:
>>
>> 0 KSP preconditioned resid norm 9.320055451716e+05 true resid norm 1.755998647494e+02 ||r(i)||/||b|| 1.000000000000e+00
>> 380 KSP preconditioned resid norm 9.034871458425e-05 true resid norm 3.118451181896e+09 ||r(i)||/||b|| 1.775884728811e+07
>>
>> The preconditioned residual did drop 10 orders of magnitude, but
>> the true residual (b - Ax) is still very large. This looks like
>> the system is singular and the preconditioner is masking this.
>> This needs to be figured out first.
>> I would go back to a small serial problem, and make sure it is
>> actually being solved.
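>>
>> One way to run that check (a sketch; ./myapp stands in for your
>> executable, and any direct solver will do):
>>
>>   mpiexec -n 1 ./myapp -ksp_type preonly -pc_type lu \
>>     -pc_factor_mat_solver_package mumps \
>>     -ksp_monitor_true_residual -ksp_converged_reason
>>
>> If the true residual does not drop with a direct solve, the
>> assembled system itself, not the preconditioner, is the problem.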
>>
>> Matt
>>
>> With GAMG, cg takes forever (see 4.txt). For reference, GAMG
>> with preonly is attached again (3.txt).
>>
>> All logs include -fieldsplit_0_ksp_monitor.
>>
>> Also, the true residual shows small oscillations. Is that normal?
>>
>> Tabrez
>>
>>
>> On 10/28/2014 07:08 AM, Matthew Knepley wrote:
>>> On Tue, Oct 28, 2014 at 5:39 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>>
>>> Mark
>>>
>>> When I replace "-fieldsplit_0_ksp_type preonly" with
>>> "-fieldsplit_0_ksp_type cg", it becomes very slow (I had
>>> to kill it).
>>>
>>>
>>> That means something in the setup is wrong. It should be
>>> about the same or faster. Run with -fieldsplit_0_ksp_monitor
>>> so we can see what is happening.
>>>
>>> Matt
>>>
>>> With MUMPS, i.e., with '-fieldsplit_0_pc_type lu
>>> -fieldsplit_0_pc_factor_mat_solver_package mumps
>>> -fieldsplit_0_ksp_type preonly', it works fine but takes
>>> more time, which will be an issue for larger problems. The
>>> output for this run is attached.
>>>
>>> I will work on passing rigid body modes (as Matt
>>> mentioned), but short of that, what is the best set of
>>> options for solving the following problem (i.e., linear
>>> elasticity with constraints):
>>>
>>> | K  cG' | |  u  |   | F |
>>> | G   0  | | l/c | = | d |
>>>
>>> where c is a scaling factor (so that the cG' terms are more
>>> or less of the same order as K)? The constraints are
>>> used to impose slip between surfaces and so on.
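>>>
>>> For reference, eliminating u from this block system (writing
>>> lambda = l/c) gives the Schur complement that fieldsplit has
>>> to deal with:
>>>
>>>   u = inv(K)*(F - c*G'*lambda)
>>>   S = -c*G*inv(K)*G',   S*lambda = d - G*inv(K)*F
>>>
>>> so the accuracy of the 00 (K) solve enters S directly.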
>>>
>>> Tabrez
>>>
>>> On 10/27/2014 01:17 PM, Mark Adams wrote:
>>>> The null space for GAMG is not critical but useful for
>>>> elasticity. If you in fact have an indefinite operator
>>>> (e.g., not "pinned") then you need to use an iterative
>>>> coarse grid solver. You are
>>>> using '-fieldsplit_0_pc_type gamg
>>>> -fieldsplit_0_ksp_type preonly', and you have a hard
>>>> elasticity problem, so you are going to want to start
>>>> with a stronger solver. Use cg instead of preonly. As
>>>> Matt said, start with MUMPS, then go to CG/GAMG, and then
>>>> you can see how far you can cut the _0_ solver down.
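>>>>
>>>> Concretely, that progression might look like (a sketch; the
>>>> tolerances are only starting guesses to tune):
>>>>
>>>>   # step 1: direct 00 solve, to get the Schur side right
>>>>   -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
>>>>   -fieldsplit_0_pc_factor_mat_solver_package mumps
>>>>
>>>>   # step 2: swap the direct solve for CG/GAMG
>>>>   -fieldsplit_0_ksp_type cg -fieldsplit_0_pc_type gamg
>>>>   -fieldsplit_0_ksp_rtol 1e-8
>>>>
>>>>   # step 3: loosen the 00 solve until the outer iterates suffer
>>>>   -fieldsplit_0_ksp_rtol 1e-4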
>>>> Mark
>>>>
>>>> On Thu, Oct 23, 2014 at 11:51 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>
>>>> On Thu, Oct 23, 2014 at 10:48 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>>>
>>>> Matt
>>>>
>>>> On 10/23/2014 09:54 AM, Matthew Knepley wrote:
>>>>> On Thu, Oct 23, 2014 at 9:27 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>>>>
>>>>> Matt
>>>>>
>>>>> Sorry about that (I always forget it). The
>>>>> output for the smallest problem is now
>>>>> attached (see log.txt). I am also
>>>>> attaching a comparison of the results
>>>>> obtained using FS/LSC and the direct
>>>>> solver (MUMPS), again for the smallest
>>>>> problem. The difference, as you can see,
>>>>> is insignificant, O(1E-6).
>>>>>
>>>>>
>>>>> 1) How do you use MUMPS if you have a saddle point?
>>>> I simply used -pc_type lu
>>>> -pc_factor_mat_solver_package mumps.
>>>>
>>>>>
>>>>> 2) You can see from the output that something
>>>>> is seriously wrong with the preconditioner. It
>>>>> looks like it has a null space.
>>>>> Did you add the elastic null modes to
>>>>> GAMG? Without this, it is not going to work.
>>>>> We have helper functions for this:
>>>>>
>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMPlexCreateRigidBody.html
>>>>>
>>>>> you could just copy that code. And then use
>>>>>
>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetNearNullSpace.html
>>>>>
>>>>> I don't see it in the output, so I think this
>>>>> is your problem.
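>>>>>
>>>>> In code that is roughly (a sketch; "coords" is a Vec of
>>>>> nodal coordinates and "K" the assembled 00 block, both
>>>>> placeholder names):
>>>>>
>>>>>   MatNullSpace nullsp;
>>>>>   /* build the rigid body modes (translations plus
>>>>>      rotations) from the nodal coordinates */
>>>>>   ierr = MatNullSpaceCreateRigidBody(coords, &nullsp);CHKERRQ(ierr);
>>>>>   /* attach them as the near-null space GAMG uses to
>>>>>      build its coarse spaces */
>>>>>   ierr = MatSetNearNullSpace(K, nullsp);CHKERRQ(ierr);
>>>>>   ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);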
>>>>>
>>>>> In order to test, I would first use MUMPS as
>>>>> the A00 solver and get the Schur stuff worked
>>>>> out. Then I would
>>>>> replace MUMPS with GAMG and tune it until I
>>>>> get back my original convergence.
>>>> I will try this with
>>>> MatNullSpaceCreateRigidBody. Btw, does it matter
>>>> if some nodes are pinned?
>>>>
>>>>
>>>> No, these are null modes of the operator, not of the
>>>> particular problem.
>>>>
>>>> Matt
>>>>
>>>> Tabrez
>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>> Also, I did pass 'upper' and 'full' to
>>>>> '-pc_fieldsplit_schur_factorization_type',
>>>>> but the iteration count doesn't improve
>>>>> (in fact, it increases slightly). The
>>>>> attached log is with 'upper'.
>>>>>
>>>>> Regards,
>>>>>
>>>>> Tabrez
>>>>>
>>>>> On 10/23/2014 07:46 AM, Matthew Knepley wrote:
>>>>>> On Thu, Oct 23, 2014 at 7:20 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>>>>>
>>>>>> Hello
>>>>>>
>>>>>> I am using the following options
>>>>>> (below) for solving linear
>>>>>> elasticity/poroelasticity problems
>>>>>> involving slip between two surfaces,
>>>>>> with non-trivial geometries,
>>>>>> i.e., elements with high aspect
>>>>>> ratios, large contrasts in material
>>>>>> properties, etc. The constraints are
>>>>>> imposed using Lagrange multipliers.
>>>>>>
>>>>>> A picture (showing displacement
>>>>>> magnitude) is attached. The boundary
>>>>>> nodes, i.e., the base and the four
>>>>>> sides, are pinned.
>>>>>>
>>>>>> The following options appear to work
>>>>>> well for the saddle point problem:
>>>>>>
>>>>>> -pc_type fieldsplit
>>>>>> -pc_fieldsplit_type schur
>>>>>> -pc_fieldsplit_detect_saddle_point
>>>>>> -fieldsplit_0_pc_type gamg
>>>>>> -fieldsplit_0_ksp_type preonly
>>>>>> -fieldsplit_1_pc_type lsc
>>>>>> -fieldsplit_1_ksp_type preonly
>>>>>> -pc_fieldsplit_schur_fact_type lower
>>>>>> -ksp_monitor
>>>>>>
>>>>>> However, the number of iterations
>>>>>> keeps increasing with problem size
>>>>>> (see attached plot), e.g.,
>>>>>>
>>>>>> 120K tets: *507* iterations (KSP residual norm 8.827362494659e-05) in 17 secs on 3 cores
>>>>>> 1 million tets: *1374* iterations (KSP residual norm 7.164704416296e-05) in 117 secs on 20 cores
>>>>>> 8 million tets: *2495* iterations (KSP residual norm 9.101247550026e-05) in 225 secs on 160 cores
>>>>>>
>>>>>> So what other options should I try to
>>>>>> improve solver performance? Any
>>>>>> tips/insights would be appreciated as
>>>>>> preconditioning is black magic to me.
>>>>>>
>>>>>>
>>>>>> For reports, always run with
>>>>>>
>>>>>> -ksp_view -ksp_monitor_true_residual
>>>>>> -ksp_converged_reason
>>>>>>
>>>>>> so that we can see exactly what you used.
>>>>>>
>>>>>> I believe the default is a diagonal
>>>>>> factorization. Since your outer iterates
>>>>>> are increasing, I would strengthen this
>>>>>> to either upper or full
>>>>>>
>>>>>> -pc_fieldsplit_schur_factorization_type <upper,full>
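>>>>>>
>>>>>> Merged with your current options that would be, e.g. (a
>>>>>> sketch; only the factorization type is new):
>>>>>>
>>>>>>   -pc_type fieldsplit -pc_fieldsplit_type schur
>>>>>>   -pc_fieldsplit_detect_saddle_point
>>>>>>   -pc_fieldsplit_schur_factorization_type full
>>>>>>   -fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_type preonly
>>>>>>   -fieldsplit_1_pc_type lsc -fieldsplit_1_ksp_type preonly
>>>>>>   -ksp_view -ksp_monitor_true_residual -ksp_converged_reason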
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>> Thanks in advance.
>>>>>>
>>>>>> Tabrez
>>>>>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener