[petsc-users] Field Split PC for Fully-Coupled 3d stationary incompressible Navier-Stokes Solution Algorithm

Dave May dave.mayhem23 at gmail.com
Mon Feb 2 04:49:31 CST 2015


> I used
>
>         CALL PCFieldSplitSetIS(PRECON,PETSC_NULL_CHARACTER,ISU,IERR)
>         ...
>
>
Here are two suggestions to play with:

[1] When using the same object for the operator and preconditioner,
you will need to use the fieldsplit factorization type = schur.
This requires two splits (U,p).
Thus, your basic field split configuration will look like

-coupledsolve_pc_type fieldsplit
-coupledsolve_pc_fieldsplit_0_fields 0,1,2
-coupledsolve_pc_fieldsplit_1_fields 3
-coupledsolve_pc_fieldsplit_type SCHUR

Petsc has some support to generate approximate pressure schur complements
for you, but these will not be as good as ones specifically constructed
for your particular discretization.
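As a rough sketch of what that could look like (assuming a reasonably
recent petsc and that the two splits keep their default names "0" and "1";
treat the particular choices below as placeholders to experiment with),
you could pick the Schur factorization, how the Schur complement is
preconditioned, and the inner solvers via

-coupledsolve_pc_fieldsplit_schur_fact_type upper
-coupledsolve_pc_fieldsplit_schur_precondition selfp
-coupledsolve_fieldsplit_0_ksp_type gmres
-coupledsolve_fieldsplit_0_pc_type ilu
-coupledsolve_fieldsplit_1_ksp_type gmres
-coupledsolve_fieldsplit_1_pc_type jacobi

(If I recall correctly, selfp assembles A_pu inv(diag(A_uu)) A_up and uses
that as the preconditioner for the Schur complement solve.)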

If you want to perform solves on your scalar sub-problems (e.g. you have a
nice AMG implementation for each scalar block), you will need to split the
UU block again (nested fieldsplit), e.g. something like the sketch below.
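Purely as a sketch (the exact option prefixes depend on how your ISs are
defined and named, so take this as a starting point rather than something
to copy verbatim), a nested split of the velocity block could look like

-coupledsolve_fieldsplit_0_pc_type fieldsplit
-coupledsolve_fieldsplit_0_pc_fieldsplit_block_size 3
-coupledsolve_fieldsplit_0_pc_fieldsplit_type additive
-coupledsolve_fieldsplit_0_fieldsplit_0_pc_type gamg
-coupledsolve_fieldsplit_0_fieldsplit_1_pc_type gamg
-coupledsolve_fieldsplit_0_fieldsplit_2_pc_type gamg

assuming the submatrix extracted for split 0 keeps the interlaced u,v,w
layout with stride 3, so the inner fieldsplit can define its splits from
the block size.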

[2] If you assembled a different operator for your preconditioner in which
the B_pp slot contained a pressure schur complement approximation, you
could use the simpler and likely more robust option (assuming you know of a
decent schur complement approximation for your discretisation and physical
problem)

-coupledsolve_pc_type fieldsplit
-coupledsolve_pc_fieldsplit_type MULTIPLICATIVE

which includes your U-p coupling, or just

-coupledsolve_pc_fieldsplit_type ADDITIVE

which would define the following preconditioner:
inv(B) = diag( inv(B_uu) , inv(B_vv) , inv(B_ww) , inv(B_pp) )

Option 2 would be better, as your operator doesn't have any u_i-u_j (i != j)
coupling, and you could use efficient AMG implementations for the scalar
problems associated with the u-u, v-v, w-w coupled terms without having to
split again (see the sketch below).
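As a sketch (assuming you registered the four splits via PCFieldSplitSetIS
in the order u,v,w,p so they get the default names 0-3, and that petsc's
built-in GAMG is an acceptable AMG for the velocity blocks), this could
look like

-coupledsolve_pc_type fieldsplit
-coupledsolve_pc_fieldsplit_type additive
-coupledsolve_fieldsplit_0_pc_type gamg
-coupledsolve_fieldsplit_1_pc_type gamg
-coupledsolve_fieldsplit_2_pc_type gamg
-coupledsolve_fieldsplit_3_pc_type jacobi

with the last choice standing in for whatever works best on your B_pp
approximation.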

Also, fieldsplit will not be aware of the fact that the A_uu, A_vv, A_ww
blocks are all identical - thus it cannot do anything "smart" in order to
save memory. Accordingly, the KSP defined for each u,v,w split will be a
unique KSP object. If your A_ii are all identical and you want to save
memory, you could use MatNest, but as Matt will always yell out, "MatNest is
ONLY a memory optimization and should ONLY be used once all solver
exploration/testing is performed".
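Just to illustrate what that would look like (shown in C for brevity; all
matrix and IS names here are hypothetical and assumed to have been
created/assembled elsewhere), the same velocity block can be reused for the
three diagonal velocity slots of a MatNest:

/* sketch: build a 4x4 MatNest that reuses one velocity block three times */
#include <petscmat.h>

PetscErrorCode BuildNest(IS isu, IS isv, IS isw, IS isp,
                         Mat Auu, Mat Aup, Mat Avp, Mat Awp,
                         Mat Apu, Mat Apv, Mat Apw, Mat App,
                         Mat *A)
{
  IS  is[4]      = {isu, isv, isw, isp};
  Mat blocks[16] = {Auu,  NULL, NULL, Aup,
                    NULL, Auu,  NULL, Avp,
                    NULL, NULL, Auu,  Awp,
                    Apu,  Apv,  Apw,  App};  /* NULL => zero block */

  return MatCreateNest(PETSC_COMM_WORLD, 4, is, 4, is, blocks, A);
}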



> - A_pp is defined as the matrix resulting from the discretization of the
> pressure equation that considers only the pressure related terms.
>

Hmm okay, I assumed that for incompressible NS the pressure equation
would be just \div(u) = 0.


> Note that the matrix is not stored as this, since I use field
> interlacing.
>

yeah sure



> >
> >
> > Cheers,
> >
> >   Dave
> >
> >
> >         Each field corresponds to one of the variables (u,v,w,p).
> >         Considering
> >         the corresponding blocks A_.., the non-interlaced matrix would
> >         read as
> >
> >         [A_uu   0     0   A_up]
> >         [0    A_vv    0   A_vp]
> >         [0      0   A_ww  A_wp]
> >         [A_pu A_pv  A_pw  A_pp]
> >
> >         where furthermore A_uu = A_vv = A_ww. This might be considered
> >         to further improve the efficiency of the solve.
> >
> >         You find attached the solver output for an analytical test
> >         case with 2e6
> >         cells each having 4 degrees of freedom. I used the
> >         command-line options:
> >
> >         -log_summary
> >         -coupledsolve_ksp_view
> >         -coupledsolve_ksp_monitor
> >         -coupledsolve_ksp_gmres_restart 100
> >         -coupledsolve_pc_factor_levels 1
> >         -coupledsolve_ksp_gmres_modifiedgramschmidt
> >
> >         Regards,
> >         Fabian Gabel
> >
> >
> >
>
>
>