[petsc-users] Fieldsplit, multigrid and DM interaction

Barry Smith bsmith at petsc.dev
Wed Mar 13 14:00:27 CDT 2024



   Sorry no one responded to this email sooner.

> On Mar 12, 2024, at 4:18 AM, Marco Seiz <marco at kit.ac.jp> wrote:
> 
> Hello,
> 
> 
> I'd like to solve a Stokes-like equation with PETSc, i.e.
> 
> 
> div( mu * symgrad(u) ) = -grad(p) - grad(mu*q)
> 
> div(u) = q
> 
> 
> with the spatially variable coefficients (mu, q) coming from another 
> application, which will advect and evolve fields via the velocity field 
> u from the Stokes solution, and throw back new (mu, q) to PETSc in a 
> loop, everything using finite difference. In preparation for this and 
> getting used to PETSc I wrote a simple inhomogeneous coefficient Poisson 
> solver, i.e.
> 
>   div( mu*grad(u) ) = -grad(mu*q), u unknown,
> 
> based on src/ksp/ksp/tutorials/ex32.c which converges really nicely even 
> for mu contrasts of 10^10 using -ksp_type fgmres -pc_type mg. Since my 
> coefficients later on can't be calculated from coordinates, I put them 
> on a separate DM and attached it to the main DM via PetscObjectCompose 
> and used a DMCoarsenHookAdd to coarsen the DM the coefficients live on, 
> inspired by src/ts/tutorials/ex29.c .
> 
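A rough sketch of that compose-plus-coarsen-hook pattern, with placeholder names (da, dm_coeff, CoarsenCoefficientDM) rather than the actual code from ex29.c:

  static PetscErrorCode CoarsenCoefficientDM(DM dm_fine, DM dm_coarse, void *ctx)
  {
    DM cdm_fine, cdm_coarse;

    PetscFunctionBeginUser;
    PetscCall(PetscObjectQuery((PetscObject)dm_fine, "coefficientdm", (PetscObject *)&cdm_fine));
    PetscCall(DMCoarsen(cdm_fine, MPI_COMM_NULL, &cdm_coarse));
    /* PetscObjectCompose takes its own reference, so drop ours afterwards */
    PetscCall(PetscObjectCompose((PetscObject)dm_coarse, "coefficientdm", (PetscObject)cdm_coarse));
    PetscCall(DMDestroy(&cdm_coarse));
    /* the coefficient values themselves still have to be moved between levels,
       e.g. with named vectors and the restriction hook as in src/ts/tutorials/ex29.c */
    PetscFunctionReturn(PETSC_SUCCESS);
  }

  /* once, on the fine solution DM: */
  PetscCall(PetscObjectCompose((PetscObject)da, "coefficientdm", (PetscObject)dm_coeff));
  PetscCall(DMCoarsenHookAdd(da, CoarsenCoefficientDM, NULL, NULL));
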
> Adding another uncoupled DoF was simple enough and it converged 
> according to -ksp_converged_reason, but the solution started looking 
> very weird; roughly constant for each DoF, when it should be some 
> function going from roughly -value to +value due to symmetry. This 
> doesn't happen when I use a direct solver ( -ksp_type preonly -pc_type 
> lu -pc_factor_mat_solver_type umfpack ) and reading the archives, I 
> ought to be using -pc_type fieldsplit due to the block nature of the 
> matrix. I did that and the solution looked sensible again.
Hmm, this sounds like the operator has a constant null space that is accumulating in the iterative method.

The standard way to handle this is to use MatSetNullSpace() to provide the null-space information so the iterative solver can remove it at each iteration.
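For a scalar problem like this it would look roughly like the sketch below, assuming A is your assembled operator; if you assemble through KSPSetComputeOperators(), do this on the matrix inside your compute callback so it is set on every level:

  MatNullSpace nullsp;

  PetscCall(MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_TRUE, 0, NULL, &nullsp));
  PetscCall(MatSetNullSpace(A, nullsp));
  PetscCall(MatNullSpaceDestroy(&nullsp));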

> 
> Now here comes the actual problem: Once I try adding multigrid 
> preconditioning to the split fields I get errors probably relating to 
> fieldsplit not "inheriting" (for lack of a better term) the associated 
> interpolations/added DMs and hooks on the fine DM. That is, when I use 
> the DMDA_Q0 interpolation, fieldsplit dies because it switches to 
> DMDA_Q1 and the size ratio is wrong ( Ratio between levels: (mx - 1)/(Mx 
> - 1) must be integer: mx 64 Mx 32 ). When I use DMDA_Q1, once the KSP 
> tries to setup the matrix on the coarsened problem the DM no longer has 
> the coefficient DMs which I previously had associated with it, i.e. 
> PetscCall(PetscObjectQuery((PetscObject)da, "coefficientdm", 
> (PetscObject *)&dm_coeff)); puts a NULL pointer in dm_coeff and PETSc 
> dies when trying to get a named vector from that, but it works nicely 
> without fieldsplit.
> 
> Is there some way to get fieldsplit to automagically "inherit" those 
> added parts or do I need to manually modify the DMs the fieldsplit is 
> using? I've been using KSPSetComputeOperators since it allows for 
> re-discretization without having to manage the levels myself, whereas 
> some more involved examples like src/dm/impls/stag/tutorials/ex4.c build 
> the matrices in advance when re-discretizing and set them with 
> KSPSetOperators, which would avoid the problem as well but also means 
> managing the levels.

We don't have hooks to automatically pass your inner information in from the outer DM, but I think you can use

PCFieldSplitGetSubKSP() after KSPSetUp()

to get your two sub-KSPs; you can then set the "sub" DM on each of these, etc., to get them to largely behave as in your previous "uncoupled" code. Hopefully you can also use KSPSetOperators().
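
Roughly like this sketch, assuming ksp is your outer solver and da_u is a single-field DMDA for the first split (ComputeMatrixU and user are placeholders for your own callback and context):

  KSP     *subksp;
  PC       pc;
  PetscInt nsplits;

  PetscCall(KSPSetUp(ksp));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCFieldSplitGetSubKSP(pc, &nsplits, &subksp));
  /* attach to each split the single-field DM (with its coefficient DM and
     coarsening hook composed on it) so that split rediscretizes itself
     the same way the uncoupled code did */
  PetscCall(KSPSetDM(subksp[0], da_u));
  PetscCall(KSPSetComputeOperators(subksp[0], ComputeMatrixU, &user));
  /* likewise for subksp[1]; then free the array returned by PETSc */
  PetscCall(PetscFree(subksp));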

  Barry

> 
> 
> Any advice concerning solving my target Stokes-like equation is welcome 
> as well. I am coming from an explicit timestepping background, so reading
> up on saddle point problems and their efficient solution is all quite 
> new to me.
> 
> 
> Best regards,
> 
> Marco
> 
