<div dir="ltr">Hey Jed,<div><br></div><div>Right now they are uncoupled purely for testing purposes. I just want to get the uncoupled problem right before moving on to the coupled problem. </div><div><br></div><div>When I add the vertical dynamics back in both the outer nonlinear convergence and the inner linear convergence are roughly the same as for when I solve for the 2D slabs only as a 3D system.</div><div><br></div><div>I like your scaling idea. I tried using the pressure multiplied by the mass matrix as a preconditioner for the inner linear problem, but this didn't help. Perhaps some sort of scaling is the way to go though.</div><div><br></div><div>Cheers, Dave.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Oct 11, 2019 at 2:45 PM Jed Brown <<a href="mailto:jed@jedbrown.org" target="_blank">jed@jedbrown.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Why are your slabs decoupled at present? (Have you done a transform in<br>
the vertical?) Is the linear convergence significantly different when<br>
you include the multiple layers?<br>
<br>
Dave Lee <<a href="mailto:davelee2804@gmail.com" target="_blank">davelee2804@gmail.com</a>> writes:<br>
<br>
> Hi Jed and Mark,<br>
><br>
> thanks for your helpful comments. Yes, the nonlinear outer problem is<br>
> uncoupled between the slabs; it is only the linear inner problem where they<br>
> are coupled.<br>
><br>
> I've tried to make the slab DOFs close in memory, and also tried using a<br>
> tight tolerance on the outer KSP (1.0e-20), but without success.<br>
><br>
> You make some really good points about scaling issues, Jed and Mark. This is<br>
> a solve for a global atmosphere, and each 2D slab is a horizontal layer of<br>
> the atmosphere, so the pressure (which the linear solve is for) will vary<br>
> dramatically between slabs. Perhaps I can additionally precondition the<br>
> linear problem to normalise the pressure in each slab so that it stays<br>
> close to unity.<br>
><br>
> Cheers, Dave.<br>
><br>
> On Fri, Oct 11, 2019 at 2:52 AM Mark Adams <<a href="mailto:mfadams@lbl.gov" target="_blank">mfadams@lbl.gov</a>> wrote:<br>
><br>
>> I can think of a few sources of coupling in the solver: 1) line search,<br>
>> 2) Krylov, and 3) the residual test (scaling issues). You could turn<br>
>> linesearch off and use Richardson (with a fixed number of iterations) or<br>
>> exact solves, as Jed suggested. As far as scaling goes, can you use the same<br>
>> NL problem on each slab? This should fix all the problems anyway. Or, on the<br>
>> good decoupled solves, if the true residual is of the same scale and *all*<br>
>> of the slabs converge well, then you should be OK on scaling. If this works,<br>
>> then start adding stuff back in and see what breaks it.<br>
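>><br>
>> For example, something like the following should do it (the exact option<br>
>> names may vary a bit with PETSc version):<br>
>> -snes_linesearch_type basic -ksp_type richardson -ksp_max_it 20 -ksp_norm_type none<br>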
>><br>
>> On Thu, Oct 10, 2019 at 11:01 AM Jed Brown via petsc-users <<br>
>> <a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br>
>><br>
>>> Dave Lee via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> writes:<br>
>>><br>
>>> > Hi PETSc,<br>
>>> ><br>
>>> > I have a nonlinear 3D problem for a set of uncoupled 2D slabs (which I<br>
>>> > ultimately want to couple once this problem is solved).<br>
>>> ><br>
>>> > When I solve the inner linear problem for each of these 2D slabs<br>
>>> > individually (using KSPGMRES), the convergence of the outer nonlinear<br>
>>> > problem is good. However, when I solve the inner linear problem as a single<br>
>>> > 3D problem (with no coupling between the 2D slabs, so the matrix is<br>
>>> > effectively a set of uncoupled blocks, one for each 2D slab), the outer<br>
>>> > nonlinear convergence degrades dramatically.<br>
>>><br>
>>> Is the nonlinear problem also decoupled between slabs?<br>
>>><br>
>>> If you solve the linear problem accurately (tight tolerances on the<br>
>>> outer KSP, or global direct solve), is the outer nonlinear convergence<br>
>>> good again? If not, test that your Jacobian is correct (it likely isn't,<br>
>>> or you have inconsistent scaling leading to ill conditioning). SNES has<br>
>>> automatic tests for that, but you aren't using SNES so you'd have to<br>
>>> write some code.<br>
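>>><br>
>>> A rough, untested sketch of such a check (assuming a user residual routine<br>
>>> FormResidual(Vec x, Vec f), the current state x, and the assembled Jacobian J<br>
>>> at that state; the ratio of the two printed norms should be small, roughly on<br>
>>> the order of the perturbation, if the Jacobian is consistent with the residual):<br>
>>><br>
>>>   Vec dx, f0, f1, Jdx;<br>
>>>   PetscReal eps = 1.0e-7, nrm, nrm_diff;<br>
>>>   VecDuplicate(x, &dx); VecDuplicate(x, &f0);<br>
>>>   VecDuplicate(x, &f1); VecDuplicate(x, &Jdx);<br>
>>>   VecSetRandom(dx, NULL);       /* random perturbation direction */<br>
>>>   FormResidual(x, f0);          /* F(x) */<br>
>>>   MatMult(J, dx, Jdx);          /* analytic J dx */<br>
>>>   VecAXPY(x, eps, dx);          /* x + eps dx */<br>
>>>   FormResidual(x, f1);          /* F(x + eps dx) */<br>
>>>   VecAXPY(x, -eps, dx);         /* restore x */<br>
>>>   VecAXPY(f1, -1.0, f0);        /* F(x + eps dx) - F(x) */<br>
>>>   VecScale(f1, 1.0/eps);        /* finite-difference J dx */<br>
>>>   VecNorm(Jdx, NORM_2, &nrm);<br>
>>>   VecAXPY(f1, -1.0, Jdx);       /* FD J dx minus analytic J dx */<br>
>>>   VecNorm(f1, NORM_2, &nrm_diff);<br>
>>>   PetscPrintf(PETSC_COMM_WORLD, "|J dx| = %g  |FD - J dx| = %g\n", (double)nrm, (double)nrm_diff);<br>
>>>   VecDestroy(&dx); VecDestroy(&f0); VecDestroy(&f1); VecDestroy(&Jdx);<br>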
>>><br>
>>> What happens if you run the 2D problem (where convergence is currently<br>
>>> good) with much smaller subdomains (or -pc_type pbjacobi)?<br>
>>><br>
>>> > Note that I am not using SNES, just my own quasi-Newton approach for the<br>
>>> > outer nonlinear problem.<br>
>>> ><br>
>>> > I suspect that the way to recover the convergence for the 3D coupled<br>
>>> > problem is to use some sort of PCBJACOBI or PCFIELDSPLIT preconditioner,<br>
>>> > but I'm not entirely sure. I've tried following this example:<br>
>>> ><br>
>>> <a href="https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex7.c.html" rel="noreferrer" target="_blank">https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex7.c.html</a><br>
>>> > but without any improvement in the convergence.<br>
>>> ><br>
>>> > On each processor the slab index is the major index and the internal slab<br>
>>> > DOF index is the minor index. The parallel decomposition is within the 2D<br>
>>> > slab dimensions only, not between slabs.<br>
>>><br>
>>> For convergence, you usually want the direction of tight coupling<br>
>>> (sounds like that is within slabs) to be close in memory.<br>
>>><br>
>>> In general, use -ksp_monitor_true_residual -ksp_converged_reason.<br>
>>><br>
>>> > I'm configuring the 3D inner linear solver as<br>
>>> ><br>
>>> > KSPGetPC(ksp, &pc);<br>
>>> > PCSetType(pc, PCBJACOBI);<br>
>>> > PCBJacobiSetLocalBlocks(pc, num_slabs, NULL);<br>
>>> > KSPSetUp(ksp);<br>
>>> > PCBJacobiGetSubKSP(pc, &nlocal, &first_local, &subksp);<br>
>>> > for(int ii = 0; ii < nlocal; ii++) {<br>
>>> >     KSPGetPC(subksp[ii], &subpc);<br>
>>> >     PCSetType(subpc, PCJACOBI);<br>
>>> >     KSPSetType(subksp[ii], KSPGMRES);<br>
>>> >     KSPSetTolerances(subksp[ii], 1.e-12, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);<br>
>>> > }<br>
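>>> ><br>
>>> > (I think the rough command-line equivalent of the above is<br>
>>> > -pc_type bjacobi -pc_bjacobi_local_blocks N -sub_ksp_type gmres<br>
>>> > -sub_pc_type jacobi -sub_ksp_rtol 1.0e-12, with N the number of slabs,<br>
>>> > which might make it easier to experiment with the sub-solvers.)<br>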
>>> ><br>
>>> > However, this does not seem to change the outer nonlinear convergence. When<br>
>>> > I run with -ksp_view, the local block sizes look correct: the number of local 2D<br>
>>> > slab DOFs for each block on each processor.<br>
>>> ><br>
>>> > Any ideas on what sort of KSP/PC configuration I should be using to recover<br>
>>> > the convergence of the uncoupled 2D slab linear solve for this 3D linear<br>
>>> > solve? Should I be rethinking my node indexing to help with this?<br>
>>> ><br>
>>> > Cheers, Dave.<br>
>>><br>
>><br>
</blockquote></div>
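<div><br></div><div dir="ltr">P.S. Here is a very rough (untested) sketch of the kind of per-slab scaling I have in mind. A x = b is the inner linear system and ksp the inner solver; p_ref[s] (a reference pressure for slab s) and dofs_per_slab_local (the number of locally owned DOFs per slab) are placeholder names, and the local ordering is slab-major as in the quoted thread. The idea is to solve for the pressure normalised by the per-slab reference value, so that the unknowns stay close to unity:</div><div><br></div><div>
Vec d, y;<br>
PetscInt ii, start, end, slab;<br>
VecDuplicate(x, &d);<br>
VecDuplicate(x, &y);<br>
VecGetOwnershipRange(d, &start, &end);<br>
for(ii = start; ii < end; ii++) {<br>
    slab = (ii - start) / dofs_per_slab_local;  /* slab-major local ordering */<br>
    VecSetValue(d, ii, p_ref[slab], INSERT_VALUES);<br>
}<br>
VecAssemblyBegin(d); VecAssemblyEnd(d);<br>
MatDiagonalScale(A, d, d);      /* A' = D A D */<br>
VecPointwiseMult(b, d, b);      /* b' = D b */<br>
KSPSetOperators(ksp, A, A);<br>
KSPSolve(ksp, b, y);            /* solve A' y = b'; y is pressure over reference */<br>
VecPointwiseMult(x, d, y);      /* recover x = D y */<br>
</div><div><br></div><div>(The scaling of A would of course need to be undone or rebuilt at each outer nonlinear iteration.)</div>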