<head><!-- BaNnErBlUrFlE-HeAdEr-start -->
<style>
#pfptBanner5szzys2 { all: revert !important; display: block !important;
visibility: visible !important; opacity: 1 !important;
background-color: #D0D8DC !important;
max-width: none !important; max-height: none !important }
.pfptPrimaryButton5szzys2:hover, .pfptPrimaryButton5szzys2:focus {
background-color: #b4c1c7 !important; }
.pfptPrimaryButton5szzys2:active {
background-color: #90a4ae !important; }
</style>
<!-- BaNnErBlUrFlE-HeAdEr-end -->
</head><!-- BaNnErBlUrFlE-BoDy-start -->
<!-- Preheader Text : BEGIN -->
<div style="display:none !important;display:none;visibility:hidden;mso-hide:all;font-size:1px;color:#ffffff;line-height:1px;height:0px;max-height:0px;opacity:0;overflow:hidden;">
Ah thanks for that! Sorry I had incorrectly assumed the preconditioner on S would be as it was previously. I'll take a look. Thanks again for all the help. Best, Colton On Thu, May 30, 2024 at 3: 25 PM Barry Smith <bsmith@ petsc. dev>
</div>
<!-- Preheader Text : END -->
<!-- Email Banner : BEGIN -->
<div style="display:none !important;display:none;visibility:hidden;mso-hide:all;font-size:1px;color:#ffffff;line-height:1px;height:0px;max-height:0px;opacity:0;overflow:hidden;">ZjQcmQRYFpfptBannerStart</div>
<!--[if ((ie)|(mso))]>
<table border="0" cellspacing="0" cellpadding="0" width="100%" style="padding: 16px 0px 16px 0px; direction: ltr" ><tr><td>
<table border="0" cellspacing="0" cellpadding="0" style="padding: 0px 10px 5px 6px; width: 100%; border-radius:4px; border-top:4px solid #90a4ae;background-color:#D0D8DC;"><tr><td valign="top">
<table align="left" border="0" cellspacing="0" cellpadding="0" style="padding: 4px 8px 4px 8px">
<tr><td style="color:#000000; font-family: 'Arial', sans-serif; font-weight:bold; font-size:14px; direction: ltr">
This Message Is From an External Sender
</td></tr>
<tr><td style="color:#000000; font-weight:normal; font-family: 'Arial', sans-serif; font-size:12px; direction: ltr">
This message came from outside your organization.
</td></tr>
</table>
</td></tr></table>
</td></tr></table>
<![endif]-->
<![if !((ie)|(mso))]>
<div dir="ltr" id="pfptBanner5szzys2" style="all: revert !important; display:block !important; text-align: left !important; margin:16px 0px 16px 0px !important; padding:8px 16px 8px 16px !important; border-radius: 4px !important; min-width: 200px !important; background-color: #D0D8DC !important; background-color: #D0D8DC; border-top: 4px solid #90a4ae !important; border-top: 4px solid #90a4ae;">
<div id="pfptBanner5szzys2" style="all: unset !important; float:left !important; display:block !important; margin: 0px 0px 1px 0px !important; max-width: 600px !important;">
<div id="pfptBanner5szzys2" style="all: unset !important; display:block !important; visibility: visible !important; background-color: #D0D8DC !important; color:#000000 !important; color:#000000; font-family: 'Arial', sans-serif !important; font-family: 'Arial', sans-serif; font-weight:bold !important; font-weight:bold; font-size:14px !important; line-height:18px !important; line-height:18px">
This Message Is From an External Sender
</div>
<div id="pfptBanner5szzys2" style="all: unset !important; display:block !important; visibility: visible !important; background-color: #D0D8DC !important; color:#000000 !important; color:#000000; font-weight:normal; font-family: 'Arial', sans-serif !important; font-family: 'Arial', sans-serif; font-size:12px !important; line-height:18px !important; line-height:18px; margin-top:2px !important;">
This message came from outside your organization.
</div>
</div>
<div style="clear: both !important; display: block !important; visibility: hidden !important; line-height: 0 !important; font-size: 0.01px !important; height: 0px"> </div>
</div>
<![endif]>
<div style="display:none !important;display:none;visibility:hidden;mso-hide:all;font-size:1px;color:#ffffff;line-height:1px;height:0px;max-height:0px;opacity:0;overflow:hidden;">ZjQcmQRYFpfptBannerEnd</div>
<!-- Email Banner : END -->
<!-- BaNnErBlUrFlE-BoDy-end -->
<div dir="ltr"><div>Ah thanks for that! Sorry I had incorrectly assumed the preconditioner on S would be as it was previously. I'll take a look. <br></div><div><br></div><div>Thanks again for all the help.</div><div><br></div><div>Best,</div><div>Colton <br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 30, 2024 at 3:25 PM Barry Smith <<a href="mailto:bsmith@petsc.dev">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Preconditioner for the Schur complement formed from A11<div><br></div><div> But </div><div><br></div><div> PC Object: (fieldsplit_pressure_) 1 MPI process<blockquote type="cite"><div dir="ltr"> type: ilu<br> out-of-place factorization<br> 0 levels of fill<br> tolerance for zero pivot 2.22045e-14<br> matrix ordering: natural<br> factor fill ratio given 1., needed 1.<br> Factored matrix follows:<br> Mat Object: (fieldsplit_pressure_) 1 MPI process<br> type: seqaij<br> rows=25, cols=25<br> package used to perform factorization: petsc<br> total: nonzeros=105, allocated nonzeros=105<br> not using I-node routines</div></blockquote><div><br></div> So it is trying to do ILU on A11 (to use as the preconditioner of the Schur complement). But A11 is identically zero so ILU will produce a zero pivot</div><div><br></div><div> So you need to set a different preconditioner for S. This can be done with </div><div><br></div><div> -pc_fieldsplit_schur_precondition <self,selfp,user,a11,full> See PCFieldSplitSetSchurPre</div><div><br></div><div> First use </div><div><br></div><div> -pc_fieldsplit_schur_precondition full -fieldsplit_pressure_pc_type svd (note the full S is singular so LU should fail).</div><div><br></div><div> and that should make the PCFIELDSPLIT a direct solver. You then explore cheaper options (since computing S explicitly to produce a preconditioner</div><div> is not reasonable except for small problems). So try next</div><div><br></div><div> -pc_fieldsplit_schur_precondition self -fieldsplit_pressure_pc_type jacobi</div><div><br></div><div> For more advanced tuning Matt can help you out once you have the basics working.</div><div><br><div> <br><blockquote type="cite"><div>On May 30, 2024, at 4:49 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Hi Barry,</div><div><br></div><div>Yes, each index set has the correct entries when I checked manually on a small example.</div><div><br></div><div>For the nullspace I was trying to manually build the constant basis on a compatible DMStag containing just the pressure nodes and creating the nullspace from that but that does not seem to work. Using MatNullSpaceCreate(comm,PETSC_TRUE,0,NULL,&sp) does show that the Schur system has the attached null space. However the system still fails to converge with the schur block giving the error</div><div><br></div><div>Linear fieldsplit_pressure_ solve did not converge due to DIVERGED_PC_FAILED iterations 0<br> PC failed due to FACTOR_NUMERIC_ZEROPIVOT <br></div><div><br></div><div>The output from -ksp_view is</div><div><br></div><div>KSP Object: 1 MPI process<br> type: fgmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.<br> right preconditioning<br> using UNPRECONDITIONED norm type for convergence test<br>PC Object: 1 MPI process<br> type: fieldsplit<br> FieldSplit with Schur preconditioner, factorization FULL<br> Preconditioner for the Schur complement formed from A11<br> Split info:<br> Split number 0 Defined by IS<br> Split number 1 Defined by IS<br> KSP solver for A00 block<br> KSP Object: (fieldsplit_velocity_) 1 MPI process<br> type: gmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br> PC Object: (fieldsplit_velocity_) 1 MPI process<br> type: lu<br> out-of-place factorization<br> tolerance for zero pivot 2.22045e-14<br> matrix ordering: nd<br> factor fill ratio given 5., needed 2.17695<br> Factored matrix follows:<br> Mat Object: (fieldsplit_velocity_) 1 MPI process<br> type: seqaij<br> rows=60, cols=60<br> package used to perform factorization: petsc<br> total: nonzeros=1058, allocated nonzeros=1058<br> using I-node routines: found 35 nodes, limit used is 5<br> linear system matrix = precond matrix:<br> Mat Object: (fieldsplit_velocity_) 1 MPI process<br> type: seqaij<br> rows=60, cols=60<br> total: nonzeros=486, allocated nonzeros=486<br> total number of mallocs used during MatSetValues calls=0<br> using I-node routines: found 35 nodes, limit used is 5<br> KSP solver for S = A11 - A10 inv(A00) A01<br> KSP Object: (fieldsplit_pressure_) 1 MPI process<br> type: gmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br> PC Object: (fieldsplit_pressure_) 1 MPI process<br> type: ilu<br> out-of-place factorization<br> 0 levels of fill<br> tolerance for zero pivot 2.22045e-14<br> matrix ordering: natural<br> factor fill ratio given 1., needed 1.<br> Factored matrix follows:<br> Mat Object: (fieldsplit_pressure_) 1 MPI process<br> type: seqaij<br> rows=25, cols=25<br> package used to perform factorization: petsc<br> total: nonzeros=105, allocated nonzeros=105<br> not using I-node routines<br> linear system matrix followed by preconditioner matrix:<br> Mat Object: (fieldsplit_pressure_) 1 MPI process<br> type: schurcomplement<br> rows=25, cols=25<br> has attached null space<br> Schur complement A11 - A10 inv(A00) A01<br> A11<br> Mat Object: (fieldsplit_pressure_) 1 MPI process<br> type: seqaij<br> rows=25, cols=25<br> total: nonzeros=105, allocated nonzeros=105<br> total number of mallocs used during MatSetValues calls=0<br> has attached null space<br> not using I-node routines<br> A10<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=25, cols=60<br> total: nonzeros=220, allocated nonzeros=220<br> total number of mallocs used during MatSetValues calls=0<br> not using I-node routines<br> KSP solver for A00 block viewable with the additional option -fieldsplit_velocity_ksp_view<br> A01<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=60, cols=25<br> total: nonzeros=220, allocated nonzeros=220<br> total number of mallocs used during MatSetValues calls=0<br> using I-node routines: found 35 nodes, limit used is 5<br> Mat Object: (fieldsplit_pressure_) 1 MPI process<br> type: seqaij<br> rows=25, cols=25<br> total: nonzeros=105, allocated nonzeros=105<br> total number of mallocs used during MatSetValues calls=0<br> has attached null space<br> not using I-node routines<br> linear system matrix = precond matrix:<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=85, cols=85<br> total: nonzeros=1031, allocated nonzeros=1031<br> total number of mallocs used during MatSetValues calls=0<br> has attached null space<br> using I-node routines: found 35 nodes, limit used is 5<br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 30, 2024 at 1:53 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 30, 2024, at 3:15 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Hi Barry,</div><div><br></div><div>Do you know of an example that demonstrates this approach? I have tried implementing this using DMStagCreateISFromStencils and then calling PCFieldSplitSetIS with fields named "velocity" and "pressure" respectively, but when I look at -ksp_view the fields are being set to "fieldsplit_face" and "fieldsplit_element" and as problems are not converging I expect the constant null space is not being attached. <br></div></div></div></blockquote><div><br></div> First confirm that each IS has the entries you expect</div><div><br></div><div> Then for the pressure IS are you using PetscObjectCompose((PetscObject*)is,"nullspace", (PetscObject *)sp); where sp is the null space of the pressure variables</div><div>which I think you can create using MatNullSpaceCreate(comm,PETSC_TRUE,0,NULL,&sp);</div><div><br></div><div> PCFIELDSPLIT is suppose to snag this null space that you provided and use it on the Shur system. If you run with -ksp_view it should list what matrices have an attached null space.</div><div><br></div><div><br></div><div><br><blockquote type="cite"><div><div dir="ltr"><div><br></div><div>Thanks, <br></div><div>Colton <br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 23, 2024 at 12:55 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Unfortunately it cannot automatically because -pc_fieldsplit_detect_saddle_point just grabs part of the matrix (having no concept of "what part" so doesn't know to grab the null space information. <div><br></div><div> It would be possible for PCFIELDSPLIT to access the null space of the larger matrix directly as vectors and check if they are all zero in the 00 block, then it would know that the null space only applied to the second block and could use it for the Schur complement.</div><div><br></div><div> Matt, Jed, Stefano, Pierre does this make sense?</div><div><br></div><div> Colton,</div><div> </div><div> Meanwhile the quickest thing you can do is to generate the IS the defines the first and second block (instead of using -pc_fieldsplit_detect_saddle_point) and use PetscObjectCompose to attach the constant null space to the second block with the name "nullspace". PCFIELDSPLIT will then use this null space for the Schur complement solve.</div><div><br></div><div> Barry</div><div><br><div><br><blockquote type="cite"><div>On May 23, 2024, at 2:43 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Yes, the original operator definitely has a constant null space corresponding to the constant pressure mode. I am currently handling this by using the MatSetNullSpace function when the matrix is being created. Does this information get passed to the submatrices of the fieldsplit? <br></div><div><br></div><div>-Colton <br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 23, 2024 at 12:36 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Ok,<div><br></div><div> So what is happening is that GMRES with a restart of 30 is running on the Schur complement system with no preconditioning and LU (as a direct solver) is being used in the application of S (the Schur complement). The convergence of GMRES is stagnating after getting about 8 digits of accuracy in the residual. Then at the second GMRES</div><div>restart it is comparing the explicitly computing residual b - Ax with that computed inside the GMRES algorithm (via a recursive formula) and finding a large difference so generating an error. Since you are using a direct solver on the A_{00} block and it is well-conditioned this problem is not expected.</div><div><br></div><div> Is it possible that the S operator has a null space (perhaps of the constant vector)? Or, relatedly, does your original full matrix have a null space?</div><div><br></div><div> We have a way to associated null spaces of the submatrices in PCFIELDSPLIT by attaching them to the IS that define the fields, but unfortunately not trivially when using -pc_fieldsplit_detect_saddle_point. And sadly the current support seems completely undocumented. </div><div><br></div><div> Barry</div><div><br></div><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 23, 2024, at 2:16 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Hi Barry, <br></div><div><br></div><div>I saw that was reporting as an unused option and the error message I sent was run with -fieldsplit_0_ksp_type preonly.</div><div><br></div><div>-Colton <br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 23, 2024 at 12:13 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348lineBreakAtBeginningOfMessage"><div><br></div><div> Sorry I gave the wrong option. Use -fieldsplit_0_ksp_type preonly</div><div><br></div><div>Barry</div><div><br><blockquote type="cite"><div>On May 23, 2024, at 12:51 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>That produces the error: </div><div><br></div><div>[0]PETSC ERROR: Residual norm computed by GMRES recursion formula 2.68054e-07 is far from the computed residual norm 6.86309e-06 at restart, residual norm at start of cycle 2.68804e-07</div><div><br></div><div>The rest of the error is identical. <br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 23, 2024 at 10:46 AM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Use -pc_fieldsplit_0_ksp_type preonly<div><br></div><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 23, 2024, at 12:43 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>That produces the following error:</div><div><br></div><div>[0]PETSC ERROR: Residual norm computed by GMRES recursion formula 2.79175e-07 is far from the computed residual norm 0.000113154 at restart, residual norm at start of cycle 2.83065e-07<br>[0]PETSC ERROR: See <a href="https://urldefense.us/v3/__https://petsc.org/release/faq/__;!!G_uCfscf7eWS!csOPLvnC21pdtnvyvDYTttQSWUK793-ZufEaNlRAiw7pC_uVTB8bNOe1Yblvqz1RP_6witFXQtpRRrne-a7JoKY-yVhALTcZH2_f3KlmH5Y$" target="_blank">https://petsc.org/release/faq/</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Release Version 3.21.0, unknown <br>[0]PETSC ERROR: ./mainOversetLS_exe on a arch-linux-c-opt named glass by colton Thu May 23 10:41:09 2024<br>[0]PETSC ERROR: Configure options --download-mpich --with-cc=gcc --with-cxx=g++ --with-debugging=no --with-fc=gfortran COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=arch-linux-c-opt --download-sowing<br>[0]PETSC ERROR: #1 KSPGMRESCycle() at /home/colton/petsc/src/ksp/ksp/impls/gmres/gmres.c:115<br>[0]PETSC ERROR: #2 KSPSolve_GMRES() at /home/colton/petsc/src/ksp/ksp/impls/gmres/gmres.c:227<br>[0]PETSC ERROR: #3 KSPSolve_Private() at /home/colton/petsc/src/ksp/ksp/interface/itfunc.c:905<br>[0]PETSC ERROR: #4 KSPSolve() at /home/colton/petsc/src/ksp/ksp/interface/itfunc.c:1078<br>[0]PETSC ERROR: #5 PCApply_FieldSplit_Schur() at /home/colton/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1203<br>[0]PETSC ERROR: #6 PCApply() at /home/colton/petsc/src/ksp/pc/interface/precon.c:497<br>[0]PETSC ERROR: #7 KSP_PCApply() at /home/colton/petsc/include/petsc/private/kspimpl.h:409<br>[0]PETSC ERROR: #8 KSPFGMRESCycle() at /home/colton/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:123<br>[0]PETSC ERROR: #9 KSPSolve_FGMRES() at /home/colton/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:235<br>[0]PETSC ERROR: #10 KSPSolve_Private() at /home/colton/petsc/src/ksp/ksp/interface/itfunc.c:905<br>[0]PETSC ERROR: #11 KSPSolve() at /home/colton/petsc/src/ksp/ksp/interface/itfunc.c:1078<br>[0]PETSC ERROR: #12 solveStokes() at cartesianStokesGrid.cpp:1403<br></div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 23, 2024 at 10:33 AM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Run the failing case with also -ksp_error_if_not_converged so we see exactly where the problem is first detected.<div><br><div><br></div><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 23, 2024, at 11:51 AM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Hi Barry,</div><div><br></div><div>Thanks for letting me know about the need to use fgmres in this case. I ran a smaller problem (1230 in the first block) and saw similar behavior in the true residual. <br></div><div><br></div><div>I also ran the same problem with the options -fieldsplit_0_pc_type svd -fieldsplit_0_pc_svd_monitor and get the following output:</div><div> SVD: condition number 1.933639985881e+03, 0 of 1230 singular values are (nearly) zero<br> SVD: smallest singular values: 4.132036392141e-03 4.166444542385e-03 4.669534028645e-03 4.845532162256e-03 5.047038625390e-03<br> SVD: largest singular values : 7.947990616611e+00 7.961437414477e+00 7.961851612473e+00 7.971335373142e+00 7.989870790960e+00</div><div><br></div><div>I would be surprised if the A_{00} block is ill conditioned as it's just a standard discretization of the laplacian with some rows replaced with ones on the diagonal due to interpolations from the overset mesh. I'm wondering if I'm somehow violating a solvability condition of the problem? <br></div><div><br></div><div>Thanks for the help! <br></div><div><br></div><div>-Colton<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, May 22, 2024 at 6:09 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Thanks for the info. I see you are using GMRES inside the Schur complement solver, this is ok but when you do you need to use fgmres as the outer solver. But this is unlikely to be the cause of the exact problem you are seeing.<div><br></div><div> I'm not sure why the Schur complement KSP is suddenly seeing a large increase in the true residual norm. Is it possible the A_{00} block is ill-conditioned?</div><div><br></div><div> Can you run with a smaller problem? Say 2,000 or so in the first block? Is there still a problem?</div><div><br></div><div><br></div><div><br></div><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 22, 2024, at 6:00 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Hi Barry,</div><div><br></div><div>I have not used any other solver parameters in the code and the full set of solver related command line options are those I mentioned in the previous email. <br></div><div><br></div><div>Below is the output from -ksp_view:</div><div><br></div><div>KSP Object: (back_) 1 MPI process<br> type: gmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-08, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br>PC Object: (back_) 1 MPI process<br> type: fieldsplit<br> FieldSplit with Schur preconditioner, blocksize = 1, factorization FULL<br> Preconditioner for the Schur complement formed from S itself<br> Split info:<br> Split number 0 Defined by IS<br> Split number 1 Defined by IS<br> KSP solver for A00 block<br> KSP Object: (back_fieldsplit_0_) 1 MPI process<br> type: gmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br> PC Object: (back_fieldsplit_0_) 1 MPI process<br> type: lu<br> out-of-place factorization<br> tolerance for zero pivot 2.22045e-14<br> matrix ordering: nd<br> factor fill ratio given 5., needed 8.83482<br> Factored matrix follows:<br> Mat Object: (back_fieldsplit_0_) 1 MPI process<br> type: seqaij<br> rows=30150, cols=30150<br> package used to perform factorization: petsc<br> total: nonzeros=2649120, allocated nonzeros=2649120<br> using I-node routines: found 15019 nodes, limit used is 5<br> linear system matrix = precond matrix:<br> Mat Object: (back_fieldsplit_0_) 1 MPI process<br> type: seqaij<br> rows=30150, cols=30150<br> total: nonzeros=299850, allocated nonzeros=299850<br> total number of mallocs used during MatSetValues calls=0<br> using I-node routines: found 15150 nodes, limit used is 5<br> KSP solver for S = A11 - A10 inv(A00) A01<br> KSP Object: (back_fieldsplit_1_) 1 MPI process<br> type: gmres<br> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br> happy breakdown tolerance 1e-30<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-08, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using PRECONDITIONED norm type for convergence test<br> PC Object: (back_fieldsplit_1_) 1 MPI process<br> type: none<br> linear system matrix = precond matrix:<br> Mat Object: (back_fieldsplit_1_) 1 MPI process<br> type: schurcomplement<br> rows=15000, cols=15000<br> Schur complement A11 - A10 inv(A00) A01<br> A11<br> Mat Object: (back_fieldsplit_1_) 1 MPI process<br> type: seqaij<br> rows=15000, cols=15000<br> total: nonzeros=74700, allocated nonzeros=74700<br> total number of mallocs used during MatSetValues calls=0<br> not using I-node routines<br> A10<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=15000, cols=30150<br> total: nonzeros=149550, allocated nonzeros=149550<br> total number of mallocs used during MatSetValues calls=0<br> not using I-node routines<br> KSP solver for A00 block viewable with the additional option -back_fieldsplit_0_ksp_view<br> A01<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=30150, cols=15000<br> total: nonzeros=149550, allocated nonzeros=149550<br> total number of mallocs used during MatSetValues calls=0<br> using I-node routines: found 15150 nodes, limit used is 5<br> linear system matrix = precond matrix:<br> Mat Object: (back_) 1 MPI process<br> type: seqaij<br> rows=45150, cols=45150<br> total: nonzeros=673650, allocated nonzeros=673650<br> total number of mallocs used during MatSetValues calls=0<br> has attached null space<br> using I-node routines: found 15150 nodes, limit used is 5</div><div><br></div><div>Thanks again!</div><div><br></div><div>-Colton<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, May 22, 2024 at 3:39 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div><div> Are you using any other command line options or did you hardwire any solver parameters in the code with, like, KSPSetXXX() or PCSetXXX() Please send all of them.</div><div><br></div><div> Something funky definitely happened when the true residual norms jumped up.</div><div><br></div> Could you run the same thing with -ksp_view and don't use any thing like -ksp_error_if_not_converged so we can see exactly what is being run.<div><br></div><div> Barry</div><div><br id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447m_6479905935081366205m_-972133632825488841lineBreakAtBeginningOfMessage"><div><br><blockquote type="cite"><div>On May 22, 2024, at 3:21 PM, Colton Bryant <<a href="mailto:coltonbryant2021@u.northwestern.edu" target="_blank">coltonbryant2021@u.northwestern.edu</a>> wrote:</div><br><div><div dir="ltr" id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447m_6479905935081366205m_-972133632825488841pfptBannerebho4i4" style="writing-mode:revert;color:revert;font-family:revert;font-feature-settings:revert;font-kerning:revert;font-size:revert;font-size-adjust:revert;font-stretch:revert;font-variant-alternates:revert;font-variant-caps:revert;font-variant-east-asian:revert;font-variant-ligatures:revert;font-variant-numeric:revert;font-weight:revert;text-orientation:revert;zoom:revert;letter-spacing:revert;background-blend-mode:revert;background-image:revert;background-position:revert;background-repeat:revert;background-size:revert;border-collapse:revert;box-sizing:revert;break-after:revert;break-before:revert;break-inside:revert;caption-side:revert;clear:revert;columns:revert;column-fill:revert;column-gap:revert;column-rule:revert;column-span:revert;empty-cells:revert;float:revert;image-orientation:revert;isolation:revert;line-break:revert;line-height:revert;list-style:revert;mix-blend-mode:revert;object-fit:revert;object-position:revert;outline:revert;overflow:revert;quotes:revert;table-layout:revert;text-align:left;text-align-last:revert;text-decoration:revert;text-emphasis:revert;text-indent:revert;text-justify:revert;text-overflow:revert;text-transform:revert;text-underline-position:revert;vertical-align:revert;white-space:revert;word-break:revert;border-spacing:revert;word-spacing:revert;background-clip:revert;background-origin:revert;text-combine-upright:revert;display:block;opacity:revert;background-color:rgb(208,216,220);border-bottom:revert;border-left:revert;border-right:revert;height:revert;max-height:revert;max-width:revert;min-height:revert;width:revert;margin:16px 0px;padding:8px 16px;border-radius:4px;min-width:200px;border-top:4px solid rgb(144,164,174)"><div id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447m_6479905935081366205m_-972133632825488841pfptBannerebho4i4" style="writing-mode:unset;color:unset;font-family:unset;font-feature-settings:unset;font-kerning:unset;font-size:unset;font-size-adjust:unset;font-stretch:unset;font-variant-alternates:unset;font-variant-caps:unset;font-variant-east-asian:unset;font-variant-ligatures:unset;font-variant-numeric:unset;font-weight:unset;text-orientation:unset;zoom:unset;letter-spacing:unset;background:unset;background-blend-mode:unset;border-collapse:unset;border:unset;box-sizing:unset;break-after:unset;break-before:unset;break-inside:unset;caption-side:unset;clear:unset;columns:unset;column-fill:unset;column-gap:unset;column-rule:unset;column-span:unset;empty-cells:unset;float:left;image-orientation:unset;isolation:unset;line-break:unset;line-height:unset;list-style:unset;mix-blend-mode:unset;object-fit:unset;object-position:unset;outline:unset;overflow:unset;quotes:unset;table-layout:unset;text-align:unset;text-align-last:unset;text-decoration:unset;text-emphasis:unset;text-indent:unset;text-justify:unset;text-overflow:unset;text-transform:unset;text-underline-position:unset;vertical-align:unset;white-space:unset;word-break:unset;border-spacing:unset;word-spacing:unset;text-combine-upright:unset;display:block;opacity:unset;border-radius:unset;height:unset;max-height:unset;min-height:unset;min-width:unset;padding:unset;width:unset;margin:0px 0px 1px;max-width:600px"><div id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447m_6479905935081366205m_-972133632825488841pfptBannerebho4i4" style="writing-mode:unset;font-family:Arial,sans-serif;font-feature-settings:unset;font-kerning:unset;font-size:14px;font-size-adjust:unset;font-stretch:unset;font-variant-alternates:unset;font-variant-caps:unset;font-variant-east-asian:unset;font-variant-ligatures:unset;font-variant-numeric:unset;font-weight:bold;text-orientation:unset;zoom:unset;letter-spacing:unset;background-blend-mode:unset;background-image:unset;background-position:unset;background-repeat:unset;background-size:unset;border-collapse:unset;border:unset;box-sizing:unset;break-after:unset;break-before:unset;break-inside:unset;caption-side:unset;clear:unset;columns:unset;column-fill:unset;column-gap:unset;column-rule:unset;column-span:unset;empty-cells:unset;float:unset;image-orientation:unset;isolation:unset;line-break:unset;line-height:18px;list-style:unset;mix-blend-mode:unset;object-fit:unset;object-position:unset;outline:unset;overflow:unset;quotes:unset;table-layout:unset;text-align:unset;text-align-last:unset;text-decoration:unset;text-emphasis:unset;text-indent:unset;text-overflow:unset;text-transform:unset;text-underline-position:unset;vertical-align:unset;white-space:unset;word-break:unset;border-spacing:unset;word-spacing:unset;background-clip:unset;background-origin:unset;text-combine-upright:unset;display:block;opacity:unset;background-color:rgb(208,216,220);border-radius:unset;height:unset;margin:unset;max-height:unset;max-width:unset;min-height:unset;min-width:unset;padding:unset;width:unset">This Message Is From an External Sender</div><div id="m_-2773637558804828982m_6963461985037288543m_-7290620286157803825m_-5509071482122398969m_850187303278005167m_-2302358739501703379m_-4038210227445288501m_5179193811939361348m_6235782965983155007m_-5439710146268318838m_3110943171674738447m_6479905935081366205m_-972133632825488841pfptBannerebho4i4" style="writing-mode:unset;font-family:Arial,sans-serif;font-feature-settings:unset;font-kerning:unset;font-size:12px;font-size-adjust:unset;font-stretch:unset;font-variant-alternates:unset;font-variant-caps:unset;font-variant-east-asian:unset;font-variant-ligatures:unset;font-variant-numeric:unset;font-weight:unset;text-orientation:unset;zoom:unset;letter-spacing:unset;background-blend-mode:unset;background-image:unset;background-position:unset;background-repeat:unset;background-size:unset;border-collapse:unset;border:unset;box-sizing:unset;break-after:unset;break-before:unset;break-inside:unset;caption-side:unset;clear:unset;columns:unset;column-fill:unset;column-gap:unset;column-rule:unset;column-span:unset;empty-cells:unset;float:unset;image-orientation:unset;isolation:unset;line-break:unset;line-height:18px;list-style:unset;mix-blend-mode:unset;object-fit:unset;object-position:unset;outline:unset;overflow:unset;quotes:unset;table-layout:unset;text-align:unset;text-align-last:unset;text-decoration:unset;text-emphasis:unset;text-indent:unset;text-overflow:unset;text-transform:unset;text-underline-position:unset;vertical-align:unset;white-space:unset;word-break:unset;border-spacing:unset;word-spacing:unset;background-clip:unset;background-origin:unset;text-combine-upright:unset;display:block;opacity:unset;background-color:rgb(208,216,220);border-radius:unset;height:unset;margin-bottom:unset;margin-left:unset;margin-right:unset;max-height:unset;max-width:unset;min-height:unset;min-width:unset;padding:unset;width:unset;margin-top:2px">This message came from outside your organization.</div></div><div style="height:0px;clear:both;display:block;line-height:0;font-size:0.01px"></div></div><div dir="ltr" style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div>Hello,</div><div><br></div><div>I am solving the Stokes equations on a MAC grid discretized by finite differences using a DMSTAG object. I have tested the solver quite extensively on manufactured problems and it seems to work well. As I am still just trying to get things working and not yet worried about speed I am using the following solver options:<span> </span><br></div><div>-pc_type fieldsplit</div><div>-pc_fieldsplit_detect_saddle_point</div><div>-fieldsplit_0_pc_type lu</div><div>-fieldsplit_1_ksp_rtol 1.e-8<br></div><div><br></div><div>However I am now using this solver as an inner step of a larger code and have run into issues. The code repeatedly solves the Stokes equations with varying right hand sides coming from changing problem geometry (the solver is a part of an overset grid scheme coupled to a level set method evolving in time). After a couple timesteps I observe the following output when running with -fieldsplit_1_ksp_converged_reason -fieldsplit_1_ksp_monitor_true_residual:<span> </span><br></div><div><br></div><div>Residual norms for back_fieldsplit_1_ solve.<br> 0 KSP preconditioned resid norm 2.826514299465e-02 true resid norm 2.826514299465e-02 ||r(i)||/||b|| 1.000000000000e+00<br> 1 KSP preconditioned resid norm 7.286621865915e-03 true resid norm 7.286621865915e-03 ||r(i)||/||b|| 2.577953300039e-01<br> 2 KSP preconditioned resid norm 1.500598474492e-03 true resid norm 1.500598474492e-03 ||r(i)||/||b|| 5.309007192273e-02<br> 3 KSP preconditioned resid norm 3.796396924978e-04 true resid norm 3.796396924978e-04 ||r(i)||/||b|| 1.343137349666e-02<br> 4 KSP preconditioned resid norm 8.091057439816e-05 true resid norm 8.091057439816e-05 ||r(i)||/||b|| 2.862556697960e-03<br> 5 KSP preconditioned resid norm 3.689113122359e-05 true resid norm 3.689113122359e-05 ||r(i)||/||b|| 1.305181128239e-03<br> 6 KSP preconditioned resid norm 2.116450533352e-05 true resid norm 2.116450533352e-05 ||r(i)||/||b|| 7.487846545662e-04<br> 7 KSP preconditioned resid norm 3.968234031201e-06 true resid norm 3.968234031200e-06 ||r(i)||/||b|| 1.403932055801e-04<br> 8 KSP preconditioned resid norm 6.666949419511e-07 true resid norm 6.666949419506e-07 ||r(i)||/||b|| 2.358717739644e-05<br> 9 KSP preconditioned resid norm 1.941522884928e-07 true resid norm 1.941522884931e-07 ||r(i)||/||b|| 6.868965372998e-06<br> 10 KSP preconditioned resid norm 6.729545258682e-08 true resid norm 6.729545258626e-08 ||r(i)||/||b|| 2.380863687793e-06<br> 11 KSP preconditioned resid norm 3.009070131709e-08 true resid norm 3.009070131735e-08 ||r(i)||/||b|| 1.064586912687e-06<br> 12 KSP preconditioned resid norm 7.849353009588e-09 true resid norm 7.849353009903e-09 ||r(i)||/||b|| 2.777043445840e-07<br> 13 KSP preconditioned resid norm 2.306283345754e-09 true resid norm 2.306283346677e-09 ||r(i)||/||b|| 8.159461097060e-08<br> 14 KSP preconditioned resid norm 9.336302495083e-10 true resid norm 9.336302502503e-10 ||r(i)||/||b|| 3.303115255517e-08<br> 15 KSP preconditioned resid norm 6.537456143401e-10 true resid norm 6.537456141617e-10 ||r(i)||/||b|| 2.312903968982e-08<br> 16 KSP preconditioned resid norm 6.389159552788e-10 true resid norm 6.389159550304e-10 ||r(i)||/||b|| 2.260437724130e-08<br> 17 KSP preconditioned resid norm 6.380905134246e-10 true resid norm 6.380905136023e-10 ||r(i)||/||b|| 2.257517372981e-08<br> 18 KSP preconditioned resid norm 6.380440605992e-10 true resid norm 6.380440604688e-10 ||r(i)||/||b|| 2.257353025207e-08<br> 19 KSP preconditioned resid norm 6.380427156582e-10 true resid norm 6.380427157894e-10 ||r(i)||/||b|| 2.257348267830e-08<br> 20 KSP preconditioned resid norm 6.380426714897e-10 true resid norm 6.380426714004e-10 ||r(i)||/||b|| 2.257348110785e-08<br> 21 KSP preconditioned resid norm 6.380426656970e-10 true resid norm 6.380426658839e-10 ||r(i)||/||b|| 2.257348091268e-08<br> 22 KSP preconditioned resid norm 6.380426650538e-10 true resid norm 6.380426650287e-10 ||r(i)||/||b|| 2.257348088242e-08<br> 23 KSP preconditioned resid norm 6.380426649918e-10 true resid norm 6.380426645888e-10 ||r(i)||/||b|| 2.257348086686e-08<br> 24 KSP preconditioned resid norm 6.380426649803e-10 true resid norm 6.380426644294e-10 ||r(i)||/||b|| 2.257348086122e-08<br> 25 KSP preconditioned resid norm 6.380426649796e-10 true resid norm 6.380426649774e-10 ||r(i)||/||b|| 2.257348088061e-08<br> 26 KSP preconditioned resid norm 6.380426649795e-10 true resid norm 6.380426653788e-10 ||r(i)||/||b|| 2.257348089481e-08<br> 27 KSP preconditioned resid norm 6.380426649795e-10 true resid norm 6.380426646744e-10 ||r(i)||/||b|| 2.257348086989e-08<br> 28 KSP preconditioned resid norm 6.380426649795e-10 true resid norm 6.380426650818e-10 ||r(i)||/||b|| 2.257348088430e-08<br> 29 KSP preconditioned resid norm 6.380426649795e-10 true resid norm 6.380426649518e-10 ||r(i)||/||b|| 2.257348087970e-08<br> 30 KSP preconditioned resid norm 6.380426652142e-10 true resid norm 6.380426652142e-10 ||r(i)||/||b|| 2.257348088898e-08<br> 31 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426646799e-10 ||r(i)||/||b|| 2.257348087008e-08<br> 32 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426648077e-10 ||r(i)||/||b|| 2.257348087460e-08<br> 33 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426649048e-10 ||r(i)||/||b|| 2.257348087804e-08<br> 34 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426648142e-10 ||r(i)||/||b|| 2.257348087483e-08<br> 35 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651079e-10 ||r(i)||/||b|| 2.257348088522e-08<br> 36 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650433e-10 ||r(i)||/||b|| 2.257348088294e-08<br> 37 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426649765e-10 ||r(i)||/||b|| 2.257348088057e-08<br> 38 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650364e-10 ||r(i)||/||b|| 2.257348088269e-08<br> 39 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650051e-10 ||r(i)||/||b|| 2.257348088159e-08<br> 40 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651154e-10 ||r(i)||/||b|| 2.257348088549e-08<br> 41 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650246e-10 ||r(i)||/||b|| 2.257348088227e-08<br> 42 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650702e-10 ||r(i)||/||b|| 2.257348088389e-08<br> 43 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651686e-10 ||r(i)||/||b|| 2.257348088737e-08<br> 44 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650870e-10 ||r(i)||/||b|| 2.257348088448e-08<br> 45 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651208e-10 ||r(i)||/||b|| 2.257348088568e-08<br> 46 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651441e-10 ||r(i)||/||b|| 2.257348088650e-08<br> 47 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650955e-10 ||r(i)||/||b|| 2.257348088478e-08<br> 48 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650877e-10 ||r(i)||/||b|| 2.257348088451e-08<br> 49 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426651240e-10 ||r(i)||/||b|| 2.257348088579e-08<br> 50 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426650534e-10 ||r(i)||/||b|| 2.257348088329e-08<br> 51 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426648615e-10 ||r(i)||/||b|| 2.257348087651e-08<br> 52 KSP preconditioned resid norm 6.380426652141e-10 true resid norm 6.380426649523e-10 ||r(i)||/||b|| 2.257348087972e-08<br> 53 KSP preconditioned resid norm 6.380426652140e-10 true resid norm 6.380426652601e-10 ||r(i)||/||b|| 2.257348089061e-08<br> 54 KSP preconditioned resid norm 6.380426652125e-10 true resid norm 6.380427512852e-10 ||r(i)||/||b|| 2.257348393411e-08<br> 55 KSP preconditioned resid norm 6.380426651849e-10 true resid norm 6.380603444402e-10 ||r(i)||/||b|| 2.257410636701e-08<br> 56 KSP preconditioned resid norm 6.380426646751e-10 true resid norm 6.439925413105e-10 ||r(i)||/||b|| 2.278398313542e-08<br> 57 KSP preconditioned resid norm 6.380426514019e-10 true resid norm 2.674218007058e-09 ||r(i)||/||b|| 9.461186902765e-08<br> 58 KSP preconditioned resid norm 6.380425077384e-10 true resid norm 2.406759314486e-08 ||r(i)||/||b|| 8.514937691775e-07<br> 59 KSP preconditioned resid norm 6.380406171326e-10 true resid norm 3.100137288622e-07 ||r(i)||/||b|| 1.096805803957e-05<br> Linear back_fieldsplit_1_ solve did not converge due to DIVERGED_BREAKDOWN iterations 60<br></div><div><br></div><div>Any advice on steps I could take to elucidate the issue would be greatly appreciated. Thanks so much for any help in advance!</div><div><br></div><div>Best,</div><div>Colton Bryant<span> </span></div></div></div></blockquote></div><br></div></div></blockquote></div>
</div></blockquote></div><br></div></div></blockquote></div>
</div></blockquote></div><br></div></div></div></blockquote></div>
</div></blockquote></div><br></div></div></blockquote></div>
</div></blockquote></div><br></div></blockquote></div>
</div></blockquote></div><br></div></div></blockquote></div>
</div></blockquote></div><br></div></div></blockquote></div>
</div></blockquote></div><br></div></blockquote></div>
</div></blockquote></div><br></div></div></blockquote></div>