<div dir="ltr">That sounds right,<div><br></div><div>See the docs and examples at <a href="https://petsc.org/release/docs/manualpages/PC/PCFieldSplitSetIS/">https://petsc.org/release/docs/manualpages/PC/PCFieldSplitSetIS/</a></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Mar 17, 2023 at 1:26 PM Christopher, Joshua via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div class="msg-8455683325234103984">




<div dir="ltr">
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">
Hi Barry,</div>
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">

Thank you for your response. I'm a little confused about the relationship between the IS integer values and the matrix indices. From https://petsc.org/release/src/snes/tutorials/ex70.c.html it looks like my IS should just contain a list of the rows for each split? For example, suppose I have a 100x100 matrix with two fields, "rho" and "phi", where the first 50 rows correspond to the "rho" variable and the last 50 correspond to the "phi" variable. Then I should call PCFieldSplitSetIS twice, first with an IS containing the integers 0-49 and then with an IS containing the integers 50-99? And PCFieldSplitSetIS is expecting global row numbers, correct?

My matrix is organized as one block after another.
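
In code, I am picturing something like this minimal sketch for the serial case (assuming pc is the preconditioner obtained from the KSP with KSPGetPC, and that the split names "rho" and "phi" are free to choose):

/* Two fields stored one block after the other:
   rows 0-49 are "rho", rows 50-99 are "phi". */
IS is_rho, is_phi;

PetscCall(ISCreateStride(PETSC_COMM_SELF, 50, 0, 1, &is_rho));  /* rows 0..49  */
PetscCall(ISCreateStride(PETSC_COMM_SELF, 50, 50, 1, &is_phi)); /* rows 50..99 */

PetscCall(PCFieldSplitSetIS(pc, "rho", is_rho));
PetscCall(PCFieldSplitSetIS(pc, "phi", is_phi));

PetscCall(ISDestroy(&is_rho));
PetscCall(ISDestroy(&is_phi));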
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">
Thank you,</div>
<div style="font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0);background-color:rgb(255,255,255)">
Joshua</div>
<div id="m_-8455683325234103984appendonsend"></div>
<hr style="display:inline-block;width:98%">
<div id="m_-8455683325234103984divRplyFwdMsg" dir="ltr"><font face="Calibri, sans-serif" style="font-size:11pt" color="#000000"><b>From:</b> Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>><br>
<b>Sent:</b> Tuesday, March 14, 2023 1:35 PM<br>
<b>To:</b> Christopher, Joshua <<a href="mailto:jchristopher@anl.gov" target="_blank">jchristopher@anl.gov</a>><br>
<b>Cc:</b> <a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a> <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br>
<b>Subject:</b> Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG</font>
<div> </div>
</div>

  You definitely do not need to use a complicated DM to take advantage of PCFIELDSPLIT. All you need to do is create two IS on each MPI process: the first should list all the indices of the degrees of freedom of your first type of variable, and the second should list all the remaining degrees of freedom. Then use https://petsc.org/release/docs/manualpages/PC/PCFieldSplitSetIS/

  Barry

Note: PCFIELDSPLIT does not care how you have ordered the degrees of freedom of the two types. You might interlace them, or you might put all of the first type on an MPI process followed by all of the second type. This just determines what your IS look like.
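
For instance, if the two unknowns were interlaced instead (rho on even global rows, phi on odd ones), the same information could be expressed with strided IS. A sketch, with A the assembled matrix and pc as before, assuming each process owns an even number of rows starting at an even global row:

/* rstart/rend are this process's contiguous row range. */
PetscInt rstart, rend, nlocal;
IS       is_rho, is_phi;

PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
nlocal = (rend - rstart) / 2;

PetscCall(ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart,     2, &is_rho)); /* even rows */
PetscCall(ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart + 1, 2, &is_phi)); /* odd rows  */

PetscCall(PCFieldSplitSetIS(pc, "rho", is_rho));
PetscCall(PCFieldSplitSetIS(pc, "phi", is_phi));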
<blockquote type="cite">
<div>On Mar 14, 2023, at 1:14 PM, Christopher, Joshua via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:</div>
<br>
<div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Hello PETSc users,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I haven't heard back from the library developer regarding the numbering issue or my questions on using field split operators with their library, so I need to fix this myself.</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Regarding the natural numbering vs parallel numbering: I haven't figured out what is wrong here. I stepped through in parallel and it looks like each processor is setting up the matrix and calling MatSetValue similar to what is shown in<span> </span><a href="https://petsc.org/release/src/ksp/ksp/tutorials/ex2.c.html" id="m_-8455683325234103984LPlnk784808" target="_blank">https://petsc.org/release/src/ksp/ksp/tutorials/ex2.c.html</a>.
 I see that PETSc is recognizing my simple two-processor test from the output ("PetscInitialize_Common(): PETSc successfully started: number of processors = 2"). I'll keep poking at this, however I'm very new to PETSc. When I print the matrix to ASCII using
 PETSC_VIEWER_DEFAULT, I'm guessing I see one row per line, and the tuples consists of the column number and value?<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
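
That is, output along these general lines, where each tuple would be (column, value) (a made-up 2x2 illustration of the format, not my actual matrix):

Mat Object: 2 MPI processes
  type: mpiaij
row 0: (0, 4.)  (1, -1.)
row 1: (0, -1.)  (1, 4.)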
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
On the FieldSplit preconditioner, is my understanding here correct:<br>
<br>
To use FieldSplit, I must have a DM. Since I have an unstructured mesh, I must use DMPlex and set up the chart and covering relations specific to my mesh following here:<span> </span><a href="https://petsc.org/release/docs/manual/dmplex/" id="m_-8455683325234103984LPlnk603044" target="_blank">https://petsc.org/release/docs/manual/dmplex/</a>.
 I think this may be very time-consuming for me to set up.<span> </span><br>
<br>
Currently, I already have a matrix stored in a parallel sparse L-D-U format. I am converting into PETSc's sparse parallel AIJ matrix (traversing my matrix and using MatSetValues). The weights for my discretization scheme are already accounted for in the coefficients
 of my L-D-U matrix. I do have the submatrices in L-D-U format for each of my two equations' coupling with each other. That is, the equivalent of lines 242,251-252,254 of example 28<span> </span><a href="https://petsc.org/release/src/snes/tutorials/ex28.c.html" id="m_-8455683325234103984LPlnk807111" target="_blank">https://petsc.org/release/src/snes/tutorials/ex28.c.html</a>.
 Could I directly convert my submatrices into PETSc's sub-matrix here, then assemble things together so that the field split preconditioners will work?<br>
<br>
Alternatively, since my L-D-U matrices already account for the discretization scheme, can I use a simple structured grid DM?</div>
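
A rough sketch of what I have in mind (A00, A01, A10, A11 are hypothetical names for the four AIJ blocks I would fill from my L-D-U data with MatSetValues, and ksp is my solver):

/* Glue the four blocks into one operator and reuse its row
   index sets for the field splits. */
Mat subs[4] = {A00, A01, A10, A11};
Mat A;
IS  rows[2];
PC  pc;

PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, subs, &A));
PetscCall(MatNestGetISs(A, rows, NULL)); /* row index sets of the two blocks */

PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCFIELDSPLIT));
PetscCall(PCFieldSplitSetIS(pc, "rho", rows[0]));
PetscCall(PCFieldSplitSetIS(pc, "phi", rows[1]));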
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Thank you so much for your help!</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Regards,<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Joshua<br>
</div>
<div id="m_-8455683325234103984x_appendonsend" style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
</div>
<hr style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;display:inline-block;width:1026.05px">
<span style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;float:none;display:inline"></span>
<div id="m_-8455683325234103984x_divRplyFwdMsg" dir="ltr" style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
<font face="Calibri, sans-serif" style="font-size:11pt"><b>From:</b><span> </span>Pierre Jolivet <<a href="mailto:pierre@joliv.et" target="_blank">pierre@joliv.et</a>><br>
<b>Sent:</b><span> </span>Friday, March 3, 2023 11:45 AM<br>
<b>To:</b><span> </span>Christopher, Joshua <<a href="mailto:jchristopher@anl.gov" target="_blank">jchristopher@anl.gov</a>><br>
<b>Cc:</b><span> </span><a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a><span> </span><<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br>
<b>Subject:</b><span> </span>Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG</font>
<div> </div>
</div>
<div style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
For full disclosure, with -ksp_pc_side right -ksp_max_it 100 -ksp_rtol 1E-10:
<div>1) with renumbering via ParMETIS</div>
<div>-pc_type bjacobi -sub_pc_type lu -sub_pc_factor_mat_solver_type mumps => Linear solve converged due to CONVERGED_RTOL iterations 10</div>
<div>-pc_type hypre -pc_hypre_boomeramg_relax_type_down l1-Gauss-Seidel -pc_hypre_boomeramg_relax_type_up backward-l1-Gauss-Seidel => Linear solve converged due to CONVERGED_RTOL iterations 55</div>
<div>2) without renumbering via ParMETIS</div>
<div>-pc_type bjacobi => Linear solve did not converge due to DIVERGED_ITS iterations 100</div>
<div>-pc_type hypre => Linear solve did not converge due to DIVERGED_ITS iterations 100</div>
<div>Using on outer fieldsplit may help fix this.</div>
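
If you need to reproduce the renumbering inside your code, one possible sketch using PETSc's MatPartitioning interface (A being your assembled matrix; I have not tested this exact fragment):

MatPartitioning part;
IS              ispart, isnum, isrows;
Mat             Aperm;

PetscCall(MatPartitioningCreate(PETSC_COMM_WORLD, &part));
PetscCall(MatPartitioningSetAdjacency(part, A));
PetscCall(MatPartitioningSetType(part, MATPARTITIONINGPARMETIS));
PetscCall(MatPartitioningApply(part, &ispart));       /* new owning rank of each row */
PetscCall(ISPartitioningToNumbering(ispart, &isnum)); /* new global row numbering    */
PetscCall(ISInvertPermutation(isnum, PETSC_DECIDE, &isrows));
PetscCall(ISSort(isrows));
PetscCall(MatCreateSubMatrix(A, isrows, isrows, MAT_INITIAL_MATRIX, &Aperm));

The right-hand side would need the same permutation, e.g. via VecGetSubVector with isrows.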

Thanks,
Pierre
<blockquote type="cite">
<div>On 3 Mar 2023, at 6:24 PM, Christopher, Joshua via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:</div>
<br>
<div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I am solving these equations in the context of electrically-driven fluid flows as that first paper describes. I am using a PIMPLE scheme to advance the fluid equations in time, and my goal is to do a coupled solve of the electric equations similar to what is
 described in this paper:<span> </span><a href="https://www.sciencedirect.com/science/article/pii/S0045793019302427" id="m_-8455683325234103984LPNoLPOWALinkPreview" target="_blank">https://www.sciencedirect.com/science/article/pii/S0045793019302427</a>. They are
 using the SIMPLE scheme in this paper. My fluid flow should eventually reach steady behavior, and likewise the time derivative in the charge density should trend towards zero. They preferred using BiCGStab with a direct LU preconditioner for solving their
 electric equations. I tried to test that combination, but my case is halting for unknown reasons in the middle of the PETSc solve. I'll try with more nodes and see if I am running out of memory, but the computer is a little overloaded at the moment so it may
 take a while to run.</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
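
The combination I tried was along these lines (MUMPS being just one possible factorization package):

-ksp_type bcgs -pc_type lu -pc_factor_mat_solver_type mumps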
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I sent Pierre Jolivet my matrix and RHS, and they said the matrix does not appear to be following a parallel numbering, and instead looks like the matrix has natural numbering. When they renumbered the system with ParMETIS they got really fast convergence.
 I am using PETSc through a library, so I will reach out to the library authors and see if there is an issue in the library.</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Thank you,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Joshua</div>
<div id="m_-8455683325234103984x_x_appendonsend" style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
</div>
<hr style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;display:inline-block;width:1596.41px">
<span style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;float:none;display:inline"></span>
<div id="m_-8455683325234103984x_x_divRplyFwdMsg" dir="ltr" style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
<font face="Calibri, sans-serif" style="font-size:11pt"><b>From:</b><span> </span>Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>><br>
<b>Sent:</b><span> </span>Thursday, March 2, 2023 3:47 PM<br>
<b>To:</b><span> </span>Christopher, Joshua <<a href="mailto:jchristopher@anl.gov" target="_blank">jchristopher@anl.gov</a>><br>
<b>Cc:</b><span> </span><a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a><span> </span><<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br>
<b>Subject:</b><span> </span>Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG</font>
<div> </div>
</div>
<div style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
<div><br>
</div>
<div><br>
</div>
<div><br>
</div>
<span id="m_-8455683325234103984x_cid:DC7076F2-AD4E-4A63-B82D-8C82ED983531"><Untitled.png></span><br>
<div><br>
</div>

  Are you solving this as a time-dependent problem? Using an implicit scheme (like backward Euler) for rho? In ODE language, solving the differential algebraic equation?

Is epsilon bounded away from 0?
<blockquote type="cite">
<div>On Mar 2, 2023, at 4:22 PM, Christopher, Joshua <<a href="mailto:jchristopher@anl.gov" target="_blank">jchristopher@anl.gov</a>> wrote:</div>
<br>
<div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Hi Barry and Mark,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Thank you for looking into my problem. The two equations I am solving with PETSc are equations 6 and 7 from this paper:<a href="https://ris.utwente.nl/ws/portalfiles/portal/5676495/Roghair+Paper_final_draft_v1.pdf" id="m_-8455683325234103984LPlnk149320" target="_blank">https://ris.utwente.nl/ws/portalfiles/portal/5676495/Roghair+Paper_final_draft_v1.pdf</a></div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I just used MUMPS and SuperLU_DIST on my full-size problem (with 3,000,000 unknowns). To clarify, I did a direct solve with -ksp_type preonly. They take a very long time, about 30 minutes for MUMPS and 18 minutes for SuperLU_DIST, see attached output. For reference,
 the same matrix took 658 iterations of BoomerAMG and about 20 seconds of walltime. Maybe I am already getting a great deal with BoomerAMG!<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I'll try removing some terms from my solve (e.g. removing the second equation, then making the second equation just the elliptic portion of the equation, etc.) and try with a simpler geometry. I'll keep you updated as I run into troubles with that route. I
 wasn't aware of Field Split preconditioners, I'll do some reading on them and give them a try as well.<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Thank you again,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Joshua<br>
</div>
<div id="m_-8455683325234103984x_x_x_appendonsend" style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
</div>
<hr style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;display:inline-block;width:1013.31px">
<span style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;float:none;display:inline"></span>
<div id="m_-8455683325234103984x_x_x_divRplyFwdMsg" dir="ltr" style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
<font face="Calibri, sans-serif" style="font-size:11pt"><b>From:</b><span> </span>Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>><br>
<b>Sent:</b><span> </span>Thursday, March 2, 2023 7:47 AM<br>
<b>To:</b><span> </span>Christopher, Joshua <<a href="mailto:jchristopher@anl.gov" target="_blank">jchristopher@anl.gov</a>><br>
<b>Cc:</b><span> </span><a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a><span> </span><<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br>
<b>Subject:</b><span> </span>Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG</font>
<div> </div>
</div>
<div style="font-family:Helvetica;font-size:18px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">
<div><br>
</div>
  Have you tried MUMPS (or SuperLU_DIST) on the full-size problem with the 5,000,000 unknowns? It is at the high end of problem sizes you can do with direct solvers but is worth comparing with  BoomerAMG. You likely want to use more nodes and fewer cores per
 node with MUMPs to be able to access more memory. If you are needing to solve multiple right hand sides but with the same matrix the factors will be reused resulting in the second and later solves being much faster.
<div><br>
</div>
<div>  I agree with Mark, with iterative solvers you are likely to end up with PCFIELDSPLIT.</div>
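
A starting point for that could be sketched entirely from the command line, for example (the split names and inner solver choices here are only illustrative, and the IS for each split still has to be provided in code with PCFieldSplitSetIS):

-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type multiplicative
  -fieldsplit_rho_ksp_type preonly -fieldsplit_rho_pc_type hypre
  -fieldsplit_phi_ksp_type preonly -fieldsplit_phi_pc_type hypre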

  Barry
<blockquote type="cite">
<div>On Mar 1, 2023, at 7:17 PM, Christopher, Joshua via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:</div>
<br>
<div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Hello,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
I am trying to solve the leaky-dielectric model equations with PETSc using a second-order discretization scheme (with limiting to first order as needed) using the finite volume method. The leaky dielectric model is a coupled system of two equations, consisting
 of a Poisson equation and a convection-diffusion equation.  I have tested on small problems with simple geometry (~1000 DoFs) using:<br>
<br>
-ksp_type gmres<span> </span><br>
-pc_type hypre<span> </span><br>
-pc_hypre_type boomeramg<br>
<br>
and I get RTOL convergence to 1.e-5 in about 4 iterations. I tested this in parallel with 2 cores, but also previously was able to use successfully use a direct solver in serial to solve this problem. When I scale up to my production problem, I get significantly
 worse convergence. My production problem has ~3 million DoFs, more complex geometry, and is solved on ~100 cores across two nodes. The boundary conditions change a little because of the geometry, but are of the same classifications (e.g. only Dirichlet and
 Neumann). On the production case, I am needing 600-4000 iterations to converge. I've attached the output from the first solve that took 658 iterations to converge, using the following output options:<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
-ksp_view_pre
<div>-ksp_view</div>
<div>-ksp_converged_reason</div>
<div>-ksp_monitor_true_residual</div>
-ksp_test_null_space</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
My matrix is non-symmetric, the condition number can be around 10e6, and the eigenvalues reported by PETSc have been real and positive (using -ksp_view_eigenvalues).<span> </span><br>
<br>
I have tried using other preconditions (superlu, mumps, gamg, mg) but hypre+boomeramg has performed the best so far. The literature seems to indicate that AMG is the best approach for solving these equations in a coupled fashion.</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Do you have any advice on speeding up the convergence of this system?<span> </span><br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
<br>
</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Thank you,</div>
<div style="font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none;font-family:Calibri,Arial,Helvetica,sans-serif;font-size:12pt;background-color:rgb(255,255,255)">
Joshua<br>
</div>
<span id="m_-8455683325234103984x_x_x_x_cid:B8340654-41D9-48B9-80BA-1D987A4CB56A"><petsc_gmres_boomeramg.txt></span></div>

<petsc_preonly_mumps.txt>
<petsc_preonly_superlu.txt>