> Must call DMShellSetGlobalVector() or DMShellSetCreateGlobalVector()
> [0]PETSC ERROR: #1 DMCreateGlobalVector_Shell() at /Users/markadams/Codes/petsc/src/dm/impls/shell/dmshell.c:210

  It looks like you have built a DMSHELL? You need to teach it how to generate global vectors, since yours currently does not.
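[A minimal sketch of the two ways to do that, assuming the DMSHELL is built explicitly; the callback, the local size, and the SetupShell wrapper are hypothetical, while DMShellCreate(), DMShellSetCreateGlobalVector(), and DMShellSetGlobalVector() are the real entry points:]

    #include <petscdmshell.h>

    /* Hypothetical callback: create a global vector sized for this level */
    static PetscErrorCode MyCreateGlobalVector(DM dm, Vec *v)
    {
      PetscInt nlocal = 100; /* placeholder: local dof count for this level */

      PetscFunctionBeginUser;
      PetscCall(VecCreateMPI(PetscObjectComm((PetscObject)dm), nlocal, PETSC_DETERMINE, v));
      PetscCall(VecSetDM(*v, dm));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

    /* Hypothetical setup: give the shell either a creation callback or a template vector */
    static PetscErrorCode SetupShell(MPI_Comm comm, Vec templateVec, DM *shell)
    {
      PetscFunctionBeginUser;
      PetscCall(DMShellCreate(comm, shell));
      PetscCall(DMShellSetCreateGlobalVector(*shell, MyCreateGlobalVector));
      /* ... or hand it a vector it can duplicate: */
      PetscCall(DMShellSetGlobalVector(*shell, templateVec));
      PetscFunctionReturn(PETSC_SUCCESS);
    }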
  Barry

On May 9, 2023, at 5:40 PM, Mark Adams <mfadams@lbl.gov> wrote:

> On Tue, May 9, 2023 at 3:01 PM Barry Smith <bsmith@petsc.dev> wrote:
>
>> On May 9, 2023, at 12:32 PM, Mark Adams <mfadams@lbl.gov> wrote:
>>
>>> I have a MG hierarchy that I construct manually with DMRefine and DMPlexExtrude.
>>>
>>> * The solver works great with cheby/sor, but with cheby/sor it converges slowly or I get indefinite PC errors from CG. And the eigen estimates in cheby are really high, like 10-15.
>>
>>   So with Cheby/SOR it works great, but with the exact same options Cheby/SOR behaves poorly? Are you using some quantum computer at NERSC?
>
> It turned out that I had the sign wrong on my Laplacian point function, so the matrix was negative definite. I'm not sure exactly what was happening, but it is behaving somewhat better now.
>
> It looks like my prolongation operator is garbage: the coarse grid correction does nothing (cg/jacobi converges in a little less than the number of MG iterations times the sum of pre- and post-smoothing steps), and the row sums of P are not 1.
>
> Not sure what is going on there, but it is probably related to the DM hierarchy not being constructed correctly....
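[A quick way to test that last observation, as a minimal sketch using only standard Mat/Vec calls: for a Lagrange basis, interpolating the constant function is exact, so P applied to the all-ones coarse vector should return the all-ones fine vector, i.e. unit row sums (rows touched by Dirichlet elimination may legitimately differ):]

    #include <petscmat.h>

    static PetscErrorCode CheckProlongationRowSums(Mat P) /* P maps coarse -> fine */
    {
      Vec       xc, xf;
      PetscReal err;

      PetscFunctionBeginUser;
      PetscCall(MatCreateVecs(P, &xc, &xf)); /* xc: coarse (domain), xf: fine (range) */
      PetscCall(VecSet(xc, 1.0));
      PetscCall(MatMult(P, xc, xf));
      PetscCall(VecShift(xf, -1.0));
      PetscCall(VecNorm(xf, NORM_INFINITY, &err)); /* should be ~0 */
      PetscCall(PetscPrintf(PetscObjectComm((PetscObject)P), "max |row sum of P - 1| = %g\n", (double)err));
      PetscCall(VecDestroy(&xc));
      PetscCall(VecDestroy(&xf));
      PetscFunctionReturn(PETSC_SUCCESS);
    }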
>>> * I tried turning galerkin=none and I got this error.
>>
>>   This is because without Galerkin it needs to restrict the current solution and then compute the coarse grid Jacobian. Since you did not provide a DM that has the ability to even generate coarse grid vectors, the process cannot work. You need a DM that can provide the coarse grid vectors and restrict solutions. Did you forget to pass a DM to the solver?
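[For contrast, the two standard PCMG coarse-operator modes; with "none", PCMG rediscretizes on each level and, under SNES, restricts the current solution via DMRestrictHook_SNESVecSol, which is the frame that fails in the trace below:]

    -pc_type mg -pc_mg_galerkin both   # coarse operators formed algebraically as R A P;
                                       # only the interpolation matrices are needed
    -pc_type mg -pc_mg_galerkin none   # coarse operators rediscretized on every level;
                                       # each level DM must create vectors and restrict solutions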
> The DM does everything. This is similar to many examples, but I've been checking against snes/tutorials/ex12.c today.
>
> I do call:
>
>     PetscCall(DMSetCoarseDM(dmhierarchy[r], dmhierarchy[r-1]));
>
> But I am missing something else that goes on in DMRefineHierarchy, which I can't use because I am semi-coarsening.
>
> I probably have to build a section on each DM or something, but I have bigger fish to fry at this point.
>
> (I construct a 2D coarse grid, refine it a number of times, and DMPlexExtrude each refinement by the same amount (number of layers and distance); the extruded direction is wrapped around a torus and made periodic.
> The fine grid now looks like, and will eventually be, the grids that tokamak codes use.)
>
> Thanks,
> Mark
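[A sketch of the construction and wiring described above, under stated assumptions: NLEVELS, nlayers, and length are made-up parameters, the discretization setup at the marked step is hypothetical, and DMPlexExtrude's argument list has changed across releases (the order below follows the 3.19-era man page), so verify it against your version:]

    #include <petscdmplex.h>

    #define NLEVELS 4                       /* assumption: number of MG levels */
    static const PetscInt  nlayers = 16;    /* assumption: extruded layers, same on every level */
    static const PetscReal length  = 1.0;   /* assumption: extruded distance, same on every level */

    static PetscErrorCode BuildHierarchy(DM coarse2d, DM dmhierarchy[NLEVELS])
    {
      DM dm2d[NLEVELS];

      PetscFunctionBeginUser;
      /* refine the 2D base in-plane only (semi-coarsening) */
      dm2d[0] = coarse2d;
      for (PetscInt r = 1; r < NLEVELS; ++r) PetscCall(DMRefine(dm2d[r - 1], PetscObjectComm((PetscObject)dm2d[r - 1]), &dm2d[r]));
      /* extrude every level identically, periodic in the extruded direction */
      for (PetscInt r = 0; r < NLEVELS; ++r) PetscCall(DMPlexExtrude(dm2d[r], nlayers, length, PETSC_TRUE, PETSC_FALSE, PETSC_TRUE, NULL, NULL, &dmhierarchy[r]));
      /* ... add the discretization to dmhierarchy[0] here (e.g. DMAddField + DMCreateDS) ... */
      /* part of what DMRefineHierarchy would have done: share the discretization,
         so each level DM can create vectors, and link each level to its coarse DM */
      for (PetscInt r = 1; r < NLEVELS; ++r) {
        PetscCall(DMCopyDisc(dmhierarchy[0], dmhierarchy[r]));
        PetscCall(DMSetCoarseDM(dmhierarchy[r], dmhierarchy[r - 1]));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }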
>>> Any thoughts on either of these issues?
>>>
>>> Thanks,
>>> Mark
>>>
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Must call DMShellSetGlobalVector() or DMShellSetCreateGlobalVector()
>>> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
>>> [0]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) source: command line
>>> [0]PETSC ERROR: Option left: name:-mg_levels_esteig_ksp_type value: cg source: command line
>>> [0]PETSC ERROR: Option left: name:-mg_levels_pc_type value: sor source: command line
>>> [0]PETSC ERROR: Option left: name:-options_left (no value) source: command line
>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.19.1-224-g9ed82936d20 GIT Date: 2023-05-07 12:33:48 -0400
>>> [0]PETSC ERROR: ./ex96 on a arch-macosx-gnu-O named MarksMac-302.local by markadams Tue May 9 12:26:52 2023
>>> [0]PETSC ERROR: Configure options CFLAGS="-g -Wall" CXXFLAGS="-g -Wall" COPTFLAGS=-O CXXOPTFLAGS=-O --with-cc=/usr/local/opt/llvm/bin/clang --with-cxx=/usr/local/opt/llvm/bin/clang++ --download-mpich --with-strict-petscerrorcode --download-triangle=1 --with-x=0 --with-debugging=0 --download-hdf5=1 PETSC_ARCH=arch-macosx-gnu-O
>>> [0]PETSC ERROR: #1 DMCreateGlobalVector_Shell() at /Users/markadams/Codes/petsc/src/dm/impls/shell/dmshell.c:210
>>> [0]PETSC ERROR: #2 DMCreateGlobalVector() at /Users/markadams/Codes/petsc/src/dm/interface/dm.c:1022
>>> [0]PETSC ERROR: #3 DMGetNamedGlobalVector() at /Users/markadams/Codes/petsc/src/dm/interface/dmget.c:377
>>> [0]PETSC ERROR: #4 DMRestrictHook_SNESVecSol() at /Users/markadams/Codes/petsc/src/snes/interface/snes.c:649
>>> [0]PETSC ERROR: #5 DMRestrict() at /Users/markadams/Codes/petsc/src/dm/interface/dm.c:3407
>>> [0]PETSC ERROR: #6 PCSetUp_MG() at /Users/markadams/Codes/petsc/src/ksp/pc/impls/mg/mg.c:1074
>>> [0]PETSC ERROR: #7 PCSetUp() at /Users/markadams/Codes/petsc/src/ksp/pc/interface/precon.c:994
>>> [0]PETSC ERROR: #8 KSPSetUp() at /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:406
>>> [0]PETSC ERROR: #9 KSPSolve_Private() at /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:824
>>> [0]PETSC ERROR: #10 KSPSolve() at /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:1070
>>> [0]PETSC ERROR: #11 SNESSolve_KSPONLY() at /Users/markadams/Codes/petsc/src/snes/impls/ksponly/ksponly.c:48
>>> [0]PETSC ERROR: #12 SNESSolve() at /Users/markadams/Codes/petsc/src/snes/interface/snes.c:4663
>>> [0]PETSC ERROR: #13 main() at ex96.c:433