<div dir="ltr">Well, I think you can do Neumann problems with ex13 with a small code change:<div><br></div><div>** run on a unit cube/square with: -dm_plex_box_upper 1,1,1 -dm_plex_box_lower 0,0,0<div>** use an SVD coarse-grid solver: -mg_coarse_pc_type svd<br><div>** change one line in this example to get a Neumann problem: around line 97, comment this out: //PetscCall(DMAddBoundary(dm, DM_BC_ESSENTIAL, "wall", label, 1, &id, 0, 0, NULL, (void (*)(void))trig_u, NULL, user, NULL));</div><div><br></div><div>This is what I see, and the solution:</div><div><br></div><div>(new_py-env) 14:46 86 adams/gamg-fast-filter *= ~/Codes/petsc/src/snes/tests$ ./ex13 -dm_plex_dim 2 -benchmark_it 0 -dm_plex_simplex 1 -dm_plex_box_faces 2,2,1 -dm_refine 2 -petscpartitioner_simple_node_grid 1,1,1 -petscpartitioner_simple_process_grid 2,2,1 -potential_petscspace_degree 2 -petscpartitioner_type simple -snes_type ksponly -dm_view -ksp_type cg -ksp_rtol 1e-12 -snes_lag_jacobian -2 -dm_plex_box_upper 1,1 -dm_plex_box_lower 0,0 -pc_type gamg -pc_gamg_process_eq_limit 200 -pc_gamg_coarse_eq_limit 1000 -pc_gamg_esteig_ksp_type cg -mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05 -pc_gamg_reuse_interpolation true -pc_gamg_aggressive_square_graph true -pc_gamg_threshold 0.04 -pc_gamg_threshold_scale .25 -pc_gamg_aggressive_coarsening 2 -pc_gamg_mis_k_minimum_degree_ordering true -ksp_monitor -ksp_norm_type unpreconditioned -dm_view hdf5:sol.h5 -potential_view hdf5:sol.h5::append -ksp_converged_reason -mg_coarse_pc_type svd<br>Number equations N = 289<br>  0 KSP Residual norm 2.775536542048e+00<br>  1 KSP Residual norm 2.631487127409e+00<br>  2 KSP Residual norm 5.145745091100e-01<br>  3 KSP Residual norm 9.031411480985e-02<br>  4 KSP Residual norm 1.120911726117e-02<br>  5 KSP Residual norm 1.869571597808e-03<br>  6 KSP Residual norm 2.861476924898e-04<br>  7 KSP Residual norm 6.168949367531e-05<br>  8 KSP Residual norm 1.098972695713e-05<br>  9 KSP Residual norm 1.510979924319e-06<br> 10 KSP Residual norm 1.958281010810e-07<br> 11 KSP Residual norm 2.648997408740e-08<br> 12 KSP Residual norm 3.191829696292e-09<br> 13 KSP Residual norm 4.003245254269e-10<br> 14 KSP Residual norm 6.645229161624e-11<br> 15 KSP Residual norm 1.055754552827e-11<br> 16 KSP Residual norm 9.104749948055e-13<br> Linear solve converged due to CONVERGED_RTOL iterations 16<br></div><div><br></div></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Sep 18, 2023 at 12:38 PM Mark Adams <<a href="mailto:mfadams@lbl.gov">mfadams@lbl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">There are several, but I am updating <a href="https://gitlab.com/petsc/petsc/-/merge_requests/6875/diffs#diff-content-5a7dc636e85ce214f0a7dcf710f45f1e6800c0d4" target="_blank"><strong style="box-sizing:border-box">src/snes/tests/ex13.c</strong></a> now, in branch <a title="adams/gamg-fast-filter" href="https://gitlab.com/petsc/petsc/-/tree/adams/gamg-fast-filter" target="_blank">adams/gamg-fast-filter</a>. 
But main is fine (it should get merged very soon).<div>It does what you want and scales forever, but it does not have <span style="color:rgb(14,16,26)">Neumann BCs.</span></div><div><span style="color:rgb(14,16,26)">I might be able to add that, but not today.</span></div><div><span style="color:rgb(14,16,26)"><br></span></div><div><span style="color:rgb(14,16,26)">Mark</span></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Sep 18, 2023 at 11:16 AM Khurana, Parv <<a href="mailto:p.khurana22@imperial.ac.uk" target="_blank">p.khurana22@imperial.ac.uk</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>
<div lang="EN-GB">
<div>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">Hello PETSc users,<u></u><u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)"><u></u> <u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">Thank you for this very active community of users and the mailing list.<u></u><u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)"><u></u> <u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">I am looking for a PETSc example that solves the Poisson equation on a 2D domain using FEM (or HO-FEM if possible). I would like the following:</span></p>
<ol style="margin-top:0cm" start="1" type="1">
<li class="MsoNormal" style="color:rgb(14,16,26)"><span>The example should be formulated fully in PETSc and solved with KSP objects in PETSc.</span></li><li class="MsoNormal" style="color:rgb(14,16,26)"><span>The problem should scale up to a few hundred processors (ideally 1000 procs).</span></li><li class="MsoNormal" style="color:rgb(14,16,26)"><span>Ideally on a unit square with either square or triangle element discretization.</span></li><li class="MsoNormal" style="color:rgb(14,16,26)"><span>There should be an option to specify Dirichlet/Neumann-type BCs on the boundaries.</span></li></ol>
<p class="MsoNormal"><span style="color:rgb(14,16,26)"><u></u> <u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">I was wondering if someone could point me to such an example, as I am relatively new to PETSc and am trying to avoid reinventing the wheel. I have had a look at Examples 29, 32, 50,
and 66 in the PETSc tutorials; while they are very close to what I need, I am not sure whether they scale to a few hundred processors. Furthermore, I am aware that I can formulate such a problem with FEniCS/Firedrake with relative ease, and these packages interface
with PETSc quite well. However, I am just trying to see whether such an application already exists purely in PETSc.</span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)"><u></u> <u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">Thanks, and Best<u></u><u></u></span></p>
<p class="MsoNormal"><span style="color:rgb(14,16,26)">Parv<u></u><u></u></span></p>
<p class="MsoNormal"><u></u> <u></u></p>
</div>
</div>
</div></blockquote></div>
</blockquote></div>