<div dir="ltr"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><span class="gmail-im" style="font-size:12.8px"><br>> Partially unrelated, PC block-jacobi fails with MFFD type not supported, but additive schwarz with 0 overlap, which I think is identical, works fine. Is this a bug?<br><br></span><span style="font-size:12.8px"> Huh, is this related to hypre, or plan PETSc? Please send all information, command line options etc that reproduce the problem, preferably on a PETSc example.</span></blockquote><div><br></div><div> Unrelated to hypre, pure petsc. Using:</div><div><br></div><div>SNESSetJacobian(ctx.snes, ctx.JPre, ctx.JPre, SNESComputeJacobianDefaultColor, fdcoloring);<br></div><div><br></div><div>and -snes_mf_operator,</div><div><br></div><div>-pc_type asm works as expected, ksp_view:</div><div><br></div><div><div><div>PC Object: 32 MPI processes</div><div> type: asm</div><div> total subdomain blocks = 32, amount of overlap = 1</div><div> restriction/interpolation type - RESTRICT</div><div> Local solve is same for all blocks, in the following KSP and PC objects:</div><div> KSP Object: (sub_) 1 MPI processes</div><div> type: preonly</div><div> maximum iterations=10000, initial guess is zero</div><div> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.</div><div> left preconditioning</div><div> using NONE norm type for convergence test</div><div> PC Object: (sub_) 1 MPI processes</div><div> type: ilu</div><div> out-of-place factorization</div><div> 0 levels of fill</div><div> tolerance for zero pivot 2.22045e-14</div><div> matrix ordering: natural</div><div> factor fill ratio given 1., needed 1.</div><div> Factored matrix follows:</div><div> Mat Object: 1 MPI processes</div><div> type: seqaij</div><div> rows=3600, cols=3600</div><div> package used to perform factorization: petsc</div><div> total: nonzeros=690000, allocated nonzeros=690000</div><div> total number of mallocs used during MatSetValues calls =0</div><div> using I-node routines: found 720 nodes, limit used is 5</div><div> linear system matrix = precond matrix:</div><div> Mat Object: 1 MPI processes</div><div> type: seqaij</div><div> rows=3600, cols=3600</div><div> total: nonzeros=690000, allocated nonzeros=690000</div><div> total number of mallocs used during MatSetValues calls =0</div><div> using I-node routines: found 720 nodes, limit used is 5</div><div> linear system matrix followed by preconditioner matrix:</div><div> Mat Object: 32 MPI processes</div><div> type: mffd</div><div> rows=76800, cols=76800</div><div> Matrix-free approximation:</div><div> err=1.49012e-08 (relative error in function evaluation)</div><div> Using wp compute h routine</div><div> Does not compute normU</div><div> Mat Object: 32 MPI processes</div><div> type: mpiaij</div><div> rows=76800, cols=76800</div><div> total: nonzeros=16320000, allocated nonzeros=16320000</div><div> total number of mallocs used during MatSetValues calls =0</div></div></div><div><br></div><div><br></div><div>-pc_type bjacobi:</div><div><br></div><div><div>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div><div>[0]PETSC ERROR: No support for this operation for this object type</div><div>[0]PETSC ERROR: Not coded for this matrix type</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: 
Petsc Release Version 3.8.1, Nov, 04, 2017 </div><div>[0]PETSC ERROR: maDG on a arch-linux2-c-opt named templeton by mlohry Wed Nov 15 08:53:20 2017</div><div>[0]PETSC ERROR: Configure options PETSC_DIR=/home/mlohry/dev/build/external/petsc PETSC_ARCH=arch-linux2-c-opt --with-cc=mpicc --with-cxx=mpic++ --with-fc=0 --with-clanguage=C++ --with-pic=1 --with-debugging=1 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 --with-shared-libraries=1 --download-parmetis --download-metis --download-hypre=yes --download-superlu_dist=yes --with-64-bit-indices</div><div><div>[0]PETSC ERROR: #1 MatGetDiagonalBlock() line 307 in /home/mlohry/dev/build/external/petsc/src/mat/interface/matrix.c</div><div>[0]PETSC ERROR: #2 PCSetUp_BJacobi() line 119 in /home/mlohry/dev/build/external/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c</div><div>[0]PETSC ERROR: #3 PCSetUp() line 924 in /home/mlohry/dev/build/external/petsc/src/ksp/pc/interface/precon.c</div><div>[0]PETSC ERROR: #4 KSPSetUp() line 381 in /home/mlohry/dev/build/external/petsc/src/ksp/ksp/interface/itfunc.c</div><div>[0]PETSC ERROR: #5 KSPSolve() line 612 in /home/mlohry/dev/build/external/petsc/src/ksp/ksp/interface/itfunc.c</div><div>[0]PETSC ERROR: #6 SNESSolve_NEWTONLS() line 224 in /home/mlohry/dev/build/external/petsc/src/snes/impls/ls/ls.c</div><div>[0]PETSC ERROR: #7 SNESSolve() line 4108 in /home/mlohry/dev/build/external/petsc/src/snes/interface/snes.c</div><div>[0]PETSC ERROR: #8 TS_SNESSolve() line 176 in /home/mlohry/dev/build/external/petsc/src/ts/impls/implicit/theta/theta.c</div><div>[0]PETSC ERROR: #9 TSStep_Theta() line 216 in /home/mlohry/dev/build/external/petsc/src/ts/impls/implicit/theta/theta.c</div><div>[0]PETSC ERROR: #10 TSStep() line 4120 in /home/mlohry/dev/build/external/petsc/src/ts/interface/ts.c</div><div>[0]PETSC ERROR: #11 TSSolve() line 4373 in /home/mlohry/dev/build/external/petsc/src/ts/interface/ts.c</div></div></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Nov 15, 2017 at 8:47 AM, Smith, Barry F. <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class=""><br>
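
For reference, the Jacobian setup above (-snes_mf_operator together with that SNESSetJacobian call) should be roughly equivalent to the explicit sketch below. This is simplified and hypothetical: SetupJacobian is a made-up wrapper, and JPre/fdcoloring stand for the assembled preconditioning matrix and the MatFDColoring built from its sparsity pattern in my code.

    #include <petscsnes.h>

    /* Sketch: explicit form of -snes_mf_operator, i.e. a matrix-free (MFFD)
       operator paired with an assembled preconditioning matrix whose entries
       are recomputed by finite-difference coloring at each Newton step.
       JPre and fdcoloring are assumed to be created elsewhere. */
    PetscErrorCode SetupJacobian(SNES snes, Mat JPre, MatFDColoring fdcoloring)
    {
      Mat            Jmf;
      PetscErrorCode ierr;

      /* MFFD matrix that applies the Jacobian by differencing the residual;
         it picks up its base vector from the SNES automatically */
      ierr = MatCreateSNESMF(snes, &Jmf);CHKERRQ(ierr);

      /* operator = matrix-free Jmf, preconditioning matrix = assembled JPre */
      ierr = SNESSetJacobian(snes, Jmf, JPre,
                             SNESComputeJacobianDefaultColor, fdcoloring);CHKERRQ(ierr);
      return 0;
    }

Either way the KSP ends up with the mffd matrix as the operator and the assembled mpiaij as the preconditioning matrix, which is what the asm ksp_view shows; the bjacobi failure above appears to be MatGetDiagonalBlock() hitting the mffd matrix, which does not provide that operation.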

On Wed, Nov 15, 2017 at 8:47 AM, Smith, Barry F. <bsmith@mcs.anl.gov> wrote:
> On Nov 15, 2017, at 6:38 AM, Mark Lohry <mlohry@gmail.com> wrote:
>
> I've found ILU(0) or ILU(1) to be working well for my problem, but the PETSc implementation is serial only. Running with -pc_type hypre -pc_hypre_type pilut with default settings has considerably worse convergence. I've tried using -pc_hypre_pilut_factorrowsize (number of actual elements in row) to trick it into doing ILU(0), to no effect.
>
> Is there any way to recover classical ILU(k) from pilut?
>
> Hypre's docs state pilut is no longer supported, and Euclid should be used for anything moving forward. pc_hypre_boomeramg has options for Euclid smoothers. Any hope of a pc_hypre_type euclid?

   Not unless someone outside the PETSc team decides to put it back in.
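
(The Euclid-as-smoother route mentioned above can at least be exercised through the BoomerAMG interface. A rough sketch of the relevant options, to be checked against -help since exact option names can vary between versions:

    -pc_type hypre -pc_hypre_type boomeramg
    -pc_hypre_boomeramg_smooth_type Euclid
    -pc_hypre_boomeramg_smooth_num_levels <nlevels>

plus the -pc_hypre_boomeramg_eu_* options for the Euclid ILU fill level and drop tolerance.)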
>
>
> Partially unrelated, PC block-jacobi fails with MFFD type not supported, but additive schwarz with 0 overlap, which I think is identical, works fine. Is this a bug?

   Huh, is this related to hypre, or plain PETSc? Please send all information, command line options, etc. that reproduce the problem, preferably on a PETSc example.

Barry

>
>
> Thanks,
> Mark