[petsc-users] Possible to recover ILU(k) from hypre/pilut?
Mark Lohry
mlohry at gmail.com
Wed Nov 15 07:55:37 CST 2017
>
>
> > Partially unrelated, PC block-jacobi fails with MFFD type not supported, but additive Schwarz with 0 overlap, which I think is identical, works fine. Is this a bug?
>
> Huh, is this related to hypre, or plain PETSc? Please send all information, command line options, etc., that reproduce the problem, preferably on a PETSc example.
Unrelated to hypre; this is pure PETSc. I'm using

    SNESSetJacobian(ctx.snes, ctx.JPre, ctx.JPre,
                    SNESComputeJacobianDefaultColor, fdcoloring);

together with -snes_mf_operator.
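In case the fuller context helps, here is roughly the surrounding setup. This is a sketch, not a paste of my actual code: the names ctx and FormFunction are placeholders, and error checking is omitted.

    /* Sketch only: ctx and FormFunction are assumed names. */
    MatColoring   mc;
    ISColoring    iscoloring;
    MatFDColoring fdcoloring;

    /* Color the preconditioning matrix so its Jacobian can be formed
       by finite differences with few function evaluations. */
    MatColoringCreate(ctx.JPre, &mc);
    MatColoringSetType(mc, MATCOLORINGSL);
    MatColoringApply(mc, &iscoloring);
    MatColoringDestroy(&mc);

    MatFDColoringCreate(ctx.JPre, iscoloring, &fdcoloring);
    MatFDColoringSetFunction(fdcoloring,
        (PetscErrorCode (*)(void))FormFunction, &ctx);
    MatFDColoringSetUp(ctx.JPre, iscoloring, fdcoloring);
    ISColoringDestroy(&iscoloring);

    /* With -snes_mf_operator the Amat slot becomes a matrix-free (MFFD)
       operator; JPre is used only to build the preconditioner. */
    SNESSetJacobian(ctx.snes, ctx.JPre, ctx.JPre,
                    SNESComputeJacobianDefaultColor, fdcoloring);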
With -pc_type asm it works as expected; -ksp_view shows:
PC Object: 32 MPI processes
  type: asm
    total subdomain blocks = 32, amount of overlap = 1
    restriction/interpolation type - RESTRICT
    Local solve is same for all blocks, in the following KSP and PC objects:
  KSP Object: (sub_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
    left preconditioning
    using NONE norm type for convergence test
  PC Object: (sub_) 1 MPI processes
    type: ilu
      out-of-place factorization
      0 levels of fill
      tolerance for zero pivot 2.22045e-14
      matrix ordering: natural
      factor fill ratio given 1., needed 1.
        Factored matrix follows:
          Mat Object: 1 MPI processes
            type: seqaij
            rows=3600, cols=3600
            package used to perform factorization: petsc
            total: nonzeros=690000, allocated nonzeros=690000
            total number of mallocs used during MatSetValues calls =0
              using I-node routines: found 720 nodes, limit used is 5
    linear system matrix = precond matrix:
    Mat Object: 1 MPI processes
      type: seqaij
      rows=3600, cols=3600
      total: nonzeros=690000, allocated nonzeros=690000
      total number of mallocs used during MatSetValues calls =0
        using I-node routines: found 720 nodes, limit used is 5
  linear system matrix followed by preconditioner matrix:
  Mat Object: 32 MPI processes
    type: mffd
    rows=76800, cols=76800
    Matrix-free approximation:
      err=1.49012e-08 (relative error in function evaluation)
      Using wp compute h routine
      Does not compute normU
  Mat Object: 32 MPI processes
    type: mpiaij
    rows=76800, cols=76800
    total: nonzeros=16320000, allocated nonzeros=16320000
    total number of mallocs used during MatSetValues calls =0
With -pc_type bjacobi it fails:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Not coded for this matrix type
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.1, Nov, 04, 2017
[0]PETSC ERROR: maDG on a arch-linux2-c-opt named templeton by mlohry Wed Nov 15 08:53:20 2017
[0]PETSC ERROR: Configure options PETSC_DIR=/home/mlohry/dev/build/external/petsc PETSC_ARCH=arch-linux2-c-opt --with-cc=mpicc --with-cxx=mpic++ --with-fc=0 --with-clanguage=C++ --with-pic=1 --with-debugging=1 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 --with-shared-libraries=1 --download-parmetis --download-metis --download-hypre=yes --download-superlu_dist=yes --with-64-bit-indices
[0]PETSC ERROR: #1 MatGetDiagonalBlock() line 307 in /home/mlohry/dev/build/external/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #2 PCSetUp_BJacobi() line 119 in /home/mlohry/dev/build/external/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: #3 PCSetUp() line 924 in /home/mlohry/dev/build/external/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #4 KSPSetUp() line 381 in /home/mlohry/dev/build/external/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 KSPSolve() line 612 in /home/mlohry/dev/build/external/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 SNESSolve_NEWTONLS() line 224 in /home/mlohry/dev/build/external/petsc/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #7 SNESSolve() line 4108 in /home/mlohry/dev/build/external/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #8 TS_SNESSolve() line 176 in /home/mlohry/dev/build/external/petsc/src/ts/impls/implicit/theta/theta.c
[0]PETSC ERROR: #9 TSStep_Theta() line 216 in /home/mlohry/dev/build/external/petsc/src/ts/impls/implicit/theta/theta.c
[0]PETSC ERROR: #10 TSStep() line 4120 in /home/mlohry/dev/build/external/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #11 TSSolve() line 4373 in /home/mlohry/dev/build/external/petsc/src/ts/interface/ts.c
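If it's useful, I'd guess something similar could be tried on a stock PETSc example by pairing a matrix-free operator with bjacobi, along these lines (an untested guess; I haven't checked whether this trips the same error):

    cd $PETSC_DIR/src/snes/examples/tutorials
    make ex19
    mpiexec -n 2 ./ex19 -snes_mf_operator -pc_type bjacobi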
On Wed, Nov 15, 2017 at 8:47 AM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
>
> > On Nov 15, 2017, at 6:38 AM, Mark Lohry <mlohry at gmail.com> wrote:
> >
> > I've found ILU(0) or (1) to be working well for my problem, but the PETSc implementation is serial only. Running with -pc_type hypre -pc_hypre_type pilut with default settings has considerably worse convergence. I've tried using -pc_hypre_pilut_factorrowsize (number of actual elements in row) to trick it into doing ILU(0), to no effect.
> >
> > Is there any way to recover classical ILU(k) from pilut?
> >
> > Hypre's docs state pilut is no longer supported, and Euclid should be used for anything moving forward. pc_hypre_boomeramg has options for Euclid smoothers. Any hope of a pc_hypre_type euclid?
>
> Not unless someone outside the PETSc team decides to put it back in.
> >
> >
> > Partially unrelated, PC block-jacobi fails with MFFD type not supported, but additive Schwarz with 0 overlap, which I think is identical, works fine. Is this a bug?
>
> Huh, is this related to hypre, or plain PETSc? Please send all information, command line options, etc., that reproduce the problem, preferably on a PETSc example.
>
> Barry
>
> >
> >
> > Thanks,
> > Mark
>
>