[petsc-users] PC_SUBPC_ERROR with -pc_factor_zeropivot
Matthew Knepley
knepley at gmail.com
Fri Jul 6 17:25:03 CDT 2018
On Fri, Jul 6, 2018 at 3:16 PM Matthew Overholt <overholt at capesim.com>
wrote:
> I am working on handling very small pivot values for a very small
> percentage of my matrix (linear Ax = b solution), and I am getting an error
> that I don't understand when I run the KSPCG solver in parallel.
>
> KSPCreate(comm, &ksp)
> KSPSetTolerances(ksp, rtol, ...)
> KSPSetType(ksp, KSPCG)
> KSPSetInitialGuessNonzero(ksp, PETSC_TRUE)
> KSPSetFromOptions(ksp)
> ...
> KSPSetOperators(ksp, V, V)
> ...
> KSPSolve(ksp,...)
> KSPGetConvergedReason(ksp, &kspReason)
> if ( kspReason == KSP_DIVERGED_PCSETUP_FAILED )
> PCGetSetUpFailedReason(pc, &pcReason)
> ...
>
> Default Case (zeropivot is 2.22045E-14):
> mpiexec -n 1 ...
> ==> ksp fails due to pcReason = PC_FACTOR_NUMERIC_ZEROPIVOT
>
> Reduced pivot case, n = 1:
> mpiexec -n 1 ... -pc_factor_zeropivot 1E-15
> ==> runs successfully
>
> Reduced pivot case, n > 1:
> mpiexec -n 2 ... -pc_factor_zeropivot 1E-15
> ==> ksp fails due to pcReason = PC_SUBPC_ERROR
>
Okay, what is likely happening is that your preconditioner is actually
Block-Jacobi/LU instead of parallel LU. Thus you would need

  -sub_pc_factor_zeropivot 1e-15

if that is indeed the PC that you want.
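For example, a minimal sketch (assuming the default parallel setup of block
Jacobi with one factorization per block, and that the option is set before
KSPSetFromOptions so it is picked up when the PC is set up):

  /* equivalent to passing -sub_pc_factor_zeropivot 1e-15 on the command line */
  PetscOptionsSetValue(NULL, "-sub_pc_factor_zeropivot", "1e-15");
  KSPSetFromOptions(ksp);

Running with -ksp_view prints the actual PC hierarchy (e.g. bjacobi with an
ILU or LU sub-KSP per block), which tells you which prefix the factorization
options need.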
> What does this mean?
>
> If I use the MUMPS solver and either Cholesky or LU preconditioning
> instead, it runs fine with any number of MPI ranks,
>
Yes, since MUMPS (or SuperLU_dist) is parallel LU.
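For instance, mirroring your earlier runs (a sketch; note that the solver
selection option was renamed around PETSc 3.9, so older releases spell it
-pc_factor_mat_solver_package rather than -pc_factor_mat_solver_type):

  mpiexec -n 2 ... -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

Here -ksp_type preonly just applies the parallel factorization once instead of
wrapping it in a Krylov iteration.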
> but I'd like to be able to run the CG solver in parallel too.
>
You can use MUMPS as the preconditioner for CG, but that would not make
much sense.
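If you want to try it anyway, a sketch along the lines of the runs above
(again assuming the -pc_factor_mat_solver_type spelling):

  mpiexec -n 2 ... -ksp_type cg -pc_type cholesky -pc_factor_mat_solver_type mumps

uses a parallel MUMPS Cholesky factorization as the preconditioner for CG, but
since that factorization already solves the system essentially exactly, CG
converges in one iteration and -ksp_type preonly is the cheaper choice.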
Thanks,
Matt
> Thanks in advance,
>
> Matt Overholt
> CapeSym, Inc.
> (508) 653-7100 x204
> overholt at capesim.com
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/