[petsc-users] Failure of MUMPS

Zhang, Junchao jczhang at mcs.anl.gov
Tue Oct 9 16:05:41 CDT 2018


OK, I found that -ksp_error_if_not_converged makes PETSc generate an error in this case.
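[Editor's note: the same behavior can be requested programmatically with KSPSetErrorIfNotConverged(), which is what the command-line option toggles. A minimal sketch; the wrapper name SolveOrAbort is mine and error checking is abbreviated.]

/* Make KSPSolve() generate a PETSc error instead of returning quietly
   with a diverged reason (e.g. after a failed MUMPS factorization). */
#include <petscksp.h>

PetscErrorCode SolveOrAbort(KSP ksp, Vec b, Vec x)
{
  PetscErrorCode ierr;

  ierr = KSPSetErrorIfNotConverged(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr); /* now errors out on divergence */
  return 0;
}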

--Junchao Zhang


On Tue, Oct 9, 2018 at 3:38 PM Junchao Zhang <jczhang at mcs.anl.gov> wrote:
I ran into a case where MUMPS returned an out-of-memory error code but PETSc kept running.  When PETSc calls MUMPS, it checks if (A->erroriffailure). I added -mat_error_if_failure, but it had no effect because the flag is later overwritten by MatSetErrorIfFailure(pc->pmat,pc->erroriffailure).
Does this suggest we should add a new option, -pc_factor_error_if_failure, and check it in PCSetFromOptions_Factor()?
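[Editor's note: for concreteness, a minimal sketch of what the proposed option could look like inside PCSetFromOptions_Factor(), mirroring how other -pc_factor_* options are registered with PetscOptionsBool(). This is a sketch of the proposal, not existing PETSc code; the option name comes from the message above, and the exact function signature and placement are assumptions in the style of 3.10-era PETSc internals.]

#include <petsc/private/pcimpl.h> /* private header: exposes pc->erroriffailure */

static PetscErrorCode PCSetFromOptions_Factor(PetscOptionItems *PetscOptionsObject, PC pc)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* ... existing -pc_factor_* option handling ... */
  ierr = PetscOptionsBool("-pc_factor_error_if_failure",
                          "Generate an error if the factorization fails, e.g. MUMPS runs out of memory",
                          "MatSetErrorIfFailure",
                          pc->erroriffailure, &pc->erroriffailure, NULL);CHKERRQ(ierr);
  /* PCSetUp() would then pass the flag down via
     MatSetErrorIfFailure(pc->pmat, pc->erroriffailure), as noted above. */
  PetscFunctionReturn(0);
}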

--Junchao Zhang

On Fri, Oct 5, 2018 at 8:12 PM Zhang, Hong <hzhang at mcs.anl.gov> wrote:
Mike:
Hello PETSc team:

I am trying to solve a PDE problem with high-order finite elements. The matrices get denser as the element order increases, and in my experience MUMPS simply outperforms iterative solvers.

For certain problems, MUMPS just fails in the middle for no clear reason. I wonder if there is any suggestion to improve the robustness of MUMPS? Or, in general, any suggestion for iterative solvers with very high-order finite elements?

What error message do you get when MUMPS fails? Out of memory, zero pivoting, or something else?
 Hong
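[Editor's note: for readers hitting this thread, MUMPS reports its global status in INFOG(1); negative values are errors, and -9 (main workspace too small) is the common out-of-memory code, usually relieved by raising ICNTL(14), e.g. -mat_mumps_icntl_14 50. A minimal sketch of querying it through PETSc's documented MUMPS interface; the helper name is mine and error checking is abbreviated.]

#include <petscksp.h>

PetscErrorCode ReportMumpsStatus(KSP ksp)
{
  PC             pc;
  Mat            F;
  PetscInt       infog1;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCFactorGetMatrix(pc, &F);CHKERRQ(ierr);       /* factored matrix held by the PC */
  ierr = MatMumpsGetInfog(F, 1, &infog1);CHKERRQ(ierr); /* INFOG(1): 0 = success, < 0 = error */
  if (infog1 < 0) {
    ierr = PetscPrintf(PETSC_COMM_WORLD, "MUMPS failed: INFOG(1)=%D\n", infog1);CHKERRQ(ierr);
  }
  return 0;
}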