[petsc-users] Setting up MUMPS in PETSc
Jinquan Zhong
jzhong at scsolutions.com
Tue Oct 23 18:17:55 CDT 2012
Thanks, Jed.
Is there any way to get the condition number? I used
-ksp_type gmres -ksp_monitor_singular_value -ksp_gmres_restart 1000
but it didn’t work.
Jinquan
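For reference: the GMRES singular-value estimates must be requested before the solve, and they exist only when the KSP actually iterates; with a pure direct solve (-ksp_type preonly -pc_type lu) there are no iterations, so the monitor has nothing to print. A minimal C sketch, assuming ksp, b, and x are already set up (the function name is illustrative and error checking is omitted):

  #include <petscksp.h>

  /* Sketch: estimate cond(A) from the extreme singular values of the
     Krylov Hessenberg matrix built by GMRES.  CHKERRQ calls omitted. */
  PetscErrorCode EstimateConditionNumber(KSP ksp, Vec b, Vec x)
  {
    PetscReal emax, emin;

    /* Must be requested before KSPSolve(), or no estimates are kept */
    KSPSetComputeSingularValues(ksp, PETSC_TRUE);
    KSPSolve(ksp, b, x);
    KSPComputeExtremeSingularValues(ksp, &emax, &emin);
    PetscPrintf(PETSC_COMM_WORLD, "estimated condition number: %g\n",
                (double)(emax / emin));
    return 0;
  }

Dividing the largest by the smallest estimate gives the same condition-number estimate that -ksp_monitor_singular_value prints at each iteration; the estimate tightens as the Krylov basis grows, which is why a large -ksp_gmres_restart helps.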
From: petsc-users-bounces at mcs.anl.gov On Behalf Of Jed Brown
Sent: Tuesday, October 23, 2012 3:28 PM
To: PETSc users list
Subject: Re: [petsc-users] Setting up MUMPS in PETSc
On Tue, Oct 23, 2012 at 5:21 PM, Jinquan Zhong <jzhong at scsolutions.com> wrote:
That is new for me. What would you suggest, Matt?
Were you using LU or Cholesky before? That is the difference between -pc_type lu and -pc_type cholesky. Use -pc_factor_mat_solver_package mumps to choose MUMPS. You can access the MUMPS options with -mat_mumps_icntl_<number>, where <number> is the ICNTL index from the MUMPS manual.
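For example, a typical run line might look like this (a sketch, assuming PETSc was configured with MUMPS, e.g. --download-mumps; ./ex2 stands in for your executable, and ICNTL(14) is MUMPS's percentage increase in estimated working space):

  ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_14 30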
It looks like PETSc's Clique interface does not work in parallel. (A student was working on it recently and it seems to work in serial.) When it is fixed to work in parallel, that is almost certainly what you should use. Alternatively, there may well be a fast method, depending on the structure of the system and that fat dense block.