[petsc-users] Setting up MUMPS in PETSc

Matthew Knepley knepley at gmail.com
Tue Oct 23 18:30:29 CDT 2012


On Tue, Oct 23, 2012 at 7:17 PM, Jinquan Zhong <jzhong at scsolutions.com> wrote:

>  Thanks, Jed.
>
> Any way to get the condition number?  I used
>
> -ksp_type gmres -ksp_monitor_singular_value -ksp_gmres_restart 1000
>
> It didn’t work.
>

This is not helpful. Would you understand this if someone mailed you "It
didn't work"? Send the output.

   Matt


>
> Jinquan
>
> From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Jed Brown
> Sent: Tuesday, October 23, 2012 3:28 PM
> To: PETSc users list
> Subject: Re: [petsc-users] Setting up MUMPS in PETSc
>
> On Tue, Oct 23, 2012 at 5:21 PM, Jinquan Zhong <jzhong at scsolutions.com>
> wrote:
>
> That is new for me.  What would you suggest, Matt?
>
> Were you using LU or Cholesky before? That is the difference between
> -pc_type lu and -pc_type cholesky. Use -pc_factor_mat_solver_package mumps
> to choose MUMPS. You can access the MUMPS options with
> -mat_mumps_icntl_opaquenumber.
>
> It looks like PETSc's Clique interface does not work in parallel. (A
> student was working on it recently and it seems to work in serial.) When it
> is fixed to work in parallel, that is almost certainly what you should use.
> Alternatively, there may well be a Fast method, depending on the structure
> of the system and that fat dense block.
>
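
To make the options above concrete, a minimal sketch of a MUMPS direct
solve from the command line (./ex2 is again a placeholder, and icntl 14,
the percentage increase in MUMPS's estimated working space, stands in for
whichever -mat_mumps_icntl_<n> option you actually need):

  # icntl 14 chosen only for illustration (workspace increase, in percent)
  mpiexec -n 4 ./ex2 -ksp_type preonly -pc_type lu \
      -pc_factor_mat_solver_package mumps -mat_mumps_icntl_14 30

For a symmetric positive definite system, -pc_type cholesky is the analogue.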



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

