# [petsc-users] Setting up MUMPS in PETSc

Jed Brown jedbrown at mcs.anl.gov
Tue Oct 23 16:14:01 CDT 2012

```
On Tue, Oct 23, 2012 at 2:48 PM, Jinquan Zhong <jzhong at scsolutions.com> wrote:

> >> Soil-structure interaction.
>

Why is it dense? Is it effectively the solution of another equation? An
integral operator?

> The amount of fill depends on the size of minimal vertex separators.
> Sparse matrices with the same number of nonzeros and same number of
> nonzeros per row can have vertex separators that are orders of magnitude
> different in size. The fill is quadratic in the size of the separators and
> computation is cubic.
>
> >> Could you be more specific?  I am not quite with you yet.
>

A 10x10x10000 3D problem with hex elements has about 1M dofs and about 27
nonzeros per row. The minimal vertex separator consists of 10^2 vertices,
so the final dense matrix is 100x100. The direct solver is extremely fast
for this problem.

A 100x100x100 3D problem has the same number of dofs and nonzeros per row.
The minimal vertex separator is 100^2 vertices, so the final dense matrix
is 10000x10000 (and there are many pretty big dense matrices to get there).
This problem requires on the order of 10000 times as much memory and
1000000 times as many flops.

If we switch from a FEM discretization to an FD discretization with a
stencil width of 3, the vertex separator grows by a factor of 3, increasing
memory usage by a factor of 9 and flops by a factor of 27. If you replace
that high-order system with high-order continuous Galerkin FEM using larger
elements so that the number of dofs stays constant, the number of nonzeros
in the matrix may grow, but the vertex separators go back to the same size
as in the original problem.

> >> Yes.
>

Hong, is the Clique code in petsc-dev ready to use?

There is no reason we should keep spending time dealing with MUMPS quirks
and scalability problems on symmetric problems.
```
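The cost argument in the message above can be sketched as a back-of-envelope model (a rough illustration of the reasoning, not MUMPS's actual symbolic analysis): for a topmost vertex separator of size s, the associated dense frontal matrix holds on the order of s² entries and costs on the order of s³ flops to factor.

```python
# Back-of-envelope cost model for sparse direct factorization,
# parameterized only by the size of the topmost vertex separator.
# Fill is quadratic in the separator size; computation is cubic.
# This is an illustrative sketch, not MUMPS's real analysis phase.

def separator_cost(sep_size):
    """Return (memory, flops) estimates for the topmost dense frontal matrix."""
    return sep_size**2, sep_size**3

# 10 x 10 x 10000 grid (~1M dofs): minimal separator is a 10 x 10 cross-section.
mem_thin, flops_thin = separator_cost(10 * 10)

# 100 x 100 x 100 grid (same dof count): minimal separator is 100 x 100.
mem_cube, flops_cube = separator_cost(100 * 100)

print(mem_cube // mem_thin)      # 10000: ~10^4 times the memory
print(flops_cube // flops_thin)  # 1000000: ~10^6 times the flops

# Widening an FD stencil to width 3 triples the separator,
# so memory grows by 3^2 = 9 and flops by 3^3 = 27.
mem_wide, flops_wide = separator_cost(3 * 100 * 100)
print(mem_wide // mem_cube, flops_wide // flops_cube)  # 9 27
```

This reproduces the ratios quoted in the message: same dof count and nonzeros per row, but separator sizes differing by 100x lead to 10^4 more memory and 10^6 more flops.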