The Clique-PETSc interface does not work yet. <div>Things that still need to be done:</div><div>- reuse of the symbolic factorization (Jack added this feature several weeks ago, but I have not had time to work on it yet)</div><div>- distributed rhs and solution vectors (not sure about Clique's status)</div>
<div><br></div><div>Jinquan: did you use Cholesky or LU from MUMPS?</div><div>For a symmetric problem, use MUMPS Cholesky.</div><div><br></div><div>Hong</div><div><div><br><br><div class="gmail_quote">On Tue, Oct 23, 2012 at 4:14 PM, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>></span> wrote:<br>
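As a concrete sketch of the MUMPS Cholesky suggestion above (a hypothetical invocation; the exact option names depend on your PETSc version, and PETSc must be configured with MUMPS support, e.g. --download-mumps):

```shell
# Hypothetical example program name (myapp); select a Cholesky
# factorization backed by MUMPS entirely at runtime:
./myapp -ksp_type preonly \
        -pc_type cholesky \
        -pc_factor_mat_solver_package mumps
```

With -ksp_type preonly the Krylov loop is skipped and the factored matrix is applied directly, which is the usual setup for a pure direct solve.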
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Tue, Oct 23, 2012 at 2:48 PM, Jinquan Zhong <span dir="ltr"><<a href="mailto:jzhong@scsolutions.com" target="_blank">jzhong@scsolutions.com</a>></span> wrote:<br>
<div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div><div><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1f497d">>> Soil-structure interaction.</span></p></div></div></blockquote><div><br></div><div>
Why is it dense? Is it effectively the solution of another equation? An integral operator?</div><div class="im"><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div><div>
<div><p class="MsoNormal"> </p></div></div></div><div><p class="MsoNormal"></p>
<div>
<p class="MsoNormal">The amount of fill depends on the size of the minimal vertex separators. Sparse matrices with the same number of nonzeros and the same number of nonzeros per row can have vertex separators that differ in size by orders of magnitude. The fill
is quadratic in the size of the separators, and the computation is cubic.</p>
</div>
</div><div>
<p class="MsoNormal"><span style="color:#1f497d">>> Could you be more specific? I am not quite with you yet.</span></p></div></blockquote><div><br></div></div><div>A 10x10x10000 3D problem with hex elements has about 1M dofs and about 27 nonzeros per row. The minimal vertex separator consists of 10^2 vertices, so the final dense matrix is 100x100. The direct solver is extremely fast for this problem.</div>
<div><br></div><div>A 100x100x100 3D problem has the same number of dofs and nonzeros per row. The minimal vertex separator is 100^2 vertices, so the final dense matrix is 10000x10000 (and there are many pretty big dense matrices to get there). This problem requires on the order of 10000 times as much memory and 1000000 times as many flops.</div>
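The grid comparison above can be checked with a back-of-the-envelope sketch (assuming, per the earlier discussion, memory ~ s**2 and flops ~ s**3 in the separator size s; the grid dimensions are the ones from the two examples):

```python
# Rough nested-dissection cost model for a structured 3D grid:
# the top-level minimal vertex separator is a cross-section plane,
# fill for the final dense matrix scales like s**2, and
# factorization flops scale like s**3.

def separator_size(nx, ny, nz):
    # The separator is a plane normal to the longest axis, so its
    # size is the product of the two smaller grid dimensions.
    dims = sorted((nx, ny, nz))
    return dims[0] * dims[1]

s_thin = separator_size(10, 10, 10000)   # 100   -> 100x100 dense matrix
s_cube = separator_size(100, 100, 100)   # 10000 -> 10000x10000 dense matrix

mem_ratio = (s_cube / s_thin) ** 2       # ~10^4 times the memory
flop_ratio = (s_cube / s_thin) ** 3      # ~10^6 times the flops
print(s_thin, s_cube, mem_ratio, flop_ratio)
```

Both grids have the same 1M dofs and the same nonzeros per row; only the separator size differs, and it alone drives the cost gap.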
<div><br></div><div>If we switch from a FEM discretization to an FD discretization with a stencil width of 3, the vertex separator grows by a factor of 3, increasing memory usage by a factor of 9 and flops by a factor of 27. If you replace that high-order system with high-order continuous Galerkin FEM using larger elements so that the number of dofs stays constant, the number of nonzeros in the matrix may grow, but the vertex separators go back to being the same as in the original problem.</div>
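The stencil-width arithmetic follows from the same scaling law (a sketch, assuming memory ~ s**2 and flops ~ s**3 in the separator size s):

```python
# Widening the FD stencil triples the thickness of the vertex
# separator; the quadratic/cubic scaling then gives the cost growth.
growth = 3                   # separator grows by a factor of 3
mem_factor = growth ** 2     # 9x the memory
flop_factor = growth ** 3    # 27x the flops
print(mem_factor, flop_factor)
```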
<div class="im">
<div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div>
<p class="MsoNormal">Is your problem symmetric?</p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1f497d">>> Yes.</span></p></div></blockquote></div></div><br><div>Hong, is the Clique code in petsc-dev ready to use?</div>
<div><br></div><div>There is no reason we should keep spending time dealing with MUMPS quirks and scalability problems on symmetric problems.</div>
</blockquote></div><br></div></div>