On Fri, Jun 29, 2012 at 7:25 PM, RenZhengYong <span dir="ltr"><<a href="mailto:renzhengyong@gmail.com" target="_blank">renzhengyong@gmail.com</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Dear PETSc team,<br><br>Using the uniprocessor, complex-valued build of PETSc, I recently got a FETI-DP domain <br>decomposition approach working for the 3D electromagnetic induction (earth) problem. The number of iterations of <br>
the interface problem appears to scale well with the number of subdomains. <br><br>To do this, I wrote two subroutines for PETSc:<br><br>(1) int mat_vec_product_interface_problem(Mat A, Vec X, Vec Y) for the matrix-free GMRES solver<br>
(2) int preconditioner_mat_vec(PC pc,Vec X,Vec Y) for the shell preconditioner. <br clear="all"><br>Now I want to solve the interface problem with a parallel GMRES solver so that I can tackle really large-scale problems. Could you please tell me the easiest way to accomplish this, and which specific data structures of PETSc should be used? I have been using PETSc for 3.5 years, and I really want to try the real MPI-based PETSc. <br>
</blockquote><div><br></div><div>1) The solver logic should be parallel already since it only uses calls to Vec or Mat functions. The problems will be</div><div> in building data structures.</div><div><br></div><div>2) It looks like your two items above are the things to be parallelized</div>
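<div><br></div><div>For the MPI case, the same two callbacks can be attached to a distributed shell matrix and a shell preconditioner inside a parallel KSP. A minimal sketch follows; the two extern callbacks are the ones named above, everything else (SolveInterfaceProblem, nlocal, ctx) is hypothetical, and exact signatures vary slightly between PETSc versions, so check the man pages for your release:</div>

```c
#include <petscksp.h>

/* The user's matrix-free action for the interface operator (from the question) */
extern PetscErrorCode mat_vec_product_interface_problem(Mat A, Vec X, Vec Y);
/* The user's FETI-DP preconditioner application (from the question) */
extern PetscErrorCode preconditioner_mat_vec(PC pc, Vec X, Vec Y);

/* Hypothetical driver: nlocal interface unknowns owned by this rank,
   nglobal unknowns in total, ctx holds the subdomain data. */
PetscErrorCode SolveInterfaceProblem(MPI_Comm comm, PetscInt nlocal,
                                     PetscInt nglobal, void *ctx,
                                     Vec rhs, Vec sol)
{
  Mat A;
  KSP ksp;
  PC  pc;

  /* Parallel shell matrix: the MatMult is the user's matrix-free routine */
  MatCreateShell(comm, nlocal, nlocal, nglobal, nglobal, ctx, &A);
  MatShellSetOperation(A, MATOP_MULT,
                       (void (*)(void))mat_vec_product_interface_problem);

  KSPCreate(comm, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPGMRES);

  /* Shell preconditioner wrapping the user's routine */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCSHELL);
  PCShellSetContext(pc, ctx);
  PCShellSetApply(pc, preconditioner_mat_vec);

  KSPSetFromOptions(ksp);
  KSPSolve(ksp, rhs, sol);

  KSPDestroy(&ksp);
  MatDestroy(&A);
  return 0;
}
```

<div>The point is that nothing in this driver changes between serial and parallel runs; only the vectors become MPI vectors and the two callbacks must operate on each rank's local piece, communicating where needed.</div>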
<div><br></div><div>3) Decide how to partition the problem</div><div><br></div><div>4) Use VecScatter() to communicate data along the interface of your partitions</div><div><br></div><div>I don't think we can give better advice than that without more specific questions. Note that there is</div>
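<div><br></div><div>As a sketch of step 4, a VecScatter can pull the off-process interface values a rank needs out of the distributed vector into a local work vector. All names here (GatherInterfaceValues, ghost_indices) are hypothetical, and this is one of several ways to set up the communication:</div>

```c
#include <petscvec.h>

/* Hypothetical helper: gather the nghost off-process interface values
   listed (by global index) in ghost_indices into a new sequential vector. */
PetscErrorCode GatherInterfaceValues(Vec global, PetscInt nghost,
                                     const PetscInt ghost_indices[], Vec *local)
{
  IS         from;
  VecScatter scatter;

  /* Local destination vector, one slot per needed interface value */
  VecCreateSeq(PETSC_COMM_SELF, nghost, local);

  /* Global indices to pull from the distributed vector */
  ISCreateGeneral(PETSC_COMM_SELF, nghost, ghost_indices,
                  PETSC_COPY_VALUES, &from);

  /* NULL destination IS means slots 0..nghost-1 of the local vector, in order */
  VecScatterCreate(global, from, *local, NULL, &scatter);
  VecScatterBegin(scatter, global, *local, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(scatter, global, *local, INSERT_VALUES, SCATTER_FORWARD);

  VecScatterDestroy(&scatter);
  ISDestroy(&from);
  return 0;
}
```

<div>The scatter context can be created once per partition and reused inside every application of the matrix-free MatMult, which is what makes the per-iteration communication cheap.</div>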
<div>a current effort to put BDDC into PETSc. You can see it in petsc-dev, as PCBDDC.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Thanks in advance.<br>Have a nice weekend.<br>Zhengyong <br><span class="HOEnZb"><font color="#888888"><br><br>-- <br>Zhengyong Ren<br>AUG Group, Institute of Geophysics<br>Department of Geosciences, ETH Zurich<br>NO H 47 Sonneggstrasse 5<br>
CH-8092, Zürich, Switzerland<br>
Tel: +41 44 633 37561<br>e-mail: <a href="mailto:zhengyong.ren@aug.ig.erdw.ethz.ch" target="_blank">zhengyong.ren@aug.ig.erdw.ethz.ch</a><br>Gmail: <a href="mailto:renzhengyong@gmail.com" target="_blank">renzhengyong@gmail.com</a><br>
</font></span></blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>