[petsc-users] about parallel preconditioned matrix-free gmres

RenZhengYong renzhengyong at gmail.com
Wed Jul 4 05:24:38 CDT 2012


Hi, Matt,

Thanks a lot for your suggestions.
In the two subroutines below,

(1) int mat_vec_product_interface_problem(Mat A, Vec X, Vec Y) for the
matrix-free GMRES solver
(2) int preconditioner_mat_vec(PC pc, Vec X, Vec Y) for the shell preconditioner

I use VecScatter() and VecGetArray() to move the interface data between
processes, and the parallel GMRES solver now solves my problem successfully.
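
In case it is useful to others on the list, below is a minimal sketch of the
structure I ended up with.  It is not my actual FETI-DP code: the operator and
the preconditioner are identity placeholders, and VecScatterCreateToAll() here
stands in for the scatter that in the real code collects only the interface
data from the neighbouring subdomains.  The calling sequences are the ones I
use with the petsc-3.3 series; please check them against your PETSc version.

#include <petscksp.h>

typedef struct {
  VecScatter scatter;   /* communicates interface data between processes */
  Vec        xlocal;    /* sequential work vector filled by the scatter  */
} InterfaceCtx;

/* (1) Matrix-free operator for the interface problem: Y = A*X.
   The "operator" is just a copy so that the sketch compiles and runs. */
static PetscErrorCode mat_vec_product_interface_problem(Mat A, Vec X, Vec Y)
{
  InterfaceCtx   *ctx;
  PetscScalar    *xarr;
  PetscErrorCode  ierr;

  ierr = MatShellGetContext(A, (void**)&ctx);CHKERRQ(ierr);
  /* Bring the remote entries of X that this process needs into xlocal. */
  ierr = VecScatterBegin(ctx->scatter, X, ctx->xlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx->scatter, X, ctx->xlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  /* Raw access to the gathered values; the real FETI-DP operator would
     use them to build its local contribution to Y. */
  ierr = VecGetArray(ctx->xlocal, &xarr);CHKERRQ(ierr);
  ierr = VecRestoreArray(ctx->xlocal, &xarr);CHKERRQ(ierr);
  ierr = VecCopy(X, Y);CHKERRQ(ierr);   /* placeholder for the real mat-vec */
  return 0;
}

/* (2) Shell preconditioner: Y = M^{-1}*X (identity placeholder). */
static PetscErrorCode preconditioner_mat_vec(PC pc, Vec X, Vec Y)
{
  PetscErrorCode ierr;
  ierr = VecCopy(X, Y);CHKERRQ(ierr);
  return 0;
}

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PC             pc;
  InterfaceCtx   ctx;
  PetscInt       N = 100, nloc;  /* N: global size of the interface problem */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Distributed right-hand side and solution vectors. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &b);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);
  ierr = VecGetLocalSize(b, &nloc);CHKERRQ(ierr);

  /* Scatter that gathers the whole vector on every process; in the real
     code this would be restricted to the subdomain interface. */
  ierr = VecScatterCreateToAll(b, &ctx.scatter, &ctx.xlocal);CHKERRQ(ierr);

  /* Shell matrix wrapping the user mat-vec. */
  ierr = MatCreateShell(PETSC_COMM_WORLD, nloc, nloc, N, N, &ctx, &A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A, MATOP_MULT, (void(*)(void))mat_vec_product_interface_problem);CHKERRQ(ierr);

  /* Parallel GMRES with the shell preconditioner. */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCSHELL);CHKERRQ(ierr);
  ierr = PCShellSetApply(pc, preconditioner_mat_vec);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = VecScatterDestroy(&ctx.scatter);CHKERRQ(ierr);
  ierr = VecDestroy(&ctx.xlocal);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

With this layout the only places that need explicit communication are the two
shell callbacks; the rest of GMRES (orthogonalization, norms, updates) is
already parallel through the ordinary Vec operations, as you pointed out.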


Thanks a lot as usual.
Zhengyong



On Sat, Jun 30, 2012 at 1:30 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Fri, Jun 29, 2012 at 7:25 PM, RenZhengYong <renzhengyong at gmail.com> wrote:
>
>> Dear PETSc team,
>>
>> Using the uniprocessor complex-valued version of PETSc, I recently got a
>> FETI-DP domain decomposition approach working for the 3D electromagnetic
>> induction (earth) problem.  The number of iterations for the interface
>> problem seems to scale well with the number of subdomains.
>>
>> To do this, I wrote two subroutines for PETSc:
>>
>> (1) int mat_vec_product_interface_problem(Mat A, Vec X, Vec Y) for the
>> matrix-free GMRES solver
>> (2) int preconditioner_mat_vec(PC pc, Vec X, Vec Y) for the shell
>> preconditioner.
>>
>> Now I want to solve the interface problem with a parallel GMRES solver so
>> that I can solve really large-scale problems.  Could you please tell me the
>> easiest way to accomplish this, and which specific PETSc data structures
>> should be used?  I have been using PETSc for 3.5 years, and I really want
>> to try the real MPI-based parallel PETSc.
>>
>
> 1) The solver logic should be parallel already since it only uses calls to
> Vec or Mat functions. The problems will be
>     in building data structures.
>
> 2) It looks like your two items above are the things to be parallelized
>
> 3) Decide how to partition the problem
>
> 4) Use VecScatter() to communicate data along the interface of your
> partitions
>
> I don't think we can give better advice than that without more specific
> questions. Note that there is
> a current effort to put BDDC into PETSc. You can see it in petsc-dev, as
> PCBDDC.
>
>   Thanks,
>
>     Matt
>
>
>> Thanks in advance.
>> Have a nice weekend
>> Zhengyong
>>
>>
>> --
>> Zhengyong Ren
>> AUG Group, Institute of Geophysics
>> Department of Geosciences, ETH Zurich
>> NO H 47 Sonneggstrasse 5
>> CH-8092, Zürich, Switzerland
>> Tel: +41 44 633 37561
>> e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch
>> Gmail: renzhengyong at gmail.com
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>



-- 
Zhengyong Ren
AUG Group, Institute of Geophysics
Department of Geosciences, ETH Zurich
NO H 47 Sonneggstrasse 5
CH-8092, Zürich, Switzerland
Tel: +41 44 633 37561
e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch
Gmail: renzhengyong at gmail.com