[petsc-users] Howto use pARMS
Jed Brown
jed at 59A2.org
Wed Mar 23 22:17:50 CDT 2011
On Thu, Mar 24, 2011 at 06:04, Gong Ding <gdiso at ustc.edu> wrote:
>
> I see. pARMS first does domain decomposition and local ILU in each domain.
> Thanks.
>
> And how do I set the PC to pARMS in code instead of with a command-line argument?
>
Just like any other PC:
PCSetType(pc,PCPARMS);
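
For context, a minimal sketch of where that call fits when setting up the solver
(error checking and matrix assembly omitted; the variable names are only illustrative):

  KSP ksp;
  PC  pc;
  KSPCreate(PETSC_COMM_WORLD,&ksp);
  /* ... KSPSetOperators() with your assembled matrix ... */
  KSPGetPC(ksp,&pc);
  PCSetType(pc,PCPARMS);
  KSPSetFromOptions(ksp);  /* -pc_parms_* command-line options can still override */
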
>
> The scaling of hypre's parallel ILU seems bad.
> I think ILU from PaStiX can reuse many components of a sparse direct solver,
> such as the elimination tree, supernode structure, and parallel task schedule.
> It may have better performance.
>
Possible, but ILU "stays sparse" compared to full LU, which can take advantage
of dense operations and more computation per communication.
>
>
> 2011/3/23 Gong Ding <gdiso at ustc.edu>
>
>> ILU is a powerful preconditioner, especially for my problem.
>> However, PETSc only has a simple serial version.
>> Hypre's parallel ILU does not scale well.
>> I hope pARMS works fine.
>>
>
> Note that pARMS is not doing parallel ILU. To see the options, run with
>
> -pc_type parms -help | grep parms
>
> This one is especially relevant:
>
> -pc_parms_global <RAS> (choose one of) RAS SCHUR BJ (PCPARMSSetGlobal)
>
> Note that SCHUR is nonlinear by default, so you should use -ksp_type fgmres.
>
>
>
>> BTW, PaStiX announced that it supports parallel ILU (the kass module).
>> Is it possible to call the PaStiX kass module from PETSc?
>>
>
> That is not implemented, but someone could update the module.
>
> Note that -pc_type hypre -pc_hypre_type euclid is also a parallel ILU.
>
>
>
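
Regarding the SCHUR and -ksp_type fgmres note quoted above, the same choices can be
made in code rather than on the command line. A minimal sketch, reusing the ksp and pc
from the earlier snippet (PCPARMSSetGlobal is the routine named in the -help output;
the PC_PARMS_GLOBAL_SCHUR enum value follows PETSc naming conventions, so check the
PCPARMSSetGlobal man page to confirm the exact spelling):

  PCSetType(pc,PCPARMS);
  PCPARMSSetGlobal(pc,PC_PARMS_GLOBAL_SCHUR);  /* global method: Schur complement */
  KSPSetType(ksp,KSPFGMRES);                   /* SCHUR is nonlinear by default, so use a flexible Krylov method */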