[petsc-users] Matrix format mpiaij does not have a built-in PETSc XXX!
Matthew Knepley
knepley at gmail.com
Tue Feb 28 11:10:38 CST 2012
On Tue, Feb 28, 2012 at 11:06 AM, Bojan Niceno <bojan.niceno at psi.ch> wrote:
> Hi all,
>
> On 2/28/2012 5:41 PM, Matthew Knepley wrote:
>
> Look, I already replied to this, and now Jed had to reply again. If you
> are not going to read our mail, why mail the list?
>
>
> I am reading your messages, all right, but I also read PETSc's error
> messages and its manuals.
>
> Look what the manual says on ICC:
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCICC.html
>
> "Notes: Only implemented for some matrix formats. Not implemented in
> parallel."
>
>
> And on ILU:
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html
>
> "Notes: Only implemented for some matrix formats. (for parallel see
> PCHYPRE <http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCHYPRE.html#PCHYPRE> for hypre's ILU)"
>
>
> So Matt, if I understand your answer correctly, one should use PCASM to
> get ILU in parallel, right? What if I want IC?
>
Or PCBJACOBI, or use Hypre for parallel ILU, or, better yet, do not use an
unreliable preconditioner with poor scalability. How does this lead you to
conclude that SOR is the only thing you can run in parallel?
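
Concretely, something like the following should do it (a minimal, untested
sketch: it assumes an assembled symmetric MPIAIJ matrix A and vectors b and
x, uses the 3.2-era KSPSetOperators signature, and omits error checking):

  KSP ksp;
  PC  pc;
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); /* newer PETSc drops the flag */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCBJACOBI);   /* or PCASM for overlapping subdomains */
  KSPSetFromOptions(ksp);     /* subdomain solver chosen at run time */
  KSPSolve(ksp, b, x);

and then pick the solver on each sequential block from the options database:

  -sub_pc_type icc     (incomplete Cholesky on each block)
  -sub_pc_type ilu     (or ILU)

or, if your PETSc was configured with hypre,

  -pc_type hypre -pc_hypre_type euclid     (hypre's parallel ILU)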
Matt
> Thanks,
>
>
> Bojan
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener