[petsc-users] Why use MATMPIBAIJ?

Matthew Knepley knepley at gmail.com
Fri Jan 22 04:15:48 CST 2016


On Fri, Jan 22, 2016 at 3:40 AM, Hoang Giang Bui <hgbk2008 at gmail.com> wrote:

> Hi Matt
> I would rather like to set the block size for block P2 too. Why?
>
> Because in one of my tests (for a problem involving only [u_x u_y u_z]),
> GMRES + Hypre AMG converges in 50 steps with block size 3, whereas it
> takes 140 steps with block size 1 (see attached files).
>

You can still do that. It can be done with options once the decomposition
is working. It's true that these solvers work better with the block size
set. However, if it's the P2 Laplacian, it does not really matter since
it's uncoupled.

> This gives me the impression that AMG will give a better inversion of the
> "P2" block if I can set its block size to 3. Of course it's still a
> hypothesis, but worth trying.
>
> Another question: in one of the PETSc presentations, you said the Hypre
> AMG does not scale well because the setup cost amortizes the iterations.
> How is it quantified? And what is the memory overhead?
>

I said the Hypre setup cost is not scalable, but it can be amortized over
the iterations. You can quantify this just by looking at the PCSetUp time
as you increase the number of processes. I don't think they have a good
model for the memory usage, and if they do, I do not know what it is.
However, Hypre generally takes more memory than agglomeration MG like ML
or GAMG.
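
As a rough sketch of how to measure that (the ksp, b, x names below are
placeholders, not from your code): wrap the setup and solve in separate
logging stages, run with -log_view (or -log_summary on older PETSc), and
compare the PCSetUp time as you increase the number of processes.

  /* Sketch: separate log stages so the timing report shows setup and
     solve separately; compare PCSetUp across process counts. */
  PetscLogStage setup_stage, solve_stage;

  PetscLogStageRegister("Setup", &setup_stage);
  PetscLogStageRegister("Solve", &solve_stage);

  PetscLogStagePush(setup_stage);
  KSPSetUp(ksp);                /* PCSetUp is triggered here */
  PetscLogStagePop();

  PetscLogStagePush(solve_stage);
  KSPSolve(ksp, b, x);          /* the iterations that amortize the setup */
  PetscLogStagePop();

Run the same per-process problem on increasing process counts; if the
Setup stage grows while Solve stays roughly flat, that is the
non-scalable part.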

  Thanks,

    Matt


>
> Giang
>
> On Mon, Jan 18, 2016 at 5:25 PM, Jed Brown <jed at jedbrown.org> wrote:
>
>> Hoang Giang Bui <hgbk2008 at gmail.com> writes:
>>
>> > Why is P2/P2 not for co-located discretization?
>>
>> Matt typed "P2/P2" when he meant "P2/P1".
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
