[petsc-users] Very poor speed up performance

Matthew Knepley knepley at gmail.com
Wed Dec 22 19:03:52 CST 2010


On Wed, Dec 22, 2010 at 10:11 AM, Yongjun Chen <yjxd.chen at gmail.com> wrote:

>
>
> On Wed, Dec 22, 2010 at 6:53 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>
>> On Wed, 22 Dec 2010, Yongjun Chen wrote:
>>
>> > On Wed, Dec 22, 2010 at 6:32 PM, Satish Balay <balay at mcs.anl.gov>
>> wrote:
>> >
>> > Thanks a lot, Satish. It is much clearer now. But for the choice of the
>> > two, the program dmidecode does not show this information. Do you know
>> > any way to get it?
>>
>> why do you expect dmidecode to show that?
>>
>> You'll have to look for the CPU/chipset hardware documentation - and
>> look at the details - and sometimes they mention these details..
>>
>> Satish
>>
>
>
> Thanks, Satish. Yes, I need to check it.
> Just now I re-configured PETSc with the option --with-device=ch3:nemesis.
> The results are almost the same as with --with-device=ch3:sock, as can be
> seen in the attachment.
> I hope the matrix partitioning / reordering algorithm will have some
> positive effect.
>

1) To see a large gain, the ordering you start with would have to be very
bad. Maybe it is. These orderings try to minimize bandwidth, which means
minimizing communication in the MatMult.
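To make point 1) concrete, here is a small, self-contained sketch (not from the original thread; it uses SciPy's reverse Cuthill-McKee rather than anything PETSc-specific) showing how a bandwidth-minimizing reordering shrinks the bandwidth of a badly numbered matrix:

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(A):
    """Half-bandwidth of the sparsity pattern: max |i - j| over nonzeros."""
    A = A.tocoo()
    return int(np.abs(A.row - A.col).max())

n = 30
# Tridiagonal (path-graph) pattern, then scramble the unknown numbering
# to imitate a badly ordered input matrix.
i = np.arange(n - 1)
rows = np.concatenate([i, i + 1])
cols = np.concatenate([i + 1, i])
perm = np.random.RandomState(0).permutation(n)
A_bad = coo_matrix((np.ones(2 * (n - 1)), (perm[rows], perm[cols])),
                   shape=(n, n)).tocsr()

# Reverse Cuthill-McKee recovers a narrow-band ordering.
p = reverse_cuthill_mckee(A_bad, symmetric_mode=True)
A_rcm = A_bad[p][:, p]

print("bandwidth before:", bandwidth(A_bad))
print("bandwidth after RCM:", bandwidth(A_rcm))
```

In a parallel MatMult, off-band nonzeros correspond to entries of the vector owned by other processes, so a narrow band directly means less communication.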

2) If you use incomplete factorization, the ordering can have a large effect
on conditioning, and thus on the number of iterations, which does not
improve scalability. It would impact scalability if you used a parallel IC;
however, all those packages already reorder your matrix.
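Point 2) can be illustrated by how strongly the ordering drives factorization fill: for an incomplete factorization, more potential fill means more dropped entries and a weaker preconditioner. A small sketch (not from the thread; it uses SciPy's complete sparse LU as a stand-in, with the fill-reducing column ordering disabled so only the effect of the numbering is visible):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import splu

n = 100
# Tridiagonal 1D Laplacian: in its natural ordering it factors with no fill.
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()

# The same operator under a random relabeling of the unknowns.
perm = np.random.RandomState(1).permutation(n)
P = A[perm][:, perm].tocsc()

# Factor both with the natural column ordering so the fill generated is
# attributable to the row/column numbering itself.
lu_nat = splu(A.tocsc(), permc_spec="NATURAL")
lu_bad = splu(P, permc_spec="NATURAL")

print("nnz(L+U), natural ordering:", lu_nat.nnz)
print("nnz(L+U), scrambled ordering:", lu_bad.nnz)
```

An incomplete factorization applied to the scrambled matrix must drop most of that extra fill, which is exactly where the ordering-dependent conditioning of the preconditioned system comes from.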

In short, I suspect this will not help a lot, except maybe with
conditioning, which is what I was referring to in the quote.

    Matt

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener
