[petsc-users] Hybrid MPI/OpenMP PETSc

Barry Smith bsmith at mcs.anl.gov
Wed Jul 1 18:39:00 CDT 2015


   I see.

   Barry

> On Jul 1, 2015, at 6:22 PM, W. Miah <wadud.miah at gmail.com> wrote:
> 
> Hi Barry,
> 
> Thanks for the reply. I'm looking to hybridise the TeaLeaf mini-app.
> The Imperial College group has shown some performance improvement for
> a few CFD codes, but nothing really significant. I haven't come across
> a code that improves massively with hybridisation.
> 
> TeaLeaf is just a small test - nothing significant. We'll see if it
> improves anything. Of course, I won't call MatSetValues within an
> OpenMP region :-) I asked because it is usually called within a loop
> that populates the rows of the matrix, as in the sketch below.
> 
> Best regards,
> Wadud.
> 
> On 1 July 2015 at 21:24, Barry Smith <bsmith at mcs.anl.gov> wrote:
>> 
>>> On Jul 1, 2015, at 2:26 PM, W. Miah <wadud.miah at gmail.com> wrote:
>>> 
>>> Hello,
>>> 
>>> What is the status of hybrid PETSc (MPI/OpenMP)? Imperial College
>>> developed a hybrid port, and I was wondering whether this has been
>>> merged into the PETSc trunk.
>> 
>>   At the moment we don't have any support for mixed MPI/OpenMP.
>> 
>>   What do you need it for? Do you know of cases where the performance would be much higher with mixed MPI/OpenMP? We've never seen fully open-source software where mixed MPI/OpenMP performs much better (or even any better) than pure MPI.
>> 
>> 
>>> Also, is the
>>> MatSetValues subroutine thread-safe, i.e., could it be called within
>>> an OpenMP thread region (not SIMD)?
>> 
>>   This will very likely never work.
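>> 
>>   One workaround people use (a rough sketch below, not anything PETSc
>> provides; AssembleWithThreads and the diagonal fill are purely
>> illustrative) is to thread the computation of the values and keep
>> every MatSetValues call on a single thread:
>> 
>> /* sketch: thread the numerics, serialize the insertion;
>>    the diagonal fill is a placeholder for real per-row work */
>> #include <petscmat.h>
>> 
>> PetscErrorCode AssembleWithThreads(Mat A)
>> {
>>   PetscInt     i, Istart, Iend, nloc;
>>   PetscInt    *cols;
>>   PetscScalar *vals;
>> 
>>   MatGetOwnershipRange(A, &Istart, &Iend);
>>   nloc = Iend - Istart;
>>   PetscMalloc2(nloc, &cols, nloc, &vals);
>> 
>>   /* OpenMP does the numerics: each thread writes a disjoint slice */
>>   #pragma omp parallel for
>>   for (i = 0; i < nloc; i++) {
>>     cols[i] = Istart + i;                      /* one diagonal entry per row */
>>     vals[i] = (PetscScalar)(Istart + i) + 1.0; /* stand-in for expensive work */
>>   }
>> 
>>   /* a single thread does all insertion: MatSetValues stays serial */
>>   for (i = 0; i < nloc; i++) {
>>     PetscInt row = Istart + i;
>>     MatSetValues(A, 1, &row, 1, &cols[i], &vals[i], INSERT_VALUES);
>>   }
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>   PetscFree2(cols, vals);
>>   return 0;
>> }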
>> 
>>  Barry
>> 
>>> 
>>> Best regards,
>>> Wadud.
>> 
> 
> 
> 
> -- 
> email:   wadud.miah at gmail.com
> mobile: 07905 755604
> gnupg: 2E29 B22F


