[petsc-users] Hypre + OpenMP

Michele Rosso mrosso at uci.edu
Wed May 13 00:01:49 CDT 2015


Thanks, Matt.
Could you give me an example of a non-naive partitioning? I have a cubic
domain and a 3D domain decomposition. I use DMDACreate3d to create the
partitioning.
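
To be concrete, the partitioning comes from something along these lines
(the grid size, boundary types, and stencil below are only placeholders,
and the exact calls may differ a bit between PETSc versions):

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* 128^3 global grid, 1 dof, stencil width 1, star stencil;
     PETSC_DECIDE lets PETSc pick the process grid in each direction */
  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR,
                      128, 128, 128,                            /* global grid */
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* process grid */
                      1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);   /* needed in newer PETSc versions */
  /* ... the DM is then handed to the solver, e.g. with KSPSetDM() ... */
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}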

Thanks,
Michele
On May 12, 2015 9:51 PM, "Matthew Knepley" <knepley at gmail.com> wrote:

> On Tue, May 12, 2015 at 11:41 PM, Michele Rosso <mrosso at uci.edu> wrote:
>
>> Barry,
>>
>> thanks for your answer. The reason I'm asking is that multigrid limits
>> the number of MPI tasks I can use for a given grid, since k multigrid
>> levels require at least 2^(k-1) grid nodes per direction on each process.
>> I was wondering if using OpenMP together with MPI could help circumvent
>> the problem. If you have any other suggestions, they would be greatly
>> appreciated.
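>>
>> For example, with 7 multigrid levels each process needs at least 2^6 = 64
>> grid nodes per direction, so a 512^3 global grid can use at most
>> (512/64)^3 = 512 MPI tasks (numbers chosen only for illustration).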
>>
> This is only an issue for a naive partitioning. Good AMG necks down the
> processor set correctly on the coarser levels, for example GAMG in PETSc.
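> (In PETSc that is selected with, for example, -pc_type gamg.)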
>
>   Thanks,
>
>      Matt
>
>
>> Best,
>> Michele
>>
>>   You could compile hypre yourself with the OpenMP feature turned on and
>> then configure PETSc to use that version of hypre; of course the rest of
>> the PETSc code would not utilize the OpenMP threads.
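>>
>> Roughly (the exact option names depend on the hypre and PETSc versions, so
>> check the respective configure help):
>>
>>    ./configure --with-openmp ...                         (when building hypre)
>>    ./configure --with-hypre-dir=/path/to/that/hypre ...  (when configuring PETSc)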
>>
>>    Barry
>>
>> BTW: I don't know of any evidence that using hypre with OpenMP is
>> superior to using hypre without it, so aside from academic curiosity I
>> don't think there would be a reason to do this.
>>
>> > On May 12, 2015, at 7:55 PM, Michele Rosso <mrosso at uci.edu> wrote:
>> >
>> > Hi,
>> >
>> > Is it possible to use the OpenMP capabilities of hypre via PETSc?
>> > Thanks,
>> >
>> > Michele
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>