[petsc-users] Hypre + openMP

Matthew Knepley knepley at gmail.com
Tue May 12 23:50:19 CDT 2015


On Tue, May 12, 2015 at 11:41 PM, Michele Rosso <mrosso at uci.edu> wrote:

> Barry,
>
> thanks for your answer. The reason I'm asking is that multigrid limits the
> number of MPI tasks I can use for a given grid, since k multigrid levels
> require at least 2^(k-1) grid nodes per direction. I was wondering if using
> OpenMP together with MPI could help circumvent this limit. If you have
> any other suggestions, they would be greatly appreciated.
>
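Spelling out that bound: with a coarsening factor of 2, each rank's subdomain
needs at least 2^(k-1) grid points per direction for the coarsest of k levels
to still hold a point. For example, with 512 points per direction and k = 6
levels, each rank needs 2^5 = 32 points per direction, allowing at most
512/32 = 16 ranks per direction, i.e. 16^3 = 4096 ranks in 3D.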
That limit only holds for naive partitioning, where every rank must own part
of every level. A good AMG implementation necks down the processor set as the
grids coarsen, running the coarse levels on progressively fewer ranks; GAMG
in PETSc does this, for example (see the options sketch below).
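A minimal way to try it, assuming an application that calls
KSPSetFromOptions(); the executable name, rank count, and tolerance here are
placeholders, while the option names are PETSc's:

    mpiexec -n 1024 ./my_app -ksp_type cg -pc_type gamg -ksp_rtol 1e-8 -ksp_monitor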

  Thanks,

     Matt


> Best,
> Michele
>
>   You could compile hypre yourself with the OpenMP feature turned on and
> then configure PETSc to use that version of hypre (a rough sketch of the
> steps follows at the end of this message); of course the rest of the
> PETSc code would not use the OpenMP threads.
>
>    Barry
>
> BTW: I don't know of any evidence that using hypre with OpenMP is superior
> to using hypre without it, so aside from academic curiosity I don't think
> there would be a reason to do this.
>
> > On May 12, 2015, at 7:55 PM, Michele Rosso <mrosso at uci.edu> wrote:
> >
> > Hi,
> >
> > Is it possible to use the OpenMP capabilities of hypre via PETSc?
> > Thanks,
> >
> > Michele
>
>
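Barry's recipe might look roughly like the following sketch; the hypre
--with-openmp switch and the PETSc --with-hypre-dir option should be checked
against the respective configure helps, and the paths, thread count, and
rank count are placeholders:

    # Build hypre with OpenMP enabled
    cd hypre/src
    ./configure --prefix=$HOME/hypre-omp --with-openmp
    make install

    # Configure PETSc against that hypre installation
    cd $PETSC_DIR
    ./configure --with-hypre-dir=$HOME/hypre-omp
    make

    # Run with OpenMP threads active inside hypre only
    OMP_NUM_THREADS=4 mpiexec -n 16 ./my_app -pc_type hypre -pc_hypre_type boomeramg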


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener