[petsc-users] with-openmp error with hypre

Mark Adams mfadams at lbl.gov
Tue Feb 13 10:12:07 CST 2018


FYI, we were able to get hypre with threads working on KNL on Cori by
dropping the optimization level to -O1. We are getting about a 2x speedup
with 4 OpenMP threads and 16 MPI processes per socket. Not bad.
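
For reference, the setup was roughly along these lines (a sketch, not the
exact configure line we used; --with-openmp, --download-hypre and the
*OPTFLAGS variables are standard PETSc configure options, while the
application name and launch parameters below are placeholders):

  ./configure --with-openmp=1 --download-hypre \
      COPTFLAGS=-O1 CXXOPTFLAGS=-O1 FOPTFLAGS=-O1
  export OMP_NUM_THREADS=4
  srun -n 16 ./app -pc_type hypre -pc_hypre_type boomeramg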

The error, flatlined or slightly diverging hypre solves, occurred even in
flat MPI runs built with -with-openmp=1.

We are going to test the Haswell nodes next.
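
Regarding the question in the quoted message below about a hypre test to
run: a minimal standalone check is sketched here (not from our application;
the matrix, sizes, and solver options are illustrative). Run it with
-ksp_monitor_true_residual to watch for the kind of residual creep we see.

/* Minimal sketch: solve a 1-D Laplacian with hypre BoomerAMG and report
   the final residual.  Sizes and options are illustrative, not taken from
   the application that shows the problem. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat                A;
  Vec                x, b;
  KSP                ksp;
  PC                 pc;
  PetscInt           i, n = 1000, Istart, Iend;
  PetscReal          rnorm;
  KSPConvergedReason reason;
  PetscErrorCode     ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Assemble the standard tridiagonal (-1, 2, -1) Laplacian */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    if (i < n-1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* CG preconditioned with hypre BoomerAMG */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCHYPRE);CHKERRQ(ierr);
  ierr = PCHYPRESetType(pc, "boomeramg");CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* allow -ksp_* / -pc_* overrides */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
  ierr = KSPGetResidualNorm(ksp, &rnorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "converged reason %d, residual norm %g\n",
                     (int)reason, (double)rnorm);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}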

On Thu, Jan 25, 2018 at 4:16 PM, Mark Adams <mfadams at lbl.gov> wrote:

> Baky (cc'ed) is getting a strange error on Cori/KNL at NERSC. Using maint,
> it runs fine with -with-openmp=0 and it runs fine with -with-openmp=1 and
> gamg, but with hypre and -with-openmp=1, even running with flat MPI, the
> solver seems to flatline (see attached, and notice that the residual
> starts to creep after a few time steps).
>
> Maybe you can suggest a hypre test that I can run?
>