[petsc-dev] 3rd party GPU AMG solvers

Mark Adams mfadams at lbl.gov
Fri Jul 14 10:21:35 CDT 2017


On Fri, Jul 14, 2017 at 10:22 AM, Karl Rupp <rupp at iue.tuwien.ac.at> wrote:

>
>>     it will nonetheless require a lot of convincing that at best they
>>     get moderate speed-ups, not the 580+x claimed in some of those early
>>     GPU papers...
>>
>>
>> Karli, we are talking about two different things. You are talking about
>> performance, and I applaud you for that, but I am talking about giving
>> customers what they want. They want to investigate GPUs. I will say that I
>> do not anticipate seeing any performance improvement.
>>
>
> it is fine to say that "here is what you can try, but don't expect
> performance gains". My experience is, however, that people then go ahead,
> try it, *and* expect performance gains (of course for inappropriate system
> sizes, etc.). Since they invested time in exploring GPUs, there is an
> implicit expectation that there must be a ROI...
>

Yes. In the particular case that I am working with, I have made it clear
that they should not expect performance gains. And the postdoc who is doing
this is capable and reasonable and understands that this is not likely to
be faster. If hypre works out of the box, with maybe some manual movement
of data to the GPU, say, then he might do it just to have a bullet.
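For reference, trying hypre's BoomerAMG through PETSc is done with run-time options. A minimal sketch (the executable name `./app` is a placeholder; these are standard PETSc options, but whether any GPU path is exercised depends entirely on how hypre itself was configured and built):

```shell
# Select hypre's BoomerAMG as the preconditioner from the PETSc
# command line; -ksp_monitor prints the residual at each iteration.
./app -ksp_type cg \
      -pc_type hypre \
      -pc_hypre_type boomeramg \
      -ksp_monitor
```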


>
> Best regards,
> Karli
>
>
>
>>
>>         The fusion folks that I work with, and I assume other DOE
>>         offices, are just looking at their codes, subroutine by
>>         subroutine, and having postdocs look at GPU-izing them. We just
>>         need intelligent answers to their questions. Even if we, as
>>         sentient and passionate human beings, have opinions on the
>>         approach that is implied by their questions, it is part of my
>>         job to just give them a professional answer.
>>
>>
>>     In the past ~18 months I've worked with applications that wanted to
>>     use GPUs in just that manner. Needless to say, you end up
>>     touching almost everything to actually beat an existing (efficient)
>>     CPU-based application by less than a factor of 2. This involves
>>     MPI-parallel applications; it's much easier to get higher speedups
>>     if you don't need to communicate across ranks.
>>
>>
>>         I have enough now (thanks Jed and Lorena, et al!) to answer the
>>         AMGx question sufficiently, and if you could give me a quick
>>         assessment of where we are with hypre's GPU solver that would be
>>         great.
>>
>>
>>     does "work in progress" suffice? ;-)
>>
>>
>> I will advertise it "as is" (this is a term of art in US law).
>>
>> Thanks again,
>>
>>
>>     Best regards,
>>     Karli
>>
>>
>>
>>
>>
>>
>>         On Thu, Jul 13, 2017 at 11:16 PM, Karl Rupp
>>         <rupp at iue.tuwien.ac.at <mailto:rupp at iue.tuwien.ac.at>>
>>         wrote:
>>
>>              Hi Mark,
>>
>>                  I hear Hypre has support for GPUs in a May release. Any
>>         word on
>>                  the status of using it in PETSc?
>>
>>
>>              as far as I know, it is currently not supported in PETSc.
>>         I'll have
>>              a look at it and see what needs to be done to enable it.
>>
>>
>>                  And we discussed interfacing to AMGx, which is
>> complicated
>>                  (precluded?) by not releasing source. Anything on the
>>         potential
>>                  of interfacing to AMGx?  I think it would be great to
>>         make this
>>                  available. It is on a lot of checkboxes. I would love
>>         to be able
>>                  to say, yea you can use it.
>>
>>
>>              Lorena Barba's group actually interfaced PETSc to AMGx at
>>         some point
>>              (presented at GTC 2016 if I'm not mistaken). I'll reach out
>>         to them,
>>              maybe they have something to contribute.
>>
>>              Best regards,
>>              Karli
>>
>>
>>
>>