[petsc-dev] Preconditioners for GPU
Nystrom, William D
wdn at lanl.gov
Mon May 6 12:43:40 CDT 2013
Please share how that goes. I'd be very interested to hear your results.
Dave
________________________________
From: Andrea Lani [andrea.lani at gmail.com]
Sent: Monday, May 06, 2013 11:40 AM
To: Nystrom, William D
Cc: Paul Mullowney; petsc-dev at mcs.anl.gov; Nystrom, William D
Subject: Re: [petsc-dev] Preconditioners for GPU
Ok, I'll try Jacobi. I am also testing MHD cases, by the way (interaction of the solar wind with Earth's magnetosphere, both steady and unsteady) :-)
Thx
Andrea
On May 6, 2013, at 7:17 PM, "Nystrom, William D" <wdn at lanl.gov> wrote:
Andrea,
Have you tried Jacobi preconditioning with GPUs? I've found it to work pretty
well, surprisingly, and it does work in parallel with MPI. It is also worth seeing
whether you can use the ELL or HYB matrix formats. I have seen a reduction in compute
time with the ELL/HYB formats of typically 2x, and up to as much as 5x, when the
iteration counts get large. You should also try both the cusp and cusparse
mat_type. I've tried the above on extended MHD problems. YMMV.
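For what it's worth, a run along those lines might use options roughly like the
following (a sketch only; "your_app" is a placeholder, and the storage-format
option name may differ across petsc-dev snapshots):

  ./your_app -ksp_type gmres -pc_type jacobi \
             -mat_type aijcusparse -vec_type cusp \
             -mat_cusparse_storage_format hyb

Swapping -mat_type aijcusparse for aijcusp lets you compare the two backends,
and csr vs. ell vs. hyb can be compared via the storage-format option.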
Having said that, I am looking forward to trying better preconditioners for GPUs
as they become available. But it would be interesting to see whether Jacobi preconditioning
on the GPU solves your problem faster than your best preconditioning option on
the CPU. Also, note that all my problems have been SPD.
Dave
--
Dave Nystrom
LANL HPC-5
Phone: 505-667-7913
Email: wdn at lanl.gov
Smail: Mail Stop B272
Group HPC-5
Los Alamos National Laboratory
Los Alamos, NM 87545
________________________________
From: petsc-dev-bounces at mcs.anl.gov [petsc-dev-bounces at mcs.anl.gov] on behalf of Andrea Lani [andrea.lani at gmail.com]
Sent: Monday, May 06, 2013 10:52 AM
To: Paul Mullowney
Cc: petsc-dev at mcs.anl.gov
Subject: Re: [petsc-dev] Preconditioners for GPU
Thanks, Paul! This only works for sequential cases, right? In that case, what performance benefit should I expect compared to a fully CPU-based version, if any? And is anything available for parallel runs as well?
Andrea
On Mon, May 6, 2013 at 5:48 PM, Paul Mullowney <paulm at txcorp.com> wrote:
Hi Andrea,
The matrix type aijcusparse has an ILU(n) preconditioner (and ICC(n) for symmetric problems). The factorization is done on the CPU; the solves are done on the GPU via the cusparse library.
When configuring PETSc, add --download-txpetscgpu=yes
I would also look at:
http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Mat/MATSEQAIJCUSPARSE.html
for information on the aijcusparse class.
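As a rough sketch of the whole flow (exact option names may vary with your
petsc-dev snapshot; "your_app" is a placeholder for your executable):

  ./configure --download-txpetscgpu=yes [your other configure options]
  ./your_app -mat_type seqaijcusparse -vec_type cusp \
             -pc_type ilu -pc_factor_levels 1

That should compute the ILU(1) factors on the CPU and then run the triangular
solves on the GPU through cusparse.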
-Paul
On Sun, May 5, 2013 at 4:26 PM, Andrea Lani <andrea.lani at gmail.com> wrote:
Thanks, Matt! Let me reiterate the question: are there other available preconditioners already ported to the GPU that are not based on AMG (which is typically not suitable for my convection-dominated CFD problems), apart from BICGSTABCUSP?
I believe that the txpetscgpu package has triangular solves for the GPU.
BiCGStab is a Krylov method, not a preconditioner.
Matt
Andrea
On May 5, 2013, at 9:53 PM, Matthew Knepley <knepley at gmail.com> wrote:
On Sun, May 5, 2013 at 2:48 PM, Andrea Lani <andrea.lani at gmail.com> wrote:
Dear Developers,
Could you please tell me the list of preconditioners for non-symmetric systems fully ported to multi-GPU and available in the current development version?
In my opinion, there are no truly multi-GPU preconditioners anywhere. We can imagine them, like AMG with Chebyshev smoothers, but
I have not really seen any of them work reliably. The CUSP SA-AMG is the closest one in this category, and Steve Dalton is working on
it this summer at ANL.
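For concreteness, the kind of setup I have in mind would look roughly like this
(an untested sketch; option names as in the petsc-dev manual pages):

  -pc_type gamg -pc_gamg_type agg \
  -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi

so that every smoother application reduces to matvecs and pointwise scaling,
both of which map well onto a GPU.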
Matt
Thanks in advance
Andrea
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
--
Dr. Andrea Lani
Senior Research Engineer, PhD
Aeronautics & Aerospace dept., CFD group
Von Karman Institute for Fluid Dynamics
Chaussée de Waterloo 72,
B-1640, Rhode-Saint-Genèse, Belgium
fax : +32-2-3599600
work : +32-2-3599769
lani at vki.ac.be