[petsc-users] Cuda: Vec and Mat types
Junchao Zhang
junchao.zhang at gmail.com
Wed Oct 27 17:16:37 CDT 2021
On Wed, Oct 27, 2021 at 3:37 PM Karthikeyan Chockalingam - STFC UKRI <
karthikeyan.chockalingam at stfc.ac.uk> wrote:
> Thank you for your response.
>
>
>
> I tried running ksp/ex2.c using
>
>
>
> ./ex2 -m 9 -n 9 –vec_type cuda -mat_type aijcusparse -ksp_type cg -pc_type
> jacobi -log_view
>
Under src/ksp/ksp/tutorials, run this command (your old command line has a
weird character: an en dash "–" where the hyphen "-" of -vec_type should be):
./ex2 -m 9 -n 9 -vec_type cuda -mat_type aijcusparse -ksp_type cg -pc_type
jacobi -log_view
Event                Count      Time (sec)     Flop                             --- Global ---  --- Stage ----  Total   GPU    - CpuToGpu -   - GpuToCpu - GPU
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   AvgLen  Reduct %T %F %M %L %R  %T %F %M %L %R Mflop/s Mflop/s Count   Size   Count   Size  %F
---------------------------------------------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

MatMult               14 1.0 5.5454e-04 1.0 9.20e+03 1.0 0.0e+00 0.0e+00 0.0e+00  0 40  0  0  0   0 40  0  0  0    17      40      1 4.78e-03    0 0.00e+00 100
MatAssemblyBegin       1 1.0 1.9960e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0       0      0 0.00e+00    0 0.00e+00   0
MatAssemblyEnd         1 1.0 2.2575e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0       0      0 0.00e+00    0 0.00e+00   0
MatCUSPARSCopyTo       1 1.0 1.9215e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0       0      1 4.78e-03    0 0.00e+00   0
VecTDot               26 1.0 7.3121e-04 1.0 4.19e+03 1.0 0.0e+00 0.0e+00 0.0e+00  0 18  0  0  0   0 18  0  0  0     6       9      0 0.00e+00    0 0.00e+00 100
VecNorm               15 1.0 8.1064e-04 1.0 2.42e+03 1.0 0.0e+00 0.0e+00 0.0e+00  0 10  0  0  0   0 10  0  0  0     3       4      0 0.00e+00    0 0.00e+00 100
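Note the GPU Mflop/s column and the trailing GPU %F column: MatMult, VecTDot
and VecNorm report 100% of their flops on the GPU, so the solve did run there.
Also note that options like -vec_type cuda and -mat_type aijcusparse only take
effect because the example calls MatSetFromOptions()/VecSetFromOptions() before
assembly. A minimal self-contained sketch of that pattern (not ex2's exact
code; the diagonal SPD test matrix and the size n=81 are just for
illustration):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       i, rstart, rend, n = 81;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);   /* honors -mat_type aijcusparse */
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {            /* simple SPD system: 2 on the diagonal */
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecSetSizes(b, PETSC_DECIDE, n);CHKERRQ(ierr);
  ierr = VecSetFromOptions(b);CHKERRQ(ierr);   /* honors -vec_type cuda */
  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* honors -ksp_type cg -pc_type jacobi */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run it the same way as ex2, e.g.
./sketch -vec_type cuda -mat_type aijcusparse -ksp_type cg -pc_type jacobi -log_view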
>
> but the log file didn't record any GPU flops.
>
>
>
> Sorry, my next question doesn’t belong to this thread.
>
> Does DMDA only work on structured grid/mesh and not on unstructured
> grid/mesh?
>
>
>
> Best,
>
> Karthik.
>
>
>
> *From: *Junchao Zhang <junchao.zhang at gmail.com>
> *Date: *Wednesday, 27 October 2021 at 21:13
> *To: *"Chockalingam, Karthikeyan (STFC,DL,HC)" <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *"petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Cuda: Vec and Mat types
>
>
>
>
>
>
>
> On Wed, Oct 27, 2021 at 2:24 PM Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
> Hello,
>
>
>
> I hope I am framing the question correctly.
>
> Are distributed arrays (DMDA) the only -vec_type and -mat_type supported by
> CUDA?
>
> I don't understand this question. Currently, the CUDA-capable types include
> VECCUDA, MATAIJCUSPARSE and MATDENSECUDA, in both sequential and MPI variants.
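> To fix one of these GPU types in code rather than on the command line, the
> calls look something like this (a sketch; the size 100 is arbitrary, and the
> surrounding PetscInitialize()/PetscFinalize() calls are omitted for brevity):
>
>   Vec            x;
>   Mat            A;
>   PetscErrorCode ierr;
>
>   ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
>   ierr = VecSetSizes(x, PETSC_DECIDE, 100);CHKERRQ(ierr);
>   ierr = VecSetType(x, VECCUDA);CHKERRQ(ierr);        /* seq or MPI variant chosen from the communicator */
>
>   ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>   ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100);CHKERRQ(ierr);
>   ierr = MatSetType(A, MATAIJCUSPARSE);CHKERRQ(ierr); /* or MATDENSECUDA for a dense GPU matrix */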
>
>
>
> I am reading the PETSc user manual. Section 2.4 introduces distributed
> arrays, but at the start of Chapter 2 there are other vector and matrix
> types as well. I wonder whether those types (I don't know what they are
> called) are also supported by CUDA?
>
>
>
> Can you please point me to some tutorial examples in KSP and SNES that can
> run on gpus?
>
> Search for "-mat_type aijcusparse" or "-dm_mat_type aijcusparse" in the
> PETSc tests/tutorials; you will find many.
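> For instance, the DMDA-based SNES tutorial ex5 can be run on the GPU with
> something like the following (the grid sizes here are arbitrary):
>
>   cd src/snes/tutorials
>   ./ex5 -da_grid_x 64 -da_grid_y 64 -dm_vec_type cuda -dm_mat_type aijcusparse -log_view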
>
>
>
>
>
> At the moment I am testing KSP/ex45.c with different preconditioners on
> CPUs and GPUs.
>
>
>
> I tried to run KSP/ex2.c with -vec_type cuda and -mat_type aijcuda and
> noticed there were no GPU flops recorded in my log file.
>
> It is -mat_type aijcusparse
>
>
>
>
>
> Many thanks,
>
> Karthik.
>