[petsc-users] Eisenstat-Walker method with GPU assembled matrices
Héctor Barreiro Cabrera
hecbarcab at gmail.com
Tue Oct 20 03:40:09 CDT 2020
On Thu, Oct 15, 2020 at 23:32, Barry Smith (<bsmith at petsc.dev>)
wrote:
>
> We still have the assumption that the AIJ matrix always has a copy on the
> CPU. How did you fill up the matrix on the GPU without having its copy
> on the CPU?
>
> My strategy here was to initialize the structure on the CPU with dummy
values so that the corresponding device arrays get allocated. Ideally I
would have initialized the structure in a kernel as well, since my intention
is to keep all data on the GPU (and not touch host memory other than for
debugging). But since the topology of my problem remains constant over
time, this approach proved sufficient, and I have not found any problems
with my use case so far.
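
For concreteness, here is a minimal sketch of that preallocation step (the
tridiagonal pattern, the helper name PreallocateOnHost and the dummy value
are placeholders, not my actual problem; the device-side fill itself is
only hinted at in a comment):

#include <petscmat.h>

/* Sketch: fix the sparsity pattern on the CPU with placeholder values so
 * that the corresponding device arrays get allocated. The real values are
 * written later from a CUDA kernel (via MatCUSPARSEGetDeviceMatWrite, as
 * mentioned in the quoted message below). Tridiagonal is just an example. */
static PetscErrorCode PreallocateOnHost(PetscInt n, Mat *A)
{
  PetscErrorCode ierr;
  PetscInt       i;
  PetscScalar    dummy = 1.0; /* nonzero, so the host copy's diagonal stays invertible */

  PetscFunctionBeginUser;
  ierr = MatCreateSeqAIJCUSPARSE(PETSC_COMM_SELF, n, n, 3, NULL, A);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    if (i > 0)   { ierr = MatSetValue(*A, i, i-1, dummy, INSERT_VALUES);CHKERRQ(ierr); }
    ierr = MatSetValue(*A, i, i, dummy, INSERT_VALUES);CHKERRQ(ierr);
    if (i < n-1) { ierr = MatSetValue(*A, i, i+1, dummy, INSERT_VALUES);CHKERRQ(ierr); }
  }
  ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  /* ...afterwards the value array is overwritten on the device from a CUDA
   * kernel; the structure is never touched again. */
  PetscFunctionReturn(0);
}
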
One thing I couldn't figure out, though, is how to force PETSc to transfer
the data back to the host: MatView always displays the dummy values I used
for initialization. Is there a function to do this?
Thanks for the replies, by the way! I'm quite surprised how responsive the
PETSc community is! :)
Cheers,
Héctor
> Barry
>
> When we remove this assumption, we have to add a bunch more code for
> CPU-only things to make sure they properly get the data from the GPU.
>
>
> On Oct 15, 2020, at 4:16 AM, Héctor Barreiro Cabrera <hecbarcab at gmail.com>
> wrote:
>
> Hello fellow PETSc users,
>
> Following up my previous email
> <https://lists.mcs.anl.gov/pipermail/petsc-users/2020-September/042511.html>,
> I managed to feed the entry data to a SeqAIJCUSPARSE matrix through a CUDA
> kernel using the new MatCUSPARSEGetDeviceMatWrite function (thanks Barry
> Smith and Mark Adams!). However, I am now facing problems when trying to
> use this matrix within a SNES solver with the Eisenstat-Walker method
> enabled.
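>
> For context, the solver setup looks roughly like this (a simplified
> sketch, not my exact code; the equivalent command-line options are
> noted in comments):
>
> #include <petscsnes.h>
>
> /* Sketch of the solver configuration (simplified): NEWTONLS with
>  * Eisenstat-Walker forcing terms and the Eisenstat preconditioner. */
> static PetscErrorCode ConfigureSolver(SNES snes)
> {
>   PetscErrorCode ierr;
>   KSP            ksp;
>   PC             pc;
>
>   PetscFunctionBeginUser;
>   ierr = SNESSetType(snes, SNESNEWTONLS);CHKERRQ(ierr);
>   ierr = SNESKSPSetUseEW(snes, PETSC_TRUE);CHKERRQ(ierr); /* -snes_ksp_ew */
>   ierr = SNESGetKSP(snes, &ksp);CHKERRQ(ierr);
>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>   ierr = PCSetType(pc, PCEISENSTAT);CHKERRQ(ierr);        /* -pc_type eisenstat */
>   ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
>   PetscFunctionReturn(0);
> }
>
> (SNESKSPSetUseEW enables the Eisenstat-Walker forcing terms, while
> PCEISENSTAT is the separate Eisenstat SSOR preconditioner that shows up
> as PCPreSolve_Eisenstat in the trace below.)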
>
> According to PETSc's error log, the preconditioner is failing to invert
> the matrix diagonal. Specifically, it reports:
> [0]PETSC ERROR: Arguments are incompatible
> [0]PETSC ERROR: Zero diagonal on row 0
> [0]PETSC ERROR: Configure options PETSC_ARCH=win64_vs2019_release
> --with-cc="win32fe cl" --with-cxx="win32fe cl" --with-clanguage=C++
> --with-fc=0 --with-mpi=0 --with-cuda=1 --with-cudac="win32fe nvcc"
> --with-cuda-dir=~/cuda --download-f2cblaslapack=1 --with-precision=single
> --with-64-bit-indices=0 --with-single-library=1 --with-endian=little
> --with-debugging=0 --with-x=0 --with-windows-graphics=0
> --with-shared-libraries=1 --CUDAOPTFLAGS=-O2
>
> The stack trace leads to the diagonal inversion routine:
> [0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1913 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\impls\aij\seq\aij.c
> [0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1944 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\impls\aij\seq\aij.c
> [0]PETSC ERROR: #3 MatSOR() line 4005 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\INTERF~1\matrix.c
> [0]PETSC ERROR: #4 PCPreSolve_Eisenstat() line 79 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\pc\impls\eisens\eisen.c
> [0]PETSC ERROR: #5 PCPreSolve() line 1549 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\pc\INTERF~1\precon.c
> [0]PETSC ERROR: #6 KSPSolve_Private() line 686 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\ksp\INTERF~1\itfunc.c
> [0]PETSC ERROR: #7 KSPSolve() line 889 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\ksp\INTERF~1\itfunc.c
> [0]PETSC ERROR: #8 SNESSolve_NEWTONLS() line 225 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\snes\impls\ls\ls.c
> [0]PETSC ERROR: #9 SNESSolve() line 4567 in
> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\snes\INTERF~1\snes.c
>
> I am 100% positive that the diagonal does not contain a zero entry, so my
> suspicion is that either this operation is not supported on the GPU at all
> (MatInvertDiagonal_SeqAIJ seems to access host-side memory) or I am
> missing some setting to make this work on the GPU. Is this correct?
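>
> To illustrate what I suspect is happening (a toy reproduction only, not
> the actual PETSc source; invert_diagonal_host is a made-up helper): the
> inversion appears to work on the host copy of the values, which never
> sees what the kernel wrote on the device.
>
> #include <stdio.h>
>
> /* Toy illustration of the suspected failure mode: a host-side diagonal
>  * inversion in the spirit of MatInvertDiagonal_SeqAIJ. If the host copy
>  * of the values is stale (never synced from the GPU), the zero-diagonal
>  * check fires even though the device copy holds a valid diagonal. */
> static int invert_diagonal_host(int m, const double *diag_host, double *idiag)
> {
>   int i;
>   for (i = 0; i < m; i++) {
>     if (diag_host[i] == 0.0) {
>       fprintf(stderr, "Zero diagonal on row %d\n", i);
>       return 1;
>     }
>     idiag[i] = 1.0 / diag_host[i];
>   }
>   return 0;
> }
>
> int main(void)
> {
>   double stale_host_copy[3] = {0.0, 0.0, 0.0}; /* host copy never updated from the GPU */
>   double idiag[3];
>   return invert_diagonal_host(3, stale_host_copy, idiag);
> }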
>
> Thanks!
>
> Cheers,
> Héctor
>
>
>