[petsc-users] Eisenstat-Walker method with GPU assembled matrices

Stefano Zampini stefano.zampini at gmail.com
Tue Oct 20 06:52:10 CDT 2020


We currently do not have the transfer back to the host set up for cusparse.
I have a preliminary version here
https://gitlab.com/petsc/petsc/-/tree/stefanozampini/feature-mataij-create-fromcoo

It should be ready for review in a couple of days.
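
For reference, a minimal sketch of what COO-style assembly could look like.
The names MatSetPreallocationCOO/MatSetValuesCOO are an assumption based on
what the interface eventually became; the API in the branch above may still
differ. Error checking is omitted.

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    /* nonzeros given as (row, col, value) triplets; the value array can
       later be updated repeatedly, e.g. from device-resident data */
    PetscInt    coo_i[] = {0, 1, 2};
    PetscInt    coo_j[] = {0, 1, 2};
    PetscScalar coo_v[] = {1.0, 2.0, 3.0};

    PetscInitialize(&argc, &argv, NULL, NULL);
    MatCreate(PETSC_COMM_SELF, &A);
    MatSetSizes(A, 3, 3, 3, 3);
    MatSetType(A, MATSEQAIJCUSPARSE);
    MatSetPreallocationCOO(A, 3, coo_i, coo_j); /* fix the nonzero pattern once */
    MatSetValuesCOO(A, coo_v, INSERT_VALUES);   /* repeat whenever the values change */
    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }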

On Tue, Oct 20, 2020 at 14:37 Matthew Knepley <knepley at gmail.com>
wrote:

> On Tue, Oct 20, 2020 at 4:40 AM Héctor Barreiro Cabrera <
> hecbarcab at gmail.com> wrote:
>
>> On Thu, Oct 15, 2020 at 23:32 Barry Smith (<bsmith at petsc.dev>)
>> wrote:
>>
>>>
>>>   We still have the assumption that the AIJ matrix always has a copy on
>>> the CPU.  How did you fill up the matrix on the GPU without having its
>>> copy on the CPU?
>>>
>> My strategy here was to initialize the structure on the CPU with dummy
>> values so that the corresponding device arrays get allocated. Ideally I
>> would have initialized the structure in a kernel as well, since my
>> intention is to keep all data on the GPU (and not touch host memory other
>> than for debugging). But since the topology of my problem remains constant
>> over time, this approach proved sufficient. I have not found any problems
>> with my use case so far.
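>>
>> In case it is useful, here is a rough sketch of this approach (assuming
>> petscmat.h is included and PetscInitialize has been called; the size and
>> the tridiagonal pattern are just placeholders, and error checking is
>> omitted):
>>
>>   Mat         A;
>>   PetscInt    n     = 10;   /* placeholder problem size */
>>   PetscScalar dummy = 1.0;  /* dummy value, overwritten from the GPU later */
>>
>>   MatCreate(PETSC_COMM_SELF, &A);
>>   MatSetSizes(A, n, n, n, n);
>>   MatSetType(A, MATSEQAIJCUSPARSE);
>>   MatSeqAIJSetPreallocation(A, 3, NULL); /* e.g. a tridiagonal pattern */
>>   for (PetscInt i = 0; i < n; i++) {
>>     for (PetscInt j = PetscMax(i - 1, 0); j <= PetscMin(i + 1, n - 1); j++) {
>>       MatSetValue(A, i, j, dummy, INSERT_VALUES); /* dummy entries define the structure */
>>     }
>>   }
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>   /* from here on, the values are (re)written from a CUDA kernel via
>>      MatCUSPARSEGetDeviceMatWrite(), without going through the host */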
>>
>> One thing I couldn't figure out, though, is how to force PETSc to
>> transfer the data back to the host. MatView always displays the dummy
>> values I used for initialization. Is there a function to do this?
>>
>
> Hmm, this should happen automatically, so we have missed something. How do
> you change the values on the device?
>
>   Thanks,
>
>     Matt
>
>
>> Thanks for the replies, by the way! I'm quite surprised how responsive
>> the PETSc community is! :)
>>
>> Cheers,
>> Héctor
>>
>>
>>>   Barry
>>>
>>>   When we remove this assumption, we will have to add a bunch more code
>>> so that CPU-only operations properly get the data from the GPU.
>>>
>>>
>>> On Oct 15, 2020, at 4:16 AM, Héctor Barreiro Cabrera <
>>> hecbarcab at gmail.com> wrote:
>>>
>>> Hello fellow PETSc users,
>>>
>>> Following up on my previous email
>>> <https://lists.mcs.anl.gov/pipermail/petsc-users/2020-September/042511.html>,
>>> I managed to feed the entry data to a SeqAIJCUSPARSE matrix through a CUDA
>>> kernel using the new MatCUSPARSEGetDeviceMatWrite function (thanks Barry
>>> Smith and Mark Adams!). However, I am now facing problems when trying to
>>> use this matrix within a SNES solver with the Eisenstat-Walker method
>>> enabled.
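>>>
>>> For reference, this is roughly how I enable it (a simplified sketch: the
>>> vector/matrix names, callbacks, and context below are placeholders, and
>>> error checking is omitted):
>>>
>>>   SNES snes;
>>>   SNESCreate(PETSC_COMM_SELF, &snes);
>>>   SNESSetFunction(snes, r, FormFunction, ctx);    /* residual callback */
>>>   SNESSetJacobian(snes, J, J, FormJacobian, ctx); /* J is the SeqAIJCUSPARSE matrix */
>>>   SNESKSPSetUseEW(snes, PETSC_TRUE); /* Eisenstat-Walker adaptive linear tolerances */
>>>   SNESSetFromOptions(snes);          /* equivalently, -snes_ksp_ew on the command line */
>>>   SNESSolve(snes, NULL, x);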
>>>
>>> According to PETSc's error log, the preconditioner is failing to invert
>>> the matrix diagonal. Specifically, it says:
>>> [0]PETSC ERROR: Arguments are incompatible
>>> [0]PETSC ERROR: Zero diagonal on row 0
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=win64_vs2019_release
>>> --with-cc="win32fe cl" --with-cxx="win32fe cl" --with-clanguage=C++
>>> --with-fc=0 --with-mpi=0 --with-cuda=1 --with-cudac="win32fe nvcc"
>>> --with-cuda-dir=~/cuda --download-f2cblaslapack=1 --with-precision=single
>>> --with-64-bit-indices=0 --with-single-library=1 --with-endian=little
>>> --with-debugging=0 --with-x=0 --with-windows-graphics=0
>>> --with-shared-libraries=1 --CUDAOPTFLAGS=-O2
>>>
>>> The stack trace leads to the diagonal inversion routine:
>>> [0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1913 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\impls\aij\seq\aij.c
>>> [0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1944 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\impls\aij\seq\aij.c
>>> [0]PETSC ERROR: #3 MatSOR() line 4005 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\mat\INTERF~1\matrix.c
>>> [0]PETSC ERROR: #4 PCPreSolve_Eisenstat() line 79 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\pc\impls\eisens\eisen.c
>>> [0]PETSC ERROR: #5 PCPreSolve() line 1549 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\pc\INTERF~1\precon.c
>>> [0]PETSC ERROR: #6 KSPSolve_Private() line 686 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\ksp\INTERF~1\itfunc.c
>>> [0]PETSC ERROR: #7 KSPSolve() line 889 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\ksp\ksp\INTERF~1\itfunc.c
>>> [0]PETSC ERROR: #8 SNESSolve_NEWTONLS() line 225 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\snes\impls\ls\ls.c
>>> [0]PETSC ERROR: #9 SNESSolve() line 4567 in
>>> C:\cygwin64\home\HBARRE~1\PETSC-~1\src\snes\INTERF~1\snes.c
>>>
>>> I am 100% positive that the diagonal does not contain a zero entry, so my
>>> suspicion is either that this operation is not supported on the GPU at all
>>> (MatInvertDiagonal_SeqAIJ seems to access host-side memory) or that I am
>>> missing some setting to make it work on the GPU. Is this correct?
>>>
>>> Thanks!
>>>
>>> Cheers,
>>> Héctor
>>>
>>>
>>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
Stefano

