[petsc-users] Possible bug PETSc+Complex+CUDA
Junchao Zhang
junchao.zhang at gmail.com
Fri May 22 13:55:02 CDT 2020
$ module list
Currently Loaded Modules:
1) cuda/10.2   2) gcc/8.3.0-fjpc5ys   3) cmake/3.17.0-n3kslpc   4) openmpi-4.0.2-gcc-8.3.0-e2zcbqz
$ nvcc -V
Cuda compilation tools, release 10.2, V10.2.89
--Junchao Zhang
On Fri, May 22, 2020 at 12:57 PM Mills, Richard Tran via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Yes, Junchao said he gets the segfault, but it works for Karl. Sounds like
> this may be a case of some compilers accepting the definitions for complex
> that Thrust uses and others not, as Stefano says. Karl and Junchao, can you
> please share the versions of the compilers (and maybe the associated
> settings) that you are using?
>
> --Richard
>
> On 5/21/20 9:15 AM, Junchao Zhang wrote:
>
> I tested this example with CUDA 10.2, and it did segfault. I'm looking into it.
> --Junchao Zhang
>
>
> On Thu, May 21, 2020 at 11:04 AM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Thu, May 21, 2020 at 11:31 AM Stefano Zampini <
>> stefano.zampini at gmail.com> wrote:
>>
>>> Oh, there is also an issue I noticed recently and have not yet had time
>>> to fix.
>>>
>>> With complex numbers, we use the complex definitions from Thrust, and
>>> these do not always seem to be compatible with whatever the C compiler
>>> uses.
>>> Matt, take a look at petscsystypes.h and you will see the issue:
>>> https://gitlab.com/petsc/petsc/-/blob/master/include/petscsystypes.h#L208
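>>>
>>> Roughly, the choice in that header boils down to something like the sketch
>>> below (simplified; the actual guards and precision handling in
>>> petscsystypes.h are more involved, so take this as illustrative only):
>>>
>>>   #if defined(__CUDACC__)                     /* nvcc compilation: Thrust's complex */
>>>     #include <thrust/complex.h>
>>>     typedef thrust::complex<PetscReal> PetscComplex;
>>>   #elif defined(__cplusplus)                  /* C++ host compiler: std::complex */
>>>     #include <complex>
>>>     typedef std::complex<PetscReal> PetscComplex;
>>>   #else                                       /* plain C: C99 _Complex */
>>>     #include <complex.h>
>>>     typedef double _Complex PetscComplex;     /* assuming a double-precision build */
>>>   #endif
>>>
>>> All three types are meant to have the same layout (two PetscReals), but the
>>> compilers give no guarantee that they interoperate at the language level,
>>> which is where the incompatibilities show up.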
>>>
>>> For sure, you need to configure PETSc with --with-clanguage=cxx, but even
>>> that does not seem to make it work on a CUDA box I recently tried (CUDA
>>> 10.1).
>>> I believe the issue arises even if you call VecSet(v,0) on a VECCUDA.
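>>>
>>> For reference, the kind of configure line I am talking about is roughly
>>> the following (other options elided):
>>>
>>>   ./configure --with-cuda=1 --with-scalar-type=complex --with-clanguage=cxx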
>>>
>>
>> So Karl and Junchao say that with 10.2 it is working. Do you have access
>> to 10.2?
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> On May 21, 2020, at 6:21 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>> On Thu, May 21, 2020 at 10:53 AM Rui Silva <rui.silva at uam.es> wrote:
>>>
>>>> Hello everyone,
>>>>
>>>> I am trying to run PETSc with complex numbers on the GPU. When I call the
>>>> VecWAXPY routine using the complex version of PETSc and mpicuda vectors,
>>>> the program fails with a segmentation fault. This problem does not appear
>>>> if I run the complex version with mpi vectors or the real version with
>>>> mpicuda vectors. Is there any problem using CUDA with the complex version
>>>> of PETSc?
>>>>
>>>> Furthermore, I have to use the -log_view option to run the complex+GPU
>>>> code; otherwise the program fails at the beginning.
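>>>>
>>>> A stripped-down version of what I am doing is roughly along these lines
>>>> (the real code is larger; the sizes and values here are just placeholders,
>>>> and I run it with something like mpirun -n 2 ./test -log_view):
>>>>
>>>>   #include <petscvec.h>
>>>>
>>>>   int main(int argc, char **argv)
>>>>   {
>>>>     Vec            x, y, w;
>>>>     PetscErrorCode ierr;
>>>>
>>>>     ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
>>>>     ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
>>>>     ierr = VecSetSizes(x, PETSC_DECIDE, 100);CHKERRQ(ierr);
>>>>     ierr = VecSetType(x, VECMPICUDA);CHKERRQ(ierr);  /* GPU vectors */
>>>>     ierr = VecDuplicate(x, &y);CHKERRQ(ierr);
>>>>     ierr = VecDuplicate(x, &w);CHKERRQ(ierr);
>>>>     ierr = VecSet(x, 1.0);CHKERRQ(ierr);
>>>>     ierr = VecSet(y, 2.0);CHKERRQ(ierr);
>>>>     ierr = VecWAXPY(w, 1.0, x, y);CHKERRQ(ierr);     /* w = 1.0*x + y; this is where it segfaults */
>>>>     ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>>     ierr = VecDestroy(&y);CHKERRQ(ierr);
>>>>     ierr = VecDestroy(&w);CHKERRQ(ierr);
>>>>     ierr = PetscFinalize();
>>>>     return ierr;
>>>>   }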
>>>>
>>>
>>> What version of CUDA do you have? There are bugs in the versions before
>>> 10.2.
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>> Best regards,
>>>>
>>>> Rui Silva
>>>>
>>>> --
>>>> Dr. Rui Emanuel Ferreira da Silva
>>>> Departamento de Física Teórica de la Materia Condensada
>>>> Universidad Autónoma de Madrid, Spain
>>>> https://ruiefdasilva.wixsite.com/ruiefdasilva
>>>> https://mmuscles.eu/
>>>>
>>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
>