[petsc-dev] -with-kokkos-cuda-arch=AMPERE80 nonsense

Satish Balay balay at mcs.anl.gov
Mon Apr 5 15:19:39 CDT 2021


This is an NVIDIA mess-up. Why isn't there a command that gives me these values [if they insist on this interface for nvcc]?
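
The compute capability itself is queryable programmatically - below is a minimal sketch using the CUDA runtime (the file name query_cc.c is just for illustration). Of course this only helps if the box you configure on actually has the GPU - see the caveat below.

  /* query_cc.c - print the compute capability of each visible GPU.
     Build with something like: nvcc query_cc.c -o query_cc */
  #include <stdio.h>
  #include <cuda_runtime.h>

  int main(void)
  {
    int n = 0, i;
    cudaError_t err = cudaGetDeviceCount(&n);
    if (err != cudaSuccess || n == 0) {
      fprintf(stderr, "no CUDA device visible: %s\n", cudaGetErrorString(err));
      return 1;
    }
    for (i = 0; i < n; i++) {
      struct cudaDeviceProp prop;
      cudaGetDeviceProperties(&prop, i);
      /* major.minor is the compute capability, e.g. 8.0 on an A100,
         i.e. sm_80 / kokkos "AMPERE80" */
      printf("device %d: %s, compute capability %d.%d\n",
             i, prop.name, prop.major, prop.minor);
    }
    return 0;
  }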

I see Barry wants configure to do something here - but whatever we do, we would just be shifting the problem around.
[Even if we detect the hardware, the build box might not have the GPU used for the runs.]

We have --with-cuda-arch - which I tried to remove from configure - but it has come back in a different form (--with-cuda-gencodearch).
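
Whatever form the option takes, what nvcc ultimately wants is something along the lines of

  nvcc -gencode arch=compute_80,code=sm_80 ...

so all of these configure options are really just different spellings of the same compute capability.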

And I see other packages:

  --with-kokkos-cuda-arch
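
[As far as I can tell the kokkos names just encode the compute capability digits - PASCAL60 is sm_60 (P100), PASCAL61 is sm_61 (the consumer Pascal cards), VOLTA70 is sm_70 (V100), AMPERE80 is sm_80 (A100) - and they end up as cmake options roughly like

  cmake -DKokkos_ENABLE_CUDA=ON -DKokkos_ARCH_AMPERE80=ON ...

- option names from memory, so check the kokkos docs.]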

Wrt spack - I'm having to do:

spack install xsdk+cuda ^magma cuda_arch=60

[magma uses the CudaPackage() infrastructure in spack]

Satish

On Mon, 5 Apr 2021, Mills, Richard Tran via petsc-dev wrote:

> You raise a good point, Barry. I've been completely mystified by what some of these names even mean. What does "PASCAL60" vs. "PASCAL61" even mean? Do you know where this is documented? I can't really find anything about it in the Kokkos documentation. The only thing I can find is an issue or two about "hey, shouldn't our CMake stuff figure this out automatically" and then some posts about why it can't really do that. Not encouraging.
> 
> --Richard
> 
> On 4/3/21 8:42 PM, Barry Smith wrote:
> 
> 
>   It would be very nice to NOT require PETSc users to provide this flag; how the heck will they know what it should be when we cannot automate it ourselves?
> 
>   Any ideas of how this can be determined based on the current system? NVIDIA does not help, since these "advertising" names don't seem to map trivially to information you can get from a particular GPU when you are logged into it. For example, nvidia-smi doesn't use these names directly. Is there some mapping from nvidia-smi to these names we could use? If we are serious about having a non-trivial number of users utilizing GPUs, which we need to be for the future, we cannot have these absurd demands in our installation process.
> 
>   Barry
> 
> Does spack have some magic for this we could use?