[petsc-users] Inquiry about configuring PETSc with AOCL BLAS/LAPACK

Barry Smith bsmith at petsc.dev
Mon Aug 11 18:33:51 CDT 2025



> On Aug 11, 2025, at 5:26 PM, Yongzhong Li via petsc-users <petsc-users at mcs.anl.gov> wrote:
> 
> Thank you Matt!
> 
> It appears that the HIP implementation is designed for sparse matvec on AMD GPUs, similar to CUDA for NVIDIA GPUs; is that correct?
>  
> We are primarily focusing on CPUs with AOCL. It would be great if PETSc offered an AOCL AIJ class. We could use AOCL in the same manner as we use MKL.
> 
> In the meantime, I wonder if I can simply use MATAIJ-type matrices and configure PETSc with AOCL during compilation. Will the MatMult() API then use AOCL BLAS as the backend?

   BLAS (and LAPACK) are for vectors and dense matrices, so the quality (or lack thereof) of the BLAS and LAPACK does not have any effect on the MATAIJ operations, which are sparse matrix operations implemented by PETSc itself.
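
   For what it's worth, here is a minimal sketch (just an illustration, not code from this thread; the toy diagonal matrix and its size are made up) of using a MATAIJ matrix with MatMult(); none of these calls go through BLAS:

      #include <petscmat.h>

      int main(int argc, char **argv)
      {
        Mat      A;
        Vec      x, y;
        PetscInt n = 10, i;

        PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

        /* Create a sparse AIJ matrix; its MatMult() uses PETSc's own sparse kernels, not BLAS */
        PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
        PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
        PetscCall(MatSetType(A, MATAIJ));
        PetscCall(MatSetUp(A));
        for (i = 0; i < n; i++) PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
        PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
        PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

        /* y = A x */
        PetscCall(MatCreateVecs(A, &x, &y));
        PetscCall(VecSet(x, 1.0));
        PetscCall(MatMult(A, x, y));

        PetscCall(MatDestroy(&A));
        PetscCall(VecDestroy(&x));
        PetscCall(VecDestroy(&y));
        PetscCall(PetscFinalize());
        return 0;
      }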

   So yes, use AOCL BLAS and MATAIJ for your sparse matrices.
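
   If AOCL installs its BLAS and LAPACK under one prefix, a configure line along these lines should pick them up (the install path is just a placeholder for your system):

      ./configure --with-blaslapack-dir=$AOCLROOT ...

   and if the BLAS and LAPACK libraries live in separate directories, you can list them explicitly instead (the paths and library names below, libflame and libblis-mt, are only examples; check what your AOCL installation actually provides):

      ./configure --with-blaslapack-lib='[/path/to/aocl/lib/libflame.so,/path/to/aocl/lib/libblis-mt.so]' ...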

> 
> Thanks,
> Yongzhong
> 
>  
> 
>  
> From: Matthew Knepley <knepley at gmail.com>
> Date: Monday, August 11, 2025 at 12:44 PM
> To: Yongzhong Li <yongzhong.li at mail.utoronto.ca>
> Cc: petsc-users at mcs.anl.gov, petsc-maint at mcs.anl.gov, Jasper Hatton <jasper.hatton at mail.utoronto.ca>, Piero Triverio <piero.triverio at utoronto.ca>, Atacan Tuhan <a.tuhan at mail.utoronto.ca>
> Subject: Re: [petsc-users] Inquiry about configuring PETSc with AOCL BLAS/LAPACK
> 
> On Mon, Aug 11, 2025 at 12:32 PM Yongzhong Li <yongzhong.li at mail.utoronto.ca> wrote:
> Dear PETSc developers,
> 
> Hi, I am a user of PETSc. I have some questions about how we can configure PETSc with AOCL BLAS and LAPACK.
>  
> Previously, we linked PETSc with Intel MKL BLAS. That gave us much better multithreading performance for sparse matrix-vector products than configuring PETSc with OpenBLAS. Now that our compute nodes have been upgraded to AMD CPUs, we are considering switching from Intel MKL to AMD AOCL.
> 
> My questions are:
>  
> If we configure PETSc at compile time with --with-blaslapack-dir=$AOCLROOT, will we be able to use AOCL BLAS as the backend for the PETSc MatMult() API?
>  
> Do you mean use the AMD sparse matvec? We have a HIP implementation (https://petsc.org/main/manualpages/Mat/MATAIJHIPSPARSE/), but nothing for AOCL comparable to the AIJMKL class. If you think we need it, we would certainly help implement it.
>  
>  
> What if AOCL BLAS and AOCL LAPACK are installed in two different directories, not under AOCLROOT?
>  
> You would use --with-blaslapack-lib=[liblist]
>   
>  
> PETSc has the MATAIJMKL type for sparse matrices stored in Intel MKL's format. Does PETSc also have a corresponding type for AMD AOCL?
>  
> No, but it would be straightforward to add.
>  
>   Thanks,
>  
>      Matt
>  
> Thanks!
> Yongzhong
> 
>  
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>  
> https://www.cse.buffalo.edu/~knepley/