[petsc-users] [EXTERNAL] Re: Running PETSc with a Kokkos backend on OLCF Frontier
Junchao Zhang
junchao.zhang at gmail.com
Fri Jun 14 14:58:22 CDT 2024
Arpan,
Nice to meet you. -log_view is a PETSc option, so I think you can add it
to your fvSolution_petsc_pKok_Uof at a location like
mat_type mpiaijkokkos;
vec_type kokkos;
log_view;
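
For reference, here is a minimal sketch of the petsc solver block in
fvSolution with that option added, assuming the usual petsc4Foam layout.
The ksp_type and pc_type values below are placeholders, not taken from
your attached file, and petsc4Foam may want an explicit value for a
flag-style option, e.g. log_view "";

solvers
{
    p
    {
        solver      petsc;

        petsc
        {
            options
            {
                ksp_type    cg;            // placeholder Krylov method
                pc_type     bjacobi;       // placeholder preconditioner
                mat_type    mpiaijkokkos;  // Kokkos-backed matrix
                vec_type    kokkos;        // Kokkos-backed vectors
                log_view    "";            // print PETSc performance summary
            }
        }
    }
}

Note that the -log_view summary is printed when PETSc finalizes, so it
should appear near the end of the solver output.
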
--Junchao Zhang
On Fri, Jun 14, 2024 at 2:44 PM Sircar, Arpan <sircara at ornl.gov> wrote:
> Hi Junchao and Barry,
>
> I tried adding -log_view to my OpenFOAM command since that is what I am
> running the entire package through. However, it does not recognize that as
> a valid option. I am not sure where to put that tag in this setup.
> I am, however, using a Kokkos profiling tool, the output of which (for the
> GPU run) is attached to this email. Do let me know if this is useful, or
> if you have ideas about where to put the -log_view tag.
>
> Junchao - Nice running into you here.
>
> Thanks,
> Arpan
> ------------------------------
> *From:* Junchao Zhang <junchao.zhang at gmail.com>
> *Sent:* Friday, June 14, 2024 3:16 PM
> *To:* Sircar, Arpan <sircara at ornl.gov>
> *Cc:* Barry Smith <bsmith at petsc.dev>; petsc-users at mcs.anl.gov <
> petsc-users at mcs.anl.gov>; Gottiparthi, Kalyan <gottiparthik at ornl.gov>
> *Subject:* Re: [petsc-users] [EXTERNAL] Re: Running PETSc with a Kokkos
> backend on OLCF Frontier
>
> Arpan,
> Did you add -log_view ?
>
> --Junchao Zhang
>
>
> On Fri, Jun 14, 2024 at 2:00 PM Sircar, Arpan via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hi Barry,
>
> Thanks for your prompt response. These are run with the same PETSc
> solvers but the one on GPUs (log_pKok) uses mataijkokkos while the other
> one does not.
>
> Please let me know if you need any other information.
>
> Thanks,
> Arpan
> ------------------------------
> *From:* Barry Smith <bsmith at petsc.dev>
> *Sent:* Friday, June 14, 2024 1:47 PM
> *To:* Sircar, Arpan <sircara at ornl.gov>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>; Gottiparthi,
> Kalyan <gottiparthik at ornl.gov>
> *Subject:* [EXTERNAL] Re: [petsc-users] Running PETSc with a Kokkos
> backend on OLCF Frontier
>
>
> Please run both the CPU solvers and GPU solvers cases with -log_view
> and send the two outputs.
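>
>    If the OpenFOAM launcher does not accept the flag on the command line,
> PETSc also reads the PETSC_OPTIONS environment variable at startup, so
> setting, for example,
>
>    export PETSC_OPTIONS="-log_view"
>
> before launching each run should have the same effect.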
>
> Barry
>
>
> On Jun 14, 2024, at 1:35 PM, Sircar, Arpan via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hi,
>
> We have been working with OpenFOAM (an open-source CFD software) which can
> transfer its matrices to PETSc to use its linear solvers. This has been
> tested and is working well on OLCF's Frontier machine. Next we are trying
> to use the Kokkos backend to run it on Frontier GPUs. While the
> OpenFOAM+PETSc+Kokkos environment builds correctly on Frontier with the
> sourced modules (attached file *bash_petsc4foam_gpu*) and our PETSc
> configuration (attached file *config_gpu*), the GPU solve seems to take
> more time than the CPU solve.
>
> The PETSc run-time options we are using are attached to this email (file
> *fvSolution_petsc_pKok_Uof*). Could you please take a look and let us
> know if this combination of options is fine? In this approach we are trying
> to solve only the pressure equation on the GPUs.
>
> Thanks,
> Arpan
>
> *Arpan Sircar*
> R&D Associate Staff
> Thermal Hydraulics Group
> Nuclear Energy and Fuel Cycle Division
> *Oak Ridge National Laboratory*
>
> <fvSolution_petsc_pKok_Uof><bash_petsc4foam_gpu><config_gpu>
>
>
>