[petsc-users] Auto sparsity detection?
Zou, Ling
lzou at anl.gov
Sat Jan 18 10:48:48 CST 2025
Thank you, both Hong and Matt.
-Ling
From: Zhang, Hong <hongzhang at anl.gov>
Date: Friday, January 17, 2025 at 12:34 PM
To: Matthew Knepley <knepley at gmail.com>, Zou, Ling <lzou at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Auto sparsity detection?
We have an example in src/ts/tutorials/autodiff on using AD for reaction-diffusion equations. It does exactly what Matt said - differentiating the stencil kernel to get the Jacobian kernel. More information is available in this report: https://arxiv.org/abs/1909.02836
Hong (Mr.)
________________________________
From: petsc-users <petsc-users-bounces at mcs.anl.gov> on behalf of Matthew Knepley <knepley at gmail.com>
Sent: Friday, January 17, 2025 6:22 AM
To: Zou, Ling <lzou at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Auto sparsity detection?
On Thu, Jan 16, 2025 at 10:43 PM Zou, Ling <lzou at anl.gov> wrote:
Thank you, Matt.
It seems that, at least for the matrix coloring part, I am following the ‘best practice’.
Yes, for FD approximations of the Jacobian.
If you have a stencil operation (like FEM or FVM), then AD can be very useful because you
only have to differentiate the kernel to get the Jacobian kernel.
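To make that concrete with a hand-written illustration (the kernel below and its derivative are hypothetical stand-ins, not the tutorial code; in src/ts/tutorials/autodiff an AD tool generates the derivative kernel instead of it being written by hand):

#include <petscsys.h>

/* Illustrative pointwise residual kernel for a 1D reaction-diffusion stencil:
   F_i = -(u_{i-1} - 2 u_i + u_{i+1})/h^2 + u_i^2 */
static PetscScalar ResidualKernel(PetscScalar ul, PetscScalar uc, PetscScalar ur, PetscReal h)
{
  return -(ul - 2.0*uc + ur)/(h*h) + uc*uc;
}

/* Differentiating only that kernel yields the Jacobian kernel: the three
   nonzeros of row i, namely dF_i/du_{i-1}, dF_i/du_i, dF_i/du_{i+1} */
static void JacobianKernel(PetscScalar uc, PetscReal h, PetscScalar row[3])
{
  row[0] = -1.0/(h*h);            /* dF_i/du_{i-1} */
  row[1] =  2.0/(h*h) + 2.0*uc;   /* dF_i/du_i     */
  row[2] = -1.0/(h*h);            /* dF_i/du_{i+1} */
}

The stencil also gives you the sparsity for free: row i couples only to columns i-1, i, i+1, which is exactly the information a coloring needs.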
Thanks,
Matt
-Ling
From: Matthew Knepley <knepley at gmail.com>
Date: Thursday, January 16, 2025 at 9:01 PM
To: Zou, Ling <lzou at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Auto sparsity detection?
On Thu, Jan 16, 2025 at 9:50 PM Zou, Ling via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hi all,
Does PETSc have some automatic matrix sparsity detection algorithm available?
Something like: https://docs.sciml.ai/NonlinearSolve/stable/basics/sparsity_detection/
Sparsity detection would rely on introspection of the user code for ComputeFunction(), which is not
possible in C (unless you were to code up your evaluation in some symbolic framework).
The background is that I use finite differencing plus matrix coloring to (efficiently) get the Jacobian.
For the matrix coloring part, I color the matrix based on mesh connectivity and variable dependencies, which is not bad, but I am just trying to be lazy and eliminate even this part.
This is how the automatic frameworks work as well, and it is how we compute the sparsity pattern for PetscFE and PetscFV.
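For reference, a minimal sketch of the standard colored finite-difference setup in PETSc; the calls are the usual PETSc API, while SetupColoredFDJacobian, FormFunction, and ctx are placeholder names, and J is assumed to already be preallocated with the sparsity pattern coming from the mesh connectivity described above:

#include <petscsnes.h>

/* Sketch: color the (already preallocated) nonzero pattern of J and let SNES
   compute the Jacobian by colored finite differences.
   FormFunction/ctx stand in for the user's residual routine and context. */
static PetscErrorCode SetupColoredFDJacobian(SNES snes, Mat J,
                                             PetscErrorCode (*FormFunction)(SNES, Vec, Vec, void *),
                                             void *ctx)
{
  MatColoring   mc;
  ISColoring    iscoloring;
  MatFDColoring fdcoloring;

  PetscFunctionBeginUser;
  /* Color the columns of J based on its nonzero pattern */
  PetscCall(MatColoringCreate(J, &mc));
  PetscCall(MatColoringSetType(mc, MATCOLORINGSL));
  PetscCall(MatColoringSetFromOptions(mc));
  PetscCall(MatColoringApply(mc, &iscoloring));
  PetscCall(MatColoringDestroy(&mc));

  /* Use the coloring to drive finite-difference Jacobian evaluation */
  PetscCall(MatFDColoringCreate(J, iscoloring, &fdcoloring));
  PetscCall(MatFDColoringSetFunction(fdcoloring, (PetscErrorCode (*)(void))FormFunction, ctx));
  PetscCall(MatFDColoringSetFromOptions(fdcoloring));
  PetscCall(MatFDColoringSetUp(J, iscoloring, fdcoloring));
  PetscCall(ISColoringDestroy(&iscoloring));

  /* The caller destroys fdcoloring with MatFDColoringDestroy() after the solve */
  PetscCall(SNESSetJacobian(snes, J, J, SNESComputeJacobianDefaultColor, fdcoloring));
  PetscFunctionReturn(PETSC_SUCCESS);
}

When a DM supplies the nonzero pattern, the option -snes_fd_color gives the same behavior from the command line without any of this code.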
A related but different question: to what extent does PETSc support automatic differentiation?
I see some old paper:
https://ftp.mcs.anl.gov/pub/tech_reports/reports/P922.pdf
and discussion in the roadmap:
https://petsc.org/release/community/roadmap/
I am thinking that if AD works, I would not even need the finite-difference Jacobian, or could keep it as another option.
Other people understand that better than I do.
Thanks,
Matt
Best,
-Ling
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/