[petsc-users] Question regarding naming of fieldsplit splits
Barry Smith
bsmith at petsc.dev
Thu Aug 1 11:19:20 CDT 2024
The current code is nonsensical. We can "fix" it in a patch to the release branch (but the fix may break some current usage) by changing
if (nfields == 1) { /* a split containing a single field is named after that field */
  PetscCall(PCFieldSplitSetIS(pc, fieldNames[ifields[0]], compField));
} else { /* a split containing several fields is named by its split index i */
  PetscCall(PetscSNPrintf(splitname, sizeof(splitname), "%" PetscInt_FMT, i));
  PetscCall(PCFieldSplitSetIS(pc, splitname, compField));
}
to
/* always name the split by its index, regardless of how many fields it contains */
PetscCall(PetscSNPrintf(splitname, sizeof(splitname), "%" PetscInt_FMT, i));
PetscCall(PCFieldSplitSetIS(pc, splitname, compField));
but a "correct" fix will take some thought. The current model using a combination of some "inner" integer fieldnames and some outer fieldnames (which are whatever they are including possible integers) doesn't make any sense.
> On Aug 1, 2024, at 9:19 AM, Blauth, Sebastian <sebastian.blauth at itwm.fraunhofer.de> wrote:
>
> Hello everyone,
>
> I have a follow up on my question. I noticed the following behavior. Let’s assume I have 5 fields which I want to group with the following options:
>
> -ksp_type fgmres
> -ksp_max_it 1
> -ksp_monitor_true_residual
> -ksp_view
> -pc_type fieldsplit
> -pc_fieldsplit_type multiplicative
> -pc_fieldsplit_0_fields 0,1
> -pc_fieldsplit_1_fields 2
> -pc_fieldsplit_2_fields 3,4
> -fieldsplit_0_ksp_type preonly
> -fieldsplit_0_pc_type jacobi
> -fieldsplit_2_ksp_type preonly
> -fieldsplit_2_pc_type jacobi
>
> Then, the first split is fine, but both the second and third splits get the same prefix, i.e., "fieldsplit_2". This is shown in the -ksp_view output, which I attach below.
> The second split gets this prefix because it contains only a single field (and I chose the field's index as its name), while the third split gets the same prefix because it groups two fields, so the "outer" split index is used as its name. Is there any way to circumvent this, other than using custom names for the splits which are unique?
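>
> (Using unique custom names would of course work; just to make clear what I mean by that, here is a rough petsc4py sketch of defining the splits myself with explicit names via PCFieldSplitSetIS, assuming I had already built index sets is_01, is_2 and is_34 for the three groups of fields – these variable names and the surrounding setup are placeholders, not my actual code:)
>
> from petsc4py import PETSc
>
> # placeholder index sets for the field groups (0,1), (2,) and (3,4)
> is_01 = PETSc.IS().createGeneral(dofs_01, comm=PETSc.COMM_WORLD)
> is_2 = PETSc.IS().createGeneral(dofs_2, comm=PETSc.COMM_WORLD)
> is_34 = PETSc.IS().createGeneral(dofs_34, comm=PETSc.COMM_WORLD)
>
> pc = ksp.getPC()
> pc.setType(PETSc.PC.Type.FIELDSPLIT)
> # give every split an explicit, unique name; the option prefixes then become
> # -fieldsplit_grp0_..., -fieldsplit_grp1_..., -fieldsplit_grp2_...
> pc.setFieldSplitIS(("grp0", is_01), ("grp1", is_2), ("grp2", is_34))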
>
> Thanks for your time and best regards,
> Sebastian Blauth
>
>
> The output of “ksp_view” is the following
>
> KSP Object: 1 MPI process
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=1, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: 1 MPI process
> type: fieldsplit
> FieldSplit with MULTIPLICATIVE composition: total splits = 3
> Solver info for each split is in the following KSP objects:
> Split number 0 Defined by IS
> KSP Object: (fieldsplit_0_) 1 MPI process
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_0_) 1 MPI process
> type: jacobi
> type DIAGONAL
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_0_) 1 MPI process
> type: seqaij
> rows=243, cols=243
> total: nonzeros=4473, allocated nonzeros=4473
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 86 nodes, limit used is 5
> Split number 1 Defined by IS
> KSP Object: (fieldsplit_2_) 1 MPI process
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_2_) 1 MPI process
> type: jacobi
> type DIAGONAL
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_2_) 1 MPI process
> type: seqaij
> rows=81, cols=81
> total: nonzeros=497, allocated nonzeros=497
> total number of mallocs used during MatSetValues calls=0
> not using I-node routines
> Split number 2 Defined by IS
> KSP Object: (fieldsplit_2_) 1 MPI process
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_2_) 1 MPI process
> type: jacobi
> type DIAGONAL
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_2_) 1 MPI process
> type: seqaij
> rows=243, cols=243
> total: nonzeros=4473, allocated nonzeros=4473
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 85 nodes, limit used is 5
> linear system matrix = precond matrix:
> Mat Object: 1 MPI process
> type: seqaij
> rows=567, cols=567
> total: nonzeros=24353, allocated nonzeros=24353
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 173 nodes, limit used is 5
>
> --
> Dr. Sebastian Blauth
> Fraunhofer-Institut für
> Techno- und Wirtschaftsmathematik ITWM
> Abteilung Transportvorgänge
> Fraunhofer-Platz 1, 67663 Kaiserslautern
> Telefon: +49 631 31600-4968
> sebastian.blauth at itwm.fraunhofer.de
> https://www.itwm.fraunhofer.de
>
> From: petsc-users <petsc-users-bounces at mcs.anl.gov> On Behalf Of Blauth, Sebastian
> Sent: Tuesday, July 2, 2024 11:47 AM
> To: Matthew Knepley <knepley at gmail.com>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Question regarding naming of fieldsplit splits
>
> Hi Matt,
>
> thanks for the answer and clarification. Then I'll work around this issue in Python, where I set the options.
>
> Best,
> Sebastian
>
> --
> Dr. Sebastian Blauth
> Fraunhofer-Institut für
> Techno- und Wirtschaftsmathematik ITWM
> Abteilung Transportvorgänge
> Fraunhofer-Platz 1, 67663 Kaiserslautern
> Telefon: +49 631 31600-4968
> sebastian.blauth at itwm.fraunhofer.de
> https://www.itwm.fraunhofer.de
>
> From: Matthew Knepley <knepley at gmail.com>
> Sent: Monday, July 1, 2024 4:30 PM
> To: Blauth, Sebastian <sebastian.blauth at itwm.fraunhofer.de>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Question regarding naming of fieldsplit splits
>
> On Mon, Jul 1, 2024 at 9:48 AM Blauth, Sebastian <sebastian.blauth at itwm.fraunhofer.de> wrote:
> Dear Matt,
>
> thanks a lot for your help. Unfortunately, these extra options do not have any effect for me; I still get the "u" and "p" fieldnames. Also, this would not help me get rid of the "c" fieldname – on that level of the fieldsplit I am basically using your approach already, and it still shows up. The output of -ksp_view is unchanged, so I do not attach it here again. Maybe I misunderstood you?
>
> Oh, we make an exception for single fields, since we think you would want to use the name. I would have to add an extra option to turn off the naming.
>
> Thanks,
>
> Matt
>
> Thanks for the help and best regards,
> Sebastian
>
> --
> Dr. Sebastian Blauth
> Fraunhofer-Institut für
> Techno- und Wirtschaftsmathematik ITWM
> Abteilung Transportvorgänge
> Fraunhofer-Platz 1, 67663 Kaiserslautern
> Telefon: +49 631 31600-4968
> sebastian.blauth at itwm.fraunhofer.de
> https://www.itwm.fraunhofer.de
>
> From: Matthew Knepley <knepley at gmail.com>
> Sent: Monday, July 1, 2024 2:27 PM
> To: Blauth, Sebastian <sebastian.blauth at itwm.fraunhofer.de>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Question regarding naming of fieldsplit splits
>
> On Fri, Jun 28, 2024 at 4:05 AM Blauth, Sebastian <sebastian.blauth at itwm.fraunhofer.de> wrote:
> Hello everyone,
>
> I have a question regarding the naming convention used by PETSc's PCFieldSplit. I have been following https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037262.html to create a DMShell with FEniCS in order to customize PCFieldSplit for my application.
> I am using the following options, which work nicely for me:
>
> -ksp_type fgmres
> -pc_type fieldsplit
> -pc_fieldsplit_0_fields 0, 1
> -pc_fieldsplit_1_fields 2
> -pc_fieldsplit_type additive
> -fieldsplit_0_ksp_type fgmres
> -fieldsplit_0_pc_type fieldsplit
> -fieldsplit_0_pc_fieldsplit_type schur
> -fieldsplit_0_pc_fieldsplit_schur_fact_type full
> -fieldsplit_0_pc_fieldsplit_schur_precondition selfp
> -fieldsplit_0_fieldsplit_u_ksp_type preonly
> -fieldsplit_0_fieldsplit_u_pc_type lu
> -fieldsplit_0_fieldsplit_p_ksp_type cg
> -fieldsplit_0_fieldsplit_p_ksp_rtol 1e-14
> -fieldsplit_0_fieldsplit_p_ksp_atol 1e-30
> -fieldsplit_0_fieldsplit_p_pc_type icc
> -fieldsplit_0_ksp_rtol 1e-14
> -fieldsplit_0_ksp_atol 1e-30
> -fieldsplit_0_ksp_monitor_true_residual
> -fieldsplit_c_ksp_type preonly
> -fieldsplit_c_pc_type lu
> -ksp_view
>
> By default, we use the field names, but you can prevent this by specifying the fields by hand, so
>
> -fieldsplit_0_pc_fieldsplit_0_fields 0
> -fieldsplit_0_pc_fieldsplit_1_fields 1
>
> should remove the 'u' and 'p' fieldnames. It is somewhat hacky, but I think it is easier to remember than
> adding an extra option.
>
> Thanks,
>
> Matt
>
> Note that this is just an academic example (sorry for the low solver tolerances) to test the approach, consisting of a Stokes equation and some concentration equation (which is not even coupled to Stokes, just for testing).
> Completely analogous to https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037262.html, I translate my ISs to a PETSc Section, which is then supplied to a DMShell and assigned to a KSP. I am not so familiar with the code or with how/why this works, but it seems to do so perfectly. I name the fields of my Section with petsc4py using
>
> section.setFieldName(0, "u")
> section.setFieldName(1, "p")
> section.setFieldName(2, "c")
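>
> (For context, here is a stripped-down sketch of how I build and attach that Section, roughly following the thread linked above; the chart/dof layout and the names ndofs and field_of_dof are placeholders, and depending on the petsc4py version the call that attaches the Section may be named setDefaultSection instead of setLocalSection:)
>
> from petsc4py import PETSc
>
> sec = PETSc.Section().create(comm=PETSc.COMM_WORLD)
> sec.setNumFields(3)
> sec.setFieldName(0, "u")
> sec.setFieldName(1, "p")
> sec.setFieldName(2, "c")
>
> # placeholder layout: one "point" per dof, each point assigned to one field
> sec.setChart(0, ndofs)
> for p in range(ndofs):
>     sec.setDof(p, 1)
>     sec.setFieldDof(p, field_of_dof[p], 1)
> sec.setUp()
>
> dm = PETSc.DMShell().create(comm=PETSc.COMM_WORLD)
> dm.setLocalSection(sec)  # setDefaultSection in older petsc4py versions
>
> ksp.setDM(dm)
> ksp.setDMActive(False)  # the DM only provides the field layout, not the operators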
>
> However, this naming is also reflected in the way I access the fieldsplit options from the command line. My question is: is there any way of not using the field names specified in Python, but instead the index of the split as defined with "-pc_fieldsplit_0_fields 0, 1" and "-pc_fieldsplit_1_fields 2"? That is, instead of the prefix "fieldsplit_0_fieldsplit_u" I want to write "fieldsplit_0_fieldsplit_0", instead of "fieldsplit_0_fieldsplit_p" I want to use "fieldsplit_0_fieldsplit_1", and instead of "fieldsplit_c" I want to use "fieldsplit_1". Just changing the names of the fields to
>
> section.setFieldName(0, "0")
> section.setFieldName(1, "1")
> section.setFieldName(2, "2")
>
> obviously does not work as expected: it works for velocity and pressure, but not for the concentration – the prefix there is then "fieldsplit_2" and not "fieldsplit_1". In the docs I have found https://petsc.org/main/manualpages/PC/PCFieldSplitSetFields/, which seems to suggest that the fieldname can be supplied, but I don't see how to do so from the command line. Also, for the sake of completeness, I attach the output of the solve with "-ksp_view" below.
>
> Thanks a lot in advance and best regards,
> Sebastian
>
>
> The output of ksp_view is the following:
> KSP Object: 1 MPI processes
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-11, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
> type: fieldsplit
> FieldSplit with ADDITIVE composition: total splits = 2
> Solver info for each split is in the following KSP objects:
> Split number 0 Defined by IS
> KSP Object: (fieldsplit_0_) 1 MPI processes
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-14, absolute=1e-30, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: (fieldsplit_0_) 1 MPI processes
> type: fieldsplit
> FieldSplit with Schur preconditioner, factorization FULL
> Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse
> Split info:
> Split number 0 Defined by IS
> Split number 1 Defined by IS
> KSP solver for A00 block
> KSP Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: lu
> out-of-place factorization
> tolerance for zero pivot 2.22045e-14
> matrix ordering: nd
> factor fill ratio given 5., needed 3.92639
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=4290, cols=4290
> package used to perform factorization: petsc
> total: nonzeros=375944, allocated nonzeros=375944
> using I-node routines: found 2548 nodes, limit used is 5
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: seqaij
> rows=4290, cols=4290
> total: nonzeros=95748, allocated nonzeros=95748
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 3287 nodes, limit used is 5
> KSP solver for S = A11 - A10 inv(A00) A01
> KSP Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
> type: cg
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-14, absolute=1e-30, divergence=10000.
> left preconditioning
> using PRECONDITIONED norm type for convergence test
> PC Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
> type: icc
> out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> using Manteuffel shift [POSITIVE_DEFINITE]
> matrix ordering: natural
> factor fill ratio given 1., needed 1.
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqsbaij
> rows=561, cols=561
> package used to perform factorization: petsc
> total: nonzeros=5120, allocated nonzeros=5120
> block size is 1
> linear system matrix followed by preconditioner matrix:
> Mat Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
> type: schurcomplement
> rows=561, cols=561
> Schur complement A11 - A10 inv(A00) A01
> A11
> Mat Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
> type: seqaij
> rows=561, cols=561
> total: nonzeros=3729, allocated nonzeros=3729
> total number of mallocs used during MatSetValues calls=0
> not using I-node routines
> A10
> Mat Object: 1 MPI processes
> type: seqaij
> rows=561, cols=4290
> total: nonzeros=19938, allocated nonzeros=19938
> total number of mallocs used during MatSetValues calls=0
> not using I-node routines
> KSP of A00
> KSP Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: lu
> out-of-place factorization
> tolerance for zero pivot 2.22045e-14
> matrix ordering: nd
> factor fill ratio given 5., needed 3.92639
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=4290, cols=4290
> package used to perform factorization: petsc
> total: nonzeros=375944, allocated nonzeros=375944
> using I-node routines: found 2548 nodes, limit used is 5
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
> type: seqaij
> rows=4290, cols=4290
> total: nonzeros=95748, allocated nonzeros=95748
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 3287 nodes, limit used is 5
> A01
> Mat Object: 1 MPI processes
> type: seqaij
> rows=4290, cols=561
> total: nonzeros=19938, allocated nonzeros=19938
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 3287 nodes, limit used is 5
> Mat Object: 1 MPI processes
> type: seqaij
> rows=561, cols=561
> total: nonzeros=9679, allocated nonzeros=9679
> total number of mallocs used during MatSetValues calls=0
> not using I-node routines
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_0_) 1 MPI processes
> type: seqaij
> rows=4851, cols=4851
> total: nonzeros=139353, allocated nonzeros=139353
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 3830 nodes, limit used is 5
> Split number 1 Defined by IS
> KSP Object: (fieldsplit_c_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_c_) 1 MPI processes
> type: lu
> out-of-place factorization
> tolerance for zero pivot 2.22045e-14
> matrix ordering: nd
> factor fill ratio given 5., needed 4.24323
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=561, cols=561
> package used to perform factorization: petsc
> total: nonzeros=15823, allocated nonzeros=15823
> not using I-node routines
> linear system matrix = precond matrix:
> Mat Object: (fieldsplit_c_) 1 MPI processes
> type: seqaij
> rows=561, cols=561
> total: nonzeros=3729, allocated nonzeros=3729
> total number of mallocs used during MatSetValues calls=0
> not using I-node routines
> linear system matrix = precond matrix:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=5412, cols=5412
> total: nonzeros=190416, allocated nonzeros=190416
> total number of mallocs used during MatSetValues calls=0
> using I-node routines: found 3833 nodes, limit used is 5
>
> --
> Dr. Sebastian Blauth
> Fraunhofer-Institut für
> Techno- und Wirtschaftsmathematik ITWM
> Abteilung Transportvorgänge
> Fraunhofer-Platz 1, 67663 Kaiserslautern
> Telefon: +49 631 31600-4968
> sebastian.blauth at itwm.fraunhofer.de
> https://www.itwm.fraunhofer.de
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/