[petsc-users] Problem setting Fieldsplit fields

Nicolas Barnafi nbarnafi at cmm.uchile.cl
Tue Feb 14 07:36:34 CST 2023


Hello Matt,

After some discussions elsewhere (thanks @LMitchell!), we found out that 
the problem is that the fields are set up with PCFieldSplitSetIS(), 
without an attached DM, and that code path does not support this kind of 
field nesting.

I would like to add this feature: during the setup of the 
preconditioner, when there is no DM, an additional routine should read 
the -pc_fieldsplit_%d_fields options and assign the corresponding fields 
to the sub-PCs, in the order given.
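
A rough sketch of what I have in mind is below. The helper and its
signature are hypothetical; only PetscOptionsGetStringArray, PetscStrcmp,
PetscCheck and ISConcatenate are existing PETSc calls, and the fixed
bounds are purely illustrative:

   /* Hypothetical helper, not existing PETSc API: given the named IS
      splits already attached to the PC, read -pc_fieldsplit_<i>_fields
      and concatenate the named ISs into the i-th outer split. */
   static PetscErrorCode MergeNamedSplits(MPI_Comm comm, PetscInt nsplits,
                                          const char *const names[],
                                          const IS islist[], PetscInt i,
                                          IS *merged)
   {
     char      *entries[16]; /* illustrative fixed bound */
     PetscInt   nentries = 16, j, k;
     IS         picked[16];
     char       opt[64];
     PetscBool  set;

     PetscFunctionBegin;
     PetscCall(PetscSNPrintf(opt, sizeof(opt),
                             "-pc_fieldsplit_%" PetscInt_FMT "_fields", i));
     PetscCall(PetscOptionsGetStringArray(NULL, NULL, opt, entries,
                                          &nentries, &set));
     if (!set) PetscFunctionReturn(0);
     for (j = 0; j < nentries; j++) {
       PetscBool found = PETSC_FALSE;
       /* Match each requested name against the IS splits set by the user */
       for (k = 0; k < nsplits; k++) {
         PetscCall(PetscStrcmp(entries[j], names[k], &found));
         if (found) { picked[j] = islist[k]; break; }
       }
       PetscCheck(found, comm, PETSC_ERR_ARG_WRONG,
                  "Unknown split name %s", entries[j]);
       PetscCall(PetscFree(entries[j]));
     }
     PetscCall(ISConcatenate(comm, nentries, picked, merged));
     PetscFunctionReturn(0);
   }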

My plan is to do this at the PCSetUp_FieldSplit level. The idea is that 
whenever IS fields such as 'a' and 'b' have been set, it should be 
possible to group them with '-pc_fieldsplit_0_fields a,b', or at least 
to support this for numbered fields.
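
For concreteness, here is the usage pattern I mean as a self-contained
toy; the diagonal matrix and the stride index sets only stand in for the
real application:

   #include <petscksp.h>

   int main(int argc, char **argv)
   {
     Mat      A;
     Vec      x, b;
     KSP      ksp;
     PC       pc;
     IS       isa, isb;
     PetscInt i, n = 10;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
     /* Toy diagonal operator standing in for the application matrix */
     PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                            n, n, 1, NULL, 0, NULL, &A));
     for (i = 0; i < n; i++) PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
     PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
     PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
     /* Two IS-defined fields: dofs 0..5 ('a') and 6..9 ('b') */
     PetscCall(ISCreateStride(PETSC_COMM_WORLD, 6, 0, 1, &isa));
     PetscCall(ISCreateStride(PETSC_COMM_WORLD, 4, 6, 1, &isb));
     PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
     PetscCall(KSPSetOperators(ksp, A, A));
     PetscCall(KSPGetPC(ksp, &pc));
     PetscCall(PCSetType(pc, PCFIELDSPLIT));
     PetscCall(PCFieldSplitSetIS(pc, "a", isa));
     PetscCall(PCFieldSplitSetIS(pc, "b", isb));
     /* Today -pc_fieldsplit_0_fields a,b has no effect here (no DM);
        the proposal is to honor it during PCSetUp_FieldSplit. */
     PetscCall(KSPSetFromOptions(ksp));
     PetscCall(MatCreateVecs(A, &x, &b));
     PetscCall(VecSet(b, 1.0));
     PetscCall(KSPSolve(ksp, b, x));
     PetscCall(ISDestroy(&isa));
     PetscCall(ISDestroy(&isb));
     PetscCall(VecDestroy(&x));
     PetscCall(VecDestroy(&b));
     PetscCall(MatDestroy(&A));
     PetscCall(KSPDestroy(&ksp));
     PetscCall(PetscFinalize());
     return 0;
   }

Running this with '-pc_fieldsplit_0_fields a,b -ksp_view' would then show
a single outer split containing both fields.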

What do you think?

Best,
NB

On 06/02/23 17:57, Matthew Knepley wrote:
> On Mon, Feb 6, 2023 at 11:45 AM Nicolas Barnafi 
> <nbarnafi at cmm.uchile.cl> wrote:
>
>     Thank you Matt,
>
>     Again, at the bottom of this message you will find the -info
>     output. I don't see any output related to the fields.
>
>
> If the splits were done automatically, you would see an info message 
> from here:
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L1595
>
> Thus it must be set up here:
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L380
>
> There are info statements there, but you do not see them. I do not see 
> a way around using a small example
> to understand how you are setting up the system, since things work 
> as expected in the PETSc examples.
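>
> As an aside, recent PETSc versions can filter the -info output by class,
> which makes logs like the one below much easier to scan. Assuming your
> PETSc is new enough, something like
>
>   -info :pc,ksp
>
> keeps only the <pc> and <ksp> statements.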
>
>   Thanks,
>
>       Matt
>
>     Best
>
>
>     ------ -info
>
>     [0] <sys> PetscDetermineInitialFPTrap(): Floating point trapping
>     is on by default 13
>     [0] <sys> PetscDeviceInitializeTypeFromOptions_Private():
>     PetscDeviceType host available, initializing
>     [0] <sys> PetscDeviceInitializeTypeFromOptions_Private():
>     PetscDevice host initialized, default device id 0, view FALSE,
>     init type lazy
>     [0] <sys> PetscDeviceInitializeTypeFromOptions_Private():
>     PetscDeviceType cuda not available
>     [0] <sys> PetscDeviceInitializeTypeFromOptions_Private():
>     PetscDeviceType hip not available
>     [0] <sys> PetscDeviceInitializeTypeFromOptions_Private():
>     PetscDeviceType sycl not available
>     [0] <sys> PetscInitialize_Common(): PETSc successfully started:
>     number of processors = 1
>     [0] <sys> PetscGetHostName(): Rejecting domainname, likely is NIS
>     nico-santech.(none)
>     [0] <sys> PetscInitialize_Common(): Running on machine: nico-santech
>     [0] <sys> SlepcInitialize(): SLEPc successfully started
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770087780768
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066936960 is being unlinked from inner PETSc comm 94770087780768
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770087780768
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770087780768
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066936960 is being unlinked from inner PETSc comm 94770087780768
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770087780768
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770087780768
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066936960 is being unlinked from inner PETSc comm 94770087780768
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770087780768
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770087780768
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066936960 is being unlinked from inner PETSc comm 94770087780768
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770087780768
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770087780768
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066936960 is being unlinked from inner PETSc comm 94770087780768
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770087780768
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066936960 94770087780768 max tags = 2147483647
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 1219 X 1219;
>     storage space: 0 unneeded,26443 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 150
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 1219) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 1160 nodes out of 1219
>     rows. Not using Inode routines
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066936960 94770087780768
>     [0] <sys> PetscGetHostName(): Rejecting domainname, likely is NIS
>     nico-santech.(none)
>     [0] <pc> PCSetUp(): Setting up PC for first time
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 615 X 615; storage
>     space: 0 unneeded,9213 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 117
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 615) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 561 nodes out of 615 rows.
>     Not using Inode routines
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066934048 94770110251424 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770110251424
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066934048 is being unlinked from inner PETSc comm 94770110251424
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770110251424
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770110251424
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 64 X 64; storage
>     space: 0 unneeded,0 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     64)/(num_localrows 64) > 0.6. Use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 13 nodes of 64. Limit used:
>     5. Using Inode routines
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066934048 94770100861088 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770100861088
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066934048 is being unlinked from inner PETSc comm 94770100861088
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770100861088
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770100861088
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 240 X 240; storage
>     space: 0 unneeded,2140 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 11
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 240) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 235 nodes out of 240 rows.
>     Not using Inode routines
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066934048 94770100861088 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770100861088
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066934048 is being unlinked from inner PETSc comm 94770100861088
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770100861088
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770100861088
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 300 X 300; storage
>     space: 0 unneeded,2292 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 33
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 300) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 300 nodes out of 300 rows.
>     Not using Inode routines
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066934048 94770100861088 max tags = 2147483647
>     [0] <sys> Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
>     PETSc communicator embedded in a user MPI_Comm 94770100861088
>     [0] <sys> Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
>     94770066934048 is being unlinked from inner PETSc comm 94770100861088
>     [0] <sys> PetscCommDestroy(): Deleting PETSc MPI_Comm 94770100861088
>     [0] <sys> Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
>     an MPI_Comm 94770100861088
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 615 X 1219;
>     storage space: 0 unneeded,11202 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 150
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 615) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 561 nodes out of 615 rows.
>     Not using Inode routines
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 64 X 1219; storage
>     space: 0 unneeded,288 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 6
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 64) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 64 nodes out of 64 rows.
>     Not using Inode routines
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 240 X 1219;
>     storage space: 0 unneeded,8800 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 78
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 240) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 235 nodes out of 240 rows.
>     Not using Inode routines
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Matrix size: 300 X 1219;
>     storage space: 0 unneeded,6153 used
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Number of mallocs during
>     MatSetValues() is 0
>     [0] <mat> MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 89
>     [0] <mat> MatCheckCompressedRow(): Found the ratio (num_zerorows
>     0)/(num_localrows 300) < 0.6. Do not use CompressedRow routines.
>     [0] <mat> MatSeqAIJCheckInode(): Found 300 nodes out of 300 rows.
>     Not using Inode routines
>     [0] <sys> PetscCommDuplicate(): Duplicating a communicator
>     94770066934048 94770100861088 max tags = 2147483647
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>       0 KSP Residual norm 2.541418258630e+01
>     [0] <ksp> KSPConvergedDefault(): user has provided nonzero initial
>     guess, computing 2-norm of RHS
>     [0] <pc> PCSetUp(): Leaving PC with identical preconditioner since
>     operator is unchanged
>     [0] <pc> PCSetUp(): Leaving PC with identical preconditioner since
>     operator is unchanged
>     [0] <pc> PCSetUp(): Setting up PC for first time
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <pc> PCSetUp(): Leaving PC with identical preconditioner since
>     operator is unchanged
>     [0] <pc> PCSetUp(): Setting up PC for first time
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>     [0] <sys> PetscCommDuplicate(): Using internal PETSc communicator
>     94770066934048 94770100861088
>
>
>     On 03/02/23 21:11, Matthew Knepley wrote:
>>     On Fri, Feb 3, 2023 at 3:03 PM Nicolas Barnafi
>>     <nbarnafi at cmm.uchile.cl> wrote:
>>
>>         > There are a number of common errors:
>>         >
>>         >    1) Your PC has a prefix
>>         >
>>         >    2) You have not called KSPSetFromOptions() here
>>         >
>>         > Can you send the -ksp_view output?
>>
>>         The PC at least has no prefix. I had to set ksp_rtol to 1 to
>>         get through the solution process; you will find both the
>>         petsc_rc and the ksp_view output at the bottom of this
>>         message.
>>
>>         Options are indeed being set from the options file, but there
>>         must be
>>         something missing at a certain level. Thanks for looking into
>>         this.
>>
>>
>>     Okay, the next step is to pass
>>
>>       -info
>>
>>     and send the output. This will tell us how the default splits
>>     were done. If that
>>     is not conclusive, we will have to use the debugger.
>>
>>       Thanks,
>>
>>          Matt
>>
>>         Best
>>
>>         ---- petsc_rc file
>>
>>         -ksp_monitor
>>         -ksp_type gmres
>>         -ksp_view
>>         -mat_type aij
>>         -ksp_norm_type unpreconditioned
>>         -ksp_atol 1e-14
>>         -ksp_rtol 1
>>         -pc_type fieldsplit
>>         -pc_fieldsplit_type multiplicative
>>
>>         ---- ksp_view
>>
>>         KSP Object: 1 MPI process
>>            type: gmres
>>              restart=500, using Classical (unmodified) Gram-Schmidt
>>         Orthogonalization with no iterative refinement
>>              happy breakdown tolerance 1e-30
>>            maximum iterations=10000, nonzero initial guess
>>            tolerances:  relative=1., absolute=1e-14, divergence=10000.
>>            right preconditioning
>>            using UNPRECONDITIONED norm type for convergence test
>>         PC Object: 1 MPI process
>>            type: fieldsplit
>>              FieldSplit with MULTIPLICATIVE composition: total splits = 4
>>              Solver info for each split is in the following KSP objects:
>>            Split number 0 Defined by IS
>>            KSP Object: (fieldsplit_0_) 1 MPI process
>>              type: preonly
>>              maximum iterations=10000, initial guess is zero
>>              tolerances:  relative=1e-05, absolute=1e-50,
>>         divergence=10000.
>>              left preconditioning
>>              using DEFAULT norm type for convergence test
>>            PC Object: (fieldsplit_0_) 1 MPI process
>>              type: ilu
>>              PC has not been set up so information may be incomplete
>>                out-of-place factorization
>>                0 levels of fill
>>                tolerance for zero pivot 2.22045e-14
>>                matrix ordering: natural
>>                matrix solver type: petsc
>>                matrix not yet factored; no additional information
>>         available
>>              linear system matrix = precond matrix:
>>              Mat Object: (fieldsplit_0_) 1 MPI process
>>                type: seqaij
>>                rows=615, cols=615
>>                total: nonzeros=9213, allocated nonzeros=9213
>>                total number of mallocs used during MatSetValues calls=0
>>                  not using I-node routines
>>            Split number 1 Defined by IS
>>            KSP Object: (fieldsplit_1_) 1 MPI process
>>              type: preonly
>>              maximum iterations=10000, initial guess is zero
>>              tolerances:  relative=1e-05, absolute=1e-50,
>>         divergence=10000.
>>              left preconditioning
>>              using DEFAULT norm type for convergence test
>>            PC Object: (fieldsplit_1_) 1 MPI process
>>              type: ilu
>>              PC has not been set up so information may be incomplete
>>                out-of-place factorization
>>                0 levels of fill
>>                tolerance for zero pivot 2.22045e-14
>>                matrix ordering: natural
>>                matrix solver type: petsc
>>                matrix not yet factored; no additional information
>>         available
>>              linear system matrix = precond matrix:
>>              Mat Object: (fieldsplit_1_) 1 MPI process
>>                type: seqaij
>>                rows=64, cols=64
>>                total: nonzeros=0, allocated nonzeros=0
>>                total number of mallocs used during MatSetValues calls=0
>>                  using I-node routines: found 13 nodes, limit used is 5
>>            Split number 2 Defined by IS
>>            KSP Object: (fieldsplit_2_) 1 MPI process
>>              type: preonly
>>              maximum iterations=10000, initial guess is zero
>>              tolerances:  relative=1e-05, absolute=1e-50,
>>         divergence=10000.
>>              left preconditioning
>>              using DEFAULT norm type for convergence test
>>            PC Object: (fieldsplit_2_) 1 MPI process
>>              type: ilu
>>              PC has not been set up so information may be incomplete
>>                out-of-place factorization
>>                0 levels of fill
>>                tolerance for zero pivot 2.22045e-14
>>                matrix ordering: natural
>>                matrix solver type: petsc
>>                matrix not yet factored; no additional information
>>         available
>>              linear system matrix = precond matrix:
>>              Mat Object: (fieldsplit_2_) 1 MPI process
>>                type: seqaij
>>                rows=240, cols=240
>>                total: nonzeros=2140, allocated nonzeros=2140
>>                total number of mallocs used during MatSetValues calls=0
>>                  not using I-node routines
>>            Split number 3 Defined by IS
>>            KSP Object: (fieldsplit_3_) 1 MPI process
>>              type: preonly
>>              maximum iterations=10000, initial guess is zero
>>              tolerances:  relative=1e-05, absolute=1e-50,
>>         divergence=10000.
>>              left preconditioning
>>              using DEFAULT norm type for convergence test
>>            PC Object: (fieldsplit_3_) 1 MPI process
>>              type: ilu
>>              PC has not been set up so information may be incomplete
>>                out-of-place factorization
>>                0 levels of fill
>>                tolerance for zero pivot 2.22045e-14
>>                matrix ordering: natural
>>                matrix solver type: petsc
>>                matrix not yet factored; no additional information
>>         available
>>              linear system matrix = precond matrix:
>>              Mat Object: (fieldsplit_3_) 1 MPI process
>>                type: seqaij
>>                rows=300, cols=300
>>                total: nonzeros=2292, allocated nonzeros=2292
>>                total number of mallocs used during MatSetValues calls=0
>>                  not using I-node routines
>>            linear system matrix = precond matrix:
>>            Mat Object: 1 MPI process
>>              type: seqaij
>>              rows=1219, cols=1219
>>              total: nonzeros=26443, allocated nonzeros=26443
>>              total number of mallocs used during MatSetValues calls=0
>>                not using I-node routines
>>                       solving time: 0.00449609
>>                         iterations: 0
>>                    estimated error: 25.4142
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

