[petsc-users] Coordinate format internal reordering

Enrico degregori at dkrz.de
Thu Oct 19 12:46:04 CDT 2023


Sorry, but I don't want another partition; PETSc is internally changing
the partition. I would like to keep the same partition that I have in
the application. Is the example not clear?

On 19/10/2023 19:43, Matthew Knepley wrote:
> On Thu, Oct 19, 2023 at 1:00 PM Enrico <degregori at dkrz.de> wrote:
> 
>     Here is a very simple reproducer of my problem. It is a Fortran
>     program and it has to be run with 2 processes.
> 
> 
> You seem to be saying that you start with one partition of your data,
> but you would like another partition. For that, you have to
> communicate initially, and I would use VecScatter for it. However,
> since most data is generated, I would consider not generating my data
> in that initial distribution.
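>
> As a minimal sketch (untested), reusing b, Lsize, and glb_index from
> the reproducer below, a VecScatter can pull each rank's values out of
> the PETSc-ordered vector in the application's own order:
>
>     IS         :: from
>     VecScatter :: scatter
>     Vec        :: b_local
>     ! Sequential vector holding this rank's values in application order
>     CALL VecCreateSeq(PETSC_COMM_SELF, Lsize, b_local, ierr)
>     ! Global PETSc indices wanted by this rank, in application order
>     CALL ISCreateGeneral(PETSC_COMM_WORLD, Lsize, glb_index, PETSC_COPY_VALUES, from, ierr)
>     ! PETSC_NULL_IS for the destination indices means all of b_local, in order
>     CALL VecScatterCreate(b, from, b_local, PETSC_NULL_IS, scatter, ierr)
>     CALL VecScatterBegin(scatter, b, b_local, INSERT_VALUES, SCATTER_FORWARD, ierr)
>     CALL VecScatterEnd(scatter, b, b_local, INSERT_VALUES, SCATTER_FORWARD, ierr)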
> 
> There are many examples in the repository. In the discretization of a 
> PDE, we first divide the domain,
> then number up each piece, then assemble the linear algebra objects.
> 
>    Thanks,
> 
>        Matt
> 
>     The output is:
> 
>        process             0 : xx_v(            1 ) =     0.000000000000000
>        process             0 : xx_v(            2 ) =     1.000000000000000
>        process             0 : xx_v(            3 ) =     2.000000000000000
>        process             1 : xx_v(            1 ) =     3.000000000000000
>        process             1 : xx_v(            2 ) =     4.000000000000000
>        process             1 : xx_v(            3 ) =     5.000000000000000
> 
>     and I would like to have:
> 
>        process             0 : xx_v(            1 ) =     2.000000000000000
>        process             0 : xx_v(            2 ) =     3.000000000000000
>        process             0 : xx_v(            3 ) =     4.000000000000000
>        process             1 : xx_v(            1 ) =     0.000000000000000
>        process             1 : xx_v(            2 ) =     1.000000000000000
>        process             1 : xx_v(            3 ) =     5.000000000000000
> 
>     How can I do that?
> 
>     program main
>     #include <petsc/finclude/petscksp.h>
>           use petscksp
>           implicit none
> 
>           PetscErrorCode ierr
>           PetscInt  :: Psize = 6
>           PetscInt  :: Lsize
>           PetscMPIInt :: work_size
>           PetscMPIInt :: work_rank
>           Vec :: b
>           PetscInt, allocatable, dimension(:) :: glb_index
>           PetscScalar, allocatable, dimension(:) :: array
>           PetscScalar, pointer :: xx_v(:)
>           integer :: i
>           PetscCount :: csize
> 
>           CALL PetscInitialize(ierr)
> 
>           Lsize = 3
>           csize = Lsize
> 
>           allocate(glb_index(0:Lsize-1), array(0:Lsize-1))
> 
>           CALL MPI_Comm_size(PETSC_COMM_WORLD, work_size, ierr)
>           CALL MPI_Comm_rank(PETSC_COMM_WORLD, work_rank, ierr)
> 
>           ! Each rank owns three cells with non-contiguous global indices
>           if (work_rank == 0) then
>             glb_index(0) = 2
>             glb_index(1) = 3
>             glb_index(2) = 4
>             array(0) = 2
>             array(1) = 3
>             array(2) = 4
>           else if (work_rank == 1) then
>             glb_index(0) = 0
>             glb_index(1) = 1
>             glb_index(2) = 5
>             array(0) = 0
>             array(1) = 1
>             array(2) = 5
>           end if
> 
>           ! Create and fill rhs vector
>           CALL VecCreate(PETSC_COMM_WORLD, b, ierr)
>           CALL VecSetSizes(b, Lsize, Psize, ierr)
>           CALL VecSetType(b, VECMPI, ierr)
> 
>           CALL VecSetPreallocationCOO(b, csize, glb_index, ierr)
>           CALL VecSetValuesCOO(b, array, INSERT_VALUES, ierr)
> 
>           ! Inspect the locally owned values (in PETSc ordering)
>           CALL VecGetArrayReadF90(b, xx_v, ierr)
> 
>           do i=1,Lsize
>             write(*,*) 'process ', work_rank, ': xx_v(',i,') = ', xx_v(i)
>           end do
> 
>           CALL VecRestoreArrayReadF90(b, xx_v, ierr)
> 
>           deallocate(glb_index, array)
>           CALL VecDestroy(b,ierr)
> 
>           CALL PetscFinalize(ierr)
> 
>     end program main
> 
> 
>     On 19/10/2023 17:36, Matthew Knepley wrote:
>      > On Thu, Oct 19, 2023 at 11:33 AM Enrico <degregori at dkrz.de> wrote:
>      >
>      >     The layout is not poor, just the global indices are not
>      >     contiguous; this has nothing to do with the local memory
>      >     layout, which is extremely optimized for different
>      >     architectures. I cannot change the layout anyway because it's
>      >     a climate model with a million lines of code.
>      >
>      >     I don't understand why PETSc is doing all this MPI
>      >     communication under the hood.
>      >
>      >
>      > I don't think we are communicating under the hood.
>      >
>      >     I mean, it is changing the layout of the application and doing
>      >     a lot of communication.
>      >
>      >
>      > We do not create the layout. The user creates the data layout
>      > when they create a vector or matrix.
>      >
>      >     Is there no way to force the same layout and provide info
>      >     about how to do the halo exchange? That way I can have the
>      >     same memory layout, and there is no communication when I fill
>      >     or fetch the vectors and the matrix.
>      >
>      >
>      > Yes, you tell the vector/matrix your data layout when you create it.
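>      >
>      > For example (a sketch), the local size passed to VecSetSizes()
>      > fixes which contiguous block of global rows each rank owns:
>      >
>      >     Vec :: x
>      >     PetscInt :: nlocal
>      >     nlocal = 3
>      >     ! With 2 ranks, rank 0 owns global rows 0-2 and rank 1 owns
>      >     ! rows 3-5; PETSc computes the global size as the sum
>      >     CALL VecCreate(PETSC_COMM_WORLD, x, ierr)
>      >     CALL VecSetSizes(x, nlocal, PETSC_DETERMINE, ierr)
>      >     CALL VecSetFromOptions(x, ierr)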
>      >
>      >    Thanks,
>      >
>      >        Matt
>      >
>      >     Cheers,
>      >     Enrico
>      >
>      >     On 19/10/2023 17:21, Matthew Knepley wrote:
>      >      > On Thu, Oct 19, 2023 at 10:51 AM Enrico <degregori at dkrz.de> wrote:
>      >      >
>      >      >     In the application the storage is contiguous but the
>      >      >     global indexing is not. I would like to use AO as a
>      >      >     translation layer, but I don't understand it.
>      >      >
>      >      >
>      >      > Why would you choose to index differently from your storage?
>      >      >
>      >      >     My case is actually simple even if it is in a large
>      >      >     application. I have
>      >      >
>      >      >     Mat A, Vec b and Vec x
>      >      >
>      >      >     After calling KSPSolve, I use VecGetArrayReadF90 to get
>      >      >     a pointer to the data, and the values are in the wrong
>      >      >     ordering: for example, the first element of the solution
>      >      >     array on process 0 belongs to process 1 in the
>      >      >     application.
>      >      >
>      >      >
>      >      > Again, this seems to be a poor choice of layout. What we
>      >      > typically do is to partition the data into chunks owned by
>      >      > each process first.
>      >      >
>      >      >     Is it at this point that I should use the AO translation
>      >      >     layer? This would be quite bad: it means there is MPI
>      >      >     communication to build Mat A and Vec b, and also to get
>      >      >     the data of Vec x back in the application.
>      >      >
>      >      >
>      >      > If you want to store data that process i updates on
>      >      > process j, this will need communication.
>      >      >
>      >      >     Anyway, I've tried to use AOPetscToApplicationPermuteReal
>      >      >     on the solution array, but it doesn't work as I would
>      >      >     like. Is this function supposed to do MPI communication
>      >      >     between processes and fetch the values of the
>      >      >     application ordering?
>      >      >
>      >      >
>      >      > There is no communication here. That function call just
>      >      > changes one integer into another. If you want to update
>      >      > values on another process, we recommend using VecScatter()
>      >      > or MatSetValues(), both of which take global indices and
>      >      > do communication if necessary.
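>      >      >
>      >      > For instance (a sketch, assuming an AO named ao has already
>      >      > been created), the plain index conversion is purely local:
>      >      >
>      >      >     PetscInt :: n, idx(1)
>      >      >     n = 1
>      >      >     idx(1) = 0                ! a PETSc global index
>      >      >     CALL AOPetscToApplication(ao, n, idx, ierr)
>      >      >     ! idx(1) is now the matching application index; no
>      >      >     ! vector values are moved between processes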
>      >      >
>      >      >    Thanks,
>      >      >
>      >      >      Matt
>      >      >
>      >      >     Cheers,
>      >      >     Enrico
>      >      >
>      >      >     On 19/10/2023 15:25, Matthew Knepley wrote:
>      >      >      > On Thu, Oct 19, 2023 at 8:57 AM Enrico <degregori at dkrz.de> wrote:
>      >      >      >
>      >      >      >     Maybe I wasn't clear enough. I would like to
>      >      >      >     completely get rid of the PETSc ordering because
>      >      >      >     I don't want extra communication between
>      >      >      >     processes to construct the vector and the matrix
>      >      >      >     (I have to fill them every time step because I'm
>      >      >      >     just using the linear solver with a Mat and a Vec
>      >      >      >     data structure). I don't understand how I can do
>      >      >      >     that.
>      >      >      >
>      >      >      >
>      >      >      > Any program you write to do linear algebra will have
>      >      >      > contiguous storage because it is so much faster.
>      >      >      > Contiguous indexing makes sense for contiguous
>      >      >      > storage. If you want to use non-contiguous indexing
>      >      >      > for contiguous storage, you would need some
>      >      >      > translation layer. The AO is such a translation, but
>      >      >      > you could do this any way you want.
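>      >      >      >
>      >      >      > As a minimal sketch of building that translation with
>      >      >      > AOCreateBasic (assuming each rank passes the
>      >      >      > application indices it owns, as in glb_index from the
>      >      >      > reproducer above):
>      >      >      >
>      >      >      >     AO :: ao
>      >      >      >     ! glb_index lists this rank's application indices;
>      >      >      >     ! a null array for the PETSc indices means the
>      >      >      >     ! natural contiguous ordering
>      >      >      >     CALL AOCreateBasic(PETSC_COMM_WORLD, Lsize, glb_index, PETSC_NULL_INTEGER, ao, ierr)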
>      >      >      >
>      >      >      >    Thanks,
>      >      >      >
>      >      >      >       Matt
>      >      >      >
>      >      >      >     My initial idea was to create another global
>      >      >      >     index ordering within my application to use only
>      >      >      >     for the PETSc interface, but then I think that
>      >      >      >     the ghost cells are wrong.
>      >      >      >
>      >      >      >     On 19/10/2023 14:50, Matthew Knepley wrote:
>      >      >      >      > On Thu, Oct 19, 2023 at 6:51 AM Enrico <degregori at dkrz.de> wrote:
>      >      >      >      >
>      >      >      >      >     Hello,
>      >      >      >      >
>      >      >      >      >     if I create an application ordering using
>      >      >      >      >     AOCreateBasic, should I provide the same
>      >      >      >      >     array for const PetscInt myapp[] and const
>      >      >      >      >     PetscInt mypetsc[] in order to get the
>      >      >      >      >     same ordering of the application within
>      >      >      >      >     PETSc?
>      >      >      >      >
>      >      >      >      >
>      >      >      >      > Are you asking if the identity permutation
>      >      >      >      > can be constructed using the same array twice?
>      >      >      >      > Yes.
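>      >      >      >      >
>      >      >      >      > That is (a sketch, with n local application
>      >      >      >      > indices in an assumed array myapp), passing the
>      >      >      >      > same array for both orderings yields the
>      >      >      >      > identity:
>      >      >      >      >
>      >      >      >      >     CALL AOCreateBasic(PETSC_COMM_WORLD, n, myapp, myapp, ao, ierr)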
>      >      >      >      >
>      >      >      >      >     And once I define the ordering so that the
>      >      >      >      >     local vector and matrix are defined in
>      >      >      >      >     PETSc as in my application, how can I use
>      >      >      >      >     it to create the actual vector and matrix?
>      >      >      >      >
>      >      >      >      >
>      >      >      >      > The vectors and matrices do not change. The
>      >      >      >      > AO is a permutation. You can use it to permute
>      >      >      >      > a vector into another order, or to convert one
>      >      >      >      > index to another.
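>      >      >      >      >
>      >      >      >      > For example (a sketch, assuming an AO named ao
>      >      >      >      > built as above, and glb_index and array from
>      >      >      >      > the reproducer), translating application
>      >      >      >      > indices in place before setting values:
>      >      >      >      >
>      >      >      >      >     ! overwrites glb_index with PETSc indices
>      >      >      >      >     CALL AOApplicationToPetsc(ao, Lsize, glb_index, ierr)
>      >      >      >      >     CALL VecSetValues(b, Lsize, glb_index, array, INSERT_VALUES, ierr)
>      >      >      >      >     CALL VecAssemblyBegin(b, ierr)
>      >      >      >      >     CALL VecAssemblyEnd(b, ierr)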
>      >      >      >      >
>      >      >      >      >    Thanks,
>      >      >      >      >
>      >      >      >      >        Matt
>      >      >      >      >
>      >      >      >      >     Thanks in advance for the help.
>      >      >      >      >
>      >      >      >      >     Cheers,
>      >      >      >      >     Enrico
>      >      >      >      >
>      >      >      >      >     On 18/10/2023 13:39, Matthew Knepley wrote:
>      >      >      >      >      > On Wed, Oct 18, 2023 at 5:55 AM Enrico <degregori at dkrz.de> wrote:
>      >      >      >      >      >
>      >      >      >      >      >     Hello,
>      >      >      >      >      >
>      >      >      >      >      >     I'm trying to use PETSc to solve a
>      >      >      >      >      >     linear system in an application.
>      >      >      >      >      >     I'm using the coordinate format to
>      >      >      >      >      >     define the matrix and the vector
>      >      >      >      >      >     (it should work better on GPU, but
>      >      >      >      >      >     at the moment every test is on
>      >      >      >      >      >     CPU). After the call to
>      >      >      >      >      >     VecSetValuesCOO, I've noticed that
>      >      >      >      >      >     the vector is storing the data in a
>      >      >      >      >      >     different way from my application.
>      >      >      >      >      >     For example, with two processes in
>      >      >      >      >      >     the application
>      >      >      >      >      >
>      >      >      >      >      >     process 0 owns cells 2, 3, 4
>      >      >      >      >      >
>      >      >      >      >      >     process 1 owns cells 0, 1, 5
>      >      >      >      >      >
>      >      >      >      >      >     But in the vector data structure
>     of Petsc
>      >      >      >      >      >
>      >      >      >      >      >     process 0 owns cells 0, 1, 2
>      >      >      >      >      >
>      >      >      >      >      >     process 1 owns cells 3, 4, 5
>      >      >      >      >      >
>      >      >      >      >      >     This is in principle not a big
>      >      >      >      >      >     issue, but after solving the linear
>      >      >      >      >      >     system I get the solution vector x
>      >      >      >      >      >     and I want to get the values in the
>      >      >      >      >      >     correct processes. Is there a way
>      >      >      >      >      >     to get vector values from other
>      >      >      >      >      >     processes, or to get a mapping so
>      >      >      >      >      >     that I can do it myself?
>      >      >      >      >      >
>      >      >      >      >      >
>      >      >      >      >      > By definition, PETSc vectors and
>      >      >      >      >      > matrices own contiguous row blocks. If
>      >      >      >      >      > you want to have another global
>      >      >      >      >      > ordering, we support that with
>      >      >      >      >      >
>      >      >      >      >      > https://petsc.org/main/manualpages/AO/
>      >      >      >      >      >
>      >      >      >      >      >    Thanks,
>      >      >      >      >      >
>      >      >      >      >      >       Matt
>      >      >      >      >      >
>      >      >      >      >      >     Cheers,
>      >      >      >      >      >     Enrico Degregori
>      >      >      >      >      >
>      >      >      >      >      >
>      >      >      >      >      >
>      >      >      >      >      > -- 
>      >      >      >      >      > What most experimenters take for granted
>      >      >      >      >      > before they begin their experiments is
>      >      >      >      >      > infinitely more interesting than any
>      >      >      >      >      > results to which their experiments lead.
>      >      >      >      >      > -- Norbert Wiener
>      >      >      >      >      >
>      >      >      >      >      > https://www.cse.buffalo.edu/~knepley/
>      >      >      >      >
>      >      >      >      >
>      >      >      >      >
>      >      >      >      > -- 
>      >      >      >      > What most experimenters take for granted before
>      >      >      >      > they begin their experiments is infinitely more
>      >      >      >      > interesting than any results to which their
>      >      >      >      > experiments lead.
>      >      >      >      > -- Norbert Wiener
>      >      >      >      >
>      >      >      >      > https://www.cse.buffalo.edu/~knepley/
>      >      >      >
>      >      >      >
>      >      >      >
>      >      >      > -- 
>      >      >      > What most experimenters take for granted before they
>      >      >      > begin their experiments is infinitely more interesting
>      >      >      > than any results to which their experiments lead.
>      >      >      > -- Norbert Wiener
>      >      >      >
>      >      >      > https://www.cse.buffalo.edu/~knepley/
>      >      >
>      >      >
>      >      >
>      >      > -- 
>      >      > What most experimenters take for granted before they begin
>      >      > their experiments is infinitely more interesting than any
>      >      > results to which their experiments lead.
>      >      > -- Norbert Wiener
>      >      >
>      >      > https://www.cse.buffalo.edu/~knepley/
>      >
>      >
>      >
>      > -- 
>      > What most experimenters take for granted before they begin their
>      > experiments is infinitely more interesting than any results to which
>      > their experiments lead.
>      > -- Norbert Wiener
>      >
>      > https://www.cse.buffalo.edu/~knepley/
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/

