[petsc-users] Coordinate format internal reordering
Enrico
degregori at dkrz.de
Thu Oct 19 10:33:16 CDT 2023
The layout is not poor; it is just that the global indices are not
contiguous. This has nothing to do with the local memory layout, which is
highly optimized for different architectures. I cannot change the layout
anyway, because it's a climate model with a million lines of code.
I don't understand why PETSc is doing all this MPI communication under
the hood: it is changing the layout of the application and doing a lot of
communication. Is there no way to force the same layout and provide the
information needed for the halo exchange? That way I could keep the same
memory layout, and there would be no communication when I fill or fetch
the vectors and the matrix.
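
(One existing PETSc mechanism for the halo-exchange half of this question
is VecCreateGhost: ownership stays contiguous, but the caller supplies the
global indices of the ghost entries. A minimal sketch, where nlocal,
nghost, and ghost_global_idx are placeholder names and error checking is
omitted:

    Vec v;
    /* nlocal owned entries; nghost halo entries whose global indices
       the application supplies in ghost_global_idx[] */
    VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DECIDE, nghost,
                   ghost_global_idx, &v);
    /* after filling the owned entries, refresh the halo copies */
    VecGhostUpdateBegin(v, INSERT_VALUES, SCATTER_FORWARD);
    VecGhostUpdateEnd(v, INSERT_VALUES, SCATTER_FORWARD);

Note that this does not change the owned global numbering, which remains
contiguous.)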
Cheers,
Enrico
On 19/10/2023 17:21, Matthew Knepley wrote:
> On Thu, Oct 19, 2023 at 10:51 AM Enrico <degregori at dkrz.de> wrote:
>
> In the application the storage is contiguous, but the global indexing is
> not. I would like to use an AO as a translation layer, but I don't
> understand how to use it.
>
>
> Why would you choose to index differently from your storage?
>
> My case is actually simple, even though it is inside a large application. I have
>
> Mat A, Vec b and Vec x
>
> After calling KSPSolve, I use VecGetArrayReadF90 to get a pointer to the
> data, and the values are in the wrong ordering: for example, the first
> element of the solution array on process 0 belongs to process 1 in the
> application.
>
>
> Again, this seems to be a poor choice of layout. What we typically do
> is to partition the data into chunks owned by each process first.
>
> Is it at this point that I should use the AO translation layer? That
> would be quite bad: it would mean MPI communication to build Mat A and
> Vec b, and again to get the data of Vec x back into the application.
>
>
> If you want to store data that process i updates on process j, this will
> need communication.
>
> Anyway, I've tried to use AOPetscToApplicationPermuteReal on the
> solution array, but it doesn't work as I would like. Is this function
> supposed to do MPI communication between processes and fetch the values
> in the application ordering?
>
>
> There is no communication here. That function call just changes one
> integer into another. If you want to update values on another process,
> we recommend using VecScatter() or MatSetValues(), both of which take
> global indices and do communication if necessary.
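>
> (A minimal sketch of the VecScatter approach, assuming the application
> knows the PETSc global indices of the nloc entries it wants and has put
> them in app_global_idx[]; both names are placeholders, error checking
> omitted:
>
>     IS         is;
>     Vec        xlocal;
>     VecScatter scatter;
>
>     /* local vector that will receive the entries, in the order the
>        application wants them */
>     VecCreateSeq(PETSC_COMM_SELF, nloc, &xlocal);
>     ISCreateGeneral(PETSC_COMM_SELF, nloc, app_global_idx,
>                     PETSC_COPY_VALUES, &is);
>     /* NULL destination indices: fill xlocal in order */
>     VecScatterCreate(x, is, xlocal, NULL, &scatter);
>     VecScatterBegin(scatter, x, xlocal, INSERT_VALUES, SCATTER_FORWARD);
>     VecScatterEnd(scatter, x, xlocal, INSERT_VALUES, SCATTER_FORWARD);
>
> The scatter can be created once and reused every time step.)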
>
> Thanks,
>
> Matt
>
> Cheers,
> Enrico
>
> On 19/10/2023 15:25, Matthew Knepley wrote:
> > On Thu, Oct 19, 2023 at 8:57 AM Enrico <degregori at dkrz.de> wrote:
> >
> > Maybe I wasn't clear enough. I would like to completely get rid of the
> > PETSc ordering, because I don't want extra communication between
> > processes to construct the vector and the matrix (I have to fill them
> > every time step, since I'm just using the linear solver with Mat and
> > Vec data structures). I don't understand how I can do that.
> >
> >
> > Any program you write to do linear algebra will have contiguous
> > storage, because it is so much faster. Contiguous indexing makes sense
> > for contiguous storage. If you want to use non-contiguous indexing for
> > contiguous storage, you would need some translation layer. The AO is
> > such a translation, but you could do this any way you want.
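> >
> > (A minimal sketch of such a translation with an AO, assuming app_idx[]
> > holds this process's global indices in the application's ordering;
> > names are placeholders, error checking omitted:
> >
> >     AO ao;
> >
> >     /* pair the application indices with PETSc's natural ordering
> >        0,1,2,...; passing NULL for mypetsc means exactly that */
> >     AOCreateBasic(PETSC_COMM_WORLD, nloc, app_idx, NULL, &ao);
> >
> >     /* translate an array of application indices in place, then use
> >        the result with MatSetValues()/VecSetValues() */
> >     AOApplicationToPetsc(ao, nrows, rows);
> >
> > The AO only rewrites indices; by itself it moves no data.)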
> >
> > Thanks,
> >
> > Matt
> >
> > My initial idea was to create another global index ordering within my
> > application, to be used only for the PETSc interface, but then I think
> > the ghost cells would be wrong.
> >
> > On 19/10/2023 14:50, Matthew Knepley wrote:
> > > On Thu, Oct 19, 2023 at 6:51 AM Enrico <degregori at dkrz.de> wrote:
> > >
> > > Hello,
> > >
> > > if I create an application ordering using AOCreateBasic, should I
> > > provide the same array for const PetscInt myapp[] and const
> > > PetscInt mypetsc[] in order to get the same ordering as the
> > > application within PETSc?
> > >
> > >
> > > Are you asking if the identity permutation can be constructed using
> > > the same array twice? Yes.
> > >
> > > And once I define the ordering so that the local vector and matrix
> > > are defined in PETSc as in my application, how can I use it to
> > > create the actual vector and matrix?
> > >
> > >
> > > The vectors and matrices do not change. The AO is a permutation.
> > > You can use it to permute a vector into another order, or to
> > > convert one index to another.
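> > >
> > > (A minimal sketch of both uses, with idx[] a placeholder array of
> > > this process's indices; error checking omitted:
> > >
> > >     AO ao;
> > >
> > >     /* same array for myapp and mypetsc: the identity permutation */
> > >     AOCreateBasic(PETSC_COMM_WORLD, n, idx, idx, &ao);
> > >
> > >     /* rewrite an array of PETSc indices as application indices,
> > >        in place; with the identity AO this is of course a no-op */
> > >     AOPetscToApplication(ao, m, indices);
> > >
> > > With a non-identity AO the same two calls translate between the
> > > two numberings.)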
> > >
> > > Thanks,
> > >
> > > Matt
> > >
> > > Thanks in advance for the help.
> > >
> > > Cheers,
> > > Enrico
> > >
> > > On 18/10/2023 13:39, Matthew Knepley wrote:
> > > > On Wed, Oct 18, 2023 at 5:55 AM Enrico <degregori at dkrz.de> wrote:
> > > >
> > > > Hello,
> > > >
> > > > I'm trying to use PETSc to solve a linear system in an
> > > > application. I'm using the coordinate format to define the matrix
> > > > and the vector (it should work better on GPU, but at the moment
> > > > every test is on CPU). After the call to VecSetValuesCOO, I've
> > > > noticed that the vector is storing the data in a different way
> > > > from my application. For example, with two processes in the
> > > > application
> > > >
> > > > process 0 owns cells 2, 3, 4
> > > >
> > > > process 1 owns cells 0, 1, 5
> > > >
> > > > But in the vector data structure of PETSc
> > > >
> > > > process 0 owns cells 0, 1, 2
> > > >
> > > > process 1 owns cells 3, 4, 5
> > > >
> > > > This is in principle not a big issue, but after solving the
> > > > linear system I get the solution vector x, and I want to get the
> > > > values in the correct processes. Is there a way to get vector
> > > > values from other processes, or to get a mapping so that I can
> > > > do it myself?
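> > > >
> > > > (For reference, a minimal sketch of the COO path being described,
> > > > assuming each process passes the global indices of the cells it
> > > > owns in the application, e.g. {2, 3, 4} on process 0; error
> > > > checking omitted:
> > > >
> > > >     Vec         b;
> > > >     PetscInt    coo_i[3] = {2, 3, 4}; /* application's cells */
> > > >     PetscScalar vals[3];              /* fill with cell values */
> > > >
> > > >     VecCreate(PETSC_COMM_WORLD, &b);
> > > >     VecSetSizes(b, 3, 6);
> > > >     VecSetFromOptions(b);
> > > >     VecSetPreallocationCOO(b, 3, coo_i);
> > > >     VecSetValuesCOO(b, vals, INSERT_VALUES);
> > > >
> > > > The COO indices are global, so entry 2 ends up on whichever
> > > > process owns row 2 in PETSc's contiguous layout, not necessarily
> > > > on the process that set it.)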
> > > >
> > > >
> > > > By definition, PETSc vectors and matrices own contiguous row
> > > > blocks. If you want to have another, global ordering, we support
> > > > that with https://petsc.org/main/manualpages/AO/
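> > > >
> > > > (The contiguous ownership can be queried directly; a short
> > > > sketch, error checking omitted:
> > > >
> > > >     PetscInt rstart, rend;
> > > >     /* this process owns the global rows [rstart, rend) */
> > > >     VecGetOwnershipRange(x, &rstart, &rend);
> > > >
> > > > Everything outside that range lives on another process.)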
> > > >
> > > > Thanks,
> > > >
> > > > Matt
> > > >
> > > > Cheers,
> > > > Enrico Degregori
> > > >
> > > >
> > > >
> > > > --
> > > > What most experimenters take for granted before they begin their
> > > > experiments is infinitely more interesting than any results to
> > > > which their experiments lead.
> > > > -- Norbert Wiener
> > > >
> > > > https://www.cse.buffalo.edu/~knepley/
> > >
> > >
> > >
> > > --
> > > What most experimenters take for granted before they begin their
> > > experiments is infinitely more interesting than any results to which
> > > their experiments lead.
> > > -- Norbert Wiener
> > >
> > > https://www.cse.buffalo.edu/~knepley/
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/