[petsc-users] Slepc, shell matrix, parallel, halo exchange

feng wang snailsoar at hotmail.com
Wed Sep 21 09:35:32 CDT 2022


Hi Jose,

For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details I am not quite sure about.

If I understand it correctly, in MatMult(Mat m, Vec x, Vec y), Vec x is a normal parallel vector and does not contain halo values. Suppose I create an auxiliary ghost vector x_g and assign the values of x to x_g. The halo values of each partition are not assigned at this stage.

But if I call VecGhostUpdateBegin/End(x_g, INSERT_VALUES, SCATTER_FORWARD), this will fill the halo cells of x_g on each partition. Then x_g has its local and halo cells assigned correctly and I can use x_g to do my computation. Is this what you mean?
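A minimal sketch of what I have in mind is below. ShellCtx and MyMatMult are made-up names, and the PetscCall error checking assumes a recent PETSc; this is only an outline, not my actual implementation:

```c
#include <petscmat.h>

/* Hypothetical shell-matrix context holding the auxiliary ghost vector,
   created once (e.g. with VecCreateGhostBlock) and reused in every MatMult. */
typedef struct {
  Vec x_g;
} ShellCtx;

/* Sketch of the shell MatMult: copy the incoming parallel vector into the
   ghost vector, exchange the halo, then compute with the local form. */
static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  ShellCtx *ctx;
  Vec       x_loc;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  PetscCall(VecCopy(x, ctx->x_g)); /* copies the owned entries only */
  PetscCall(VecGhostUpdateBegin(ctx->x_g, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(ctx->x_g, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostGetLocalForm(ctx->x_g, &x_loc));
  /* ... apply the operator using x_loc (owned + halo entries), write y ... */
  PetscCall(VecGhostRestoreLocalForm(ctx->x_g, &x_loc));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```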

Thanks,
Feng

________________________________
From: Jose E. Roman <jroman at dsic.upv.es>
Sent: 21 September 2022 13:07
To: feng wang <snailsoar at hotmail.com>
Cc: Matthew Knepley <knepley at gmail.com>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange



> On 21 Sep 2022, at 14:47, feng wang <snailsoar at hotmail.com> wrote:
>
> Thanks Jose, I will try this and will come back to this thread if I have any issue.
>
> Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards?

Eigenvectors are stored in parallel vectors, which are used in subsequent parallel computations in most applications. If for some reason you need to gather them in a single MPI process, you can use e.g. VecScatterCreateToZero().
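A rough sketch of gathering an eigenvector on rank 0 (the vector name xr and the function wrapper are illustrative, assumed to come from EPSGetEigenpair()):

```c
#include <petscvec.h>

/* Sketch: gather a distributed eigenvector xr onto rank 0. */
static PetscErrorCode GatherEigenvectorOnRankZero(Vec xr)
{
  VecScatter scat;
  Vec        xr_seq; /* sequential vector on rank 0, length 0 elsewhere */

  PetscFunctionBeginUser;
  PetscCall(VecScatterCreateToZero(xr, &scat, &xr_seq));
  PetscCall(VecScatterBegin(scat, xr, xr_seq, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(scat, xr, xr_seq, INSERT_VALUES, SCATTER_FORWARD));
  /* ... on rank 0, xr_seq now holds the full eigenvector ... */
  PetscCall(VecScatterDestroy(&scat));
  PetscCall(VecDestroy(&xr_seq));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```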

>
> Thanks,
> Feng
>
> From: Jose E. Roman <jroman at dsic.upv.es>
> Sent: 21 September 2022 12:34
> To: feng wang <snailsoar at hotmail.com>
> Cc: Matthew Knepley <knepley at gmail.com>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
>
> If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work.
>
> A simpler solution is that you store a ghost vector in the context of your shell matrix, and then in MatMult() you receive a regular parallel vector x, then update the ghost points using the auxiliary ghost vector, do the computation and store the result in the regular parallel vector y.
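[As a sketch of how this wiring might look; ShellCtx, MyMatMult, nloc and nglob are placeholder names, not from this thread:]

```c
#include <slepceps.h>

/* Sketch: create the shell matrix with a user context holding the
   auxiliary ghost vector, register the MatMult, and hand it to SLEPc.
   nloc/nglob are the local/global problem sizes, set elsewhere. */
Mat      A;
EPS      eps;
ShellCtx ctx; /* contains the ghost vector, created beforehand */

PetscCall(MatCreateShell(PETSC_COMM_WORLD, nloc, nloc, nglob, nglob, &ctx, &A));
PetscCall(MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult));

PetscCall(EPSCreate(PETSC_COMM_WORLD, &eps));
PetscCall(EPSSetOperators(eps, A, NULL));
PetscCall(EPSSetProblemType(eps, EPS_NHEP));
PetscCall(EPSSetFromOptions(eps));
PetscCall(EPSSolve(eps));
```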
>
> Jose
>
>
> > On 21 Sep 2022, at 14:09, feng wang <snailsoar at hotmail.com> wrote:
> >
> > Thanks for your reply.
> >
> > For GMRES, I create a ghost vector and give it to KSPSolve. For SLEPc, it only takes the shell matrix via EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m, Vec x, Vec y); how does it know Vec x is a ghost vector and how many ghost cells there are?
> >
> > Thanks,
> > Feng
> > From: Matthew Knepley <knepley at gmail.com>
> > Sent: 21 September 2022 11:58
> > To: feng wang <snailsoar at hotmail.com>
> > Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
> >
> > On Wed, Sep 21, 2022 at 7:41 AM feng wang <snailsoar at hotmail.com> wrote:
> > Hello,
> >
> > I am using SLEPc with a shell matrix. The sequential version seems to be working and now I am trying to make it run in parallel.
> >
> > The partition of the domain is done, but I am not sure how to do the halo exchange in the shell matrix in SLEPc. I have a parallel version of matrix-free GMRES in my code with PETSc. I was using VecCreateGhostBlock to create a vector with ghost cells, and then used VecGhostUpdateBegin/End for the halo exchange in the shell matrix. Would this be the same for SLEPc?
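[For reference, the ghost-vector creation mentioned here can be sketched as follows; bs, nloc, nghost and ghosts are assumed variables describing the block size, local size, and the global block indices of the halo cells:]

```c
#include <petscvec.h>

/* Sketch: create a block-ghosted parallel vector and exchange the halo.
   nloc is the number of locally owned entries (a multiple of bs), and
   ghosts[] holds the nghost global block indices of the halo cells. */
Vec x_g;
PetscCall(VecCreateGhostBlock(PETSC_COMM_WORLD, bs, nloc, PETSC_DETERMINE,
                              nghost, ghosts, &x_g));
/* ... set the owned entries of x_g ... */
PetscCall(VecGhostUpdateBegin(x_g, INSERT_VALUES, SCATTER_FORWARD));
PetscCall(VecGhostUpdateEnd(x_g, INSERT_VALUES, SCATTER_FORWARD));
```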
> >
> > That will be enough for MatMult(). You would also have to use a SLEPc EPS solver that needs only MatMult().
> >
> >   Thanks,
> >
> >      Matt
> >
> > Thanks,
> > Feng
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/


