[petsc-users] Slepc, shell matrix, parallel, halo exchange
feng wang
snailsoar at hotmail.com
Wed Sep 21 07:47:38 CDT 2022
Thanks Jose, I will try this and will come back to this thread if I have any issue.
Also, for EPSGetEigenpair, I assume each rank gets its own portion of the eigenvector and I need to gather the pieces afterwards?
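(A minimal sketch of that gathering step, assuming a standard EPS solve; eps, A and the index 0 below are placeholders, and VecScatterCreateToAll is just one way to collect the local pieces onto every rank; error checking omitted.)

    Vec         xr, xall;
    VecScatter  scat;
    PetscScalar kr;

    MatCreateVecs(A, &xr, NULL);                  /* parallel vector with A's row layout */
    EPSGetEigenpair(eps, 0, &kr, NULL, xr, NULL); /* each rank holds its local portion of xr */
    VecScatterCreateToAll(xr, &scat, &xall);      /* xall: sequential copy of the full vector on every rank */
    VecScatterBegin(scat, xr, xall, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterEnd(scat, xr, xall, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterDestroy(&scat);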
Thanks,
Feng
________________________________
From: Jose E. Roman <jroman at dsic.upv.es>
Sent: 21 September 2022 12:34
To: feng wang <snailsoar at hotmail.com>
Cc: Matthew Knepley <knepley at gmail.com>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work.
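(A minimal sketch of this first option, assuming the block size, local size and ghost index list come from your own partitioning; MyCtx and its fields are placeholders and error checking is omitted.)

    static PetscErrorCode MyCreateVecs(Mat A, Vec *right, Vec *left)
    {
      MyCtx *ctx;

      PetscFunctionBeginUser;
      MatShellGetContext(A, &ctx);
      if (right) VecCreateGhostBlock(PETSC_COMM_WORLD, ctx->bs, ctx->nlocal, PETSC_DECIDE,
                                     ctx->nghost, ctx->ghosts, right);
      if (left)  VecCreateGhostBlock(PETSC_COMM_WORLD, ctx->bs, ctx->nlocal, PETSC_DECIDE,
                                     ctx->nghost, ctx->ghosts, left);
      PetscFunctionReturn(0);
    }

    /* after MatCreateShell(): */
    MatShellSetOperation(A, MATOP_CREATE_VECS, (void (*)(void))MyCreateVecs);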
A simpler solution is to store a ghost vector in the context of your shell matrix. Then, in MatMult(), you receive a regular parallel vector x, update the ghost points using the auxiliary ghost vector, do the computation, and store the result in the regular parallel vector y.
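(A minimal sketch of this second approach, where xwork is a ghost vector created once, e.g. with VecCreateGhostBlock, and kept in the shell-matrix context; MyCtx and the names are placeholders and error checking is omitted.)

    static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
    {
      MyCtx             *ctx;
      Vec                xlocal;
      const PetscScalar *xv;
      PetscScalar       *yv;

      PetscFunctionBeginUser;
      MatShellGetContext(A, &ctx);
      VecCopy(x, ctx->xwork);                                    /* copy the owned entries */
      VecGhostUpdateBegin(ctx->xwork, INSERT_VALUES, SCATTER_FORWARD);
      VecGhostUpdateEnd(ctx->xwork, INSERT_VALUES, SCATTER_FORWARD);
      VecGhostGetLocalForm(ctx->xwork, &xlocal);                 /* owned + ghost entries */
      VecGetArrayRead(xlocal, &xv);
      VecGetArray(y, &yv);
      /* ... matrix-free product using xv (with halo), writing the owned result into yv ... */
      VecRestoreArray(y, &yv);
      VecRestoreArrayRead(xlocal, &xv);
      VecGhostRestoreLocalForm(ctx->xwork, &xlocal);
      PetscFunctionReturn(0);
    }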
Jose
> El 21 sept 2022, a las 14:09, feng wang <snailsoar at hotmail.com> escribió:
>
> Thanks for your reply.
>
> For GMRES, I create a ghost vector and give it to KSPSolve. For SLEPc, it only takes the shell matrix through EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m, Vec x, Vec y); how does it know that Vec x is a ghost vector and how many ghost cells there are?
>
> Thanks,
> Feng
> From: Matthew Knepley <knepley at gmail.com>
> Sent: 21 September 2022 11:58
> To: feng wang <snailsoar at hotmail.com>
> Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
>
> On Wed, Sep 21, 2022 at 7:41 AM feng wang <snailsoar at hotmail.com> wrote:
> Hello,
>
> I am using SLEPc with a shell matrix. The sequential version seems to be working, and now I am trying to make it run in parallel.
>
> The partition of the domain is done, but I am not sure how to do the halo exchange in the shell matrix in SLEPc. I have a parallel version of matrix-free GMRES in my code with PETSc. There I used VecCreateGhostBlock to create a vector with ghost cells, and then VecGhostUpdateBegin/End for the halo exchange in the shell matrix; would this be the same for SLEPc?
>
> That will be enough for the MatMult(). You would also have to use a SLEPc EPS that only needs MatMult().
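(A minimal sketch of such a setup; the default Krylov-Schur solver on a standard problem only applies the operator through MatMult(). Here nlocal, ctx and MyMatMult are placeholders and error checking is omitted.)

    Mat A;
    EPS eps;

    MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE, &ctx, &A);
    MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult);

    EPSCreate(PETSC_COMM_WORLD, &eps);
    EPSSetOperators(eps, A, NULL);
    EPSSetProblemType(eps, EPS_NHEP);   /* standard non-Hermitian problem */
    EPSSetFromOptions(eps);
    EPSSolve(eps);

(Note that targeting interior eigenvalues with a shift-and-invert spectral transformation would require linear solves with the shell matrix, which needs more than MatMult().)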
>
> Thanks,
>
> Matt
>
> Thanks,
> Feng
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/