[petsc-users] Slepc, shell matrix, parallel, halo exchange

feng wang snailsoar at hotmail.com
Wed Sep 21 07:09:43 CDT 2022


Thanks for your reply.

For GMRES, I create a ghost vector and give it to KSPSolve. For SLEPc, it only takes the shell matrix through EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m, Vec x, Vec y); how does it know that Vec x is a ghost vector, and how many ghost cells there are?
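[In the common pattern, the shell MatMult() does not need to know: SLEPc typically hands it ordinary distributed vectors (created via MatCreateVecs() on the shell matrix), so the usual approach is to keep a user-owned ghosted work vector in the shell context and copy into it before exchanging halos. A minimal sketch, with hypothetical names (ShellCtx, MyMatMult) and the operator application itself elided:

#include <petscmat.h>

/* Hypothetical shell-matrix context holding a ghosted work vector
   created once with VecCreateGhostBlock(). */
typedef struct {
  Vec xg;
} ShellCtx;

static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  ShellCtx *ctx;
  Vec       xloc;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  /* x is a plain distributed Vec; its parallel layout matches the
     ghosted vector's global form, so VecCopy fills the owned entries */
  PetscCall(VecCopy(x, ctx->xg));
  /* halo exchange: pull ghost-cell values from the owning ranks */
  PetscCall(VecGhostUpdateBegin(ctx->xg, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(ctx->xg, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostGetLocalForm(ctx->xg, &xloc));
  /* ... apply the matrix-free operator using the owned + ghost entries
     of xloc and write the result into y (user routine, not shown) ... */
  PetscCall(VecGhostRestoreLocalForm(ctx->xg, &xloc));
  PetscFunctionReturn(0);
}
]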

Thanks,
Feng
________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: 21 September 2022 11:58
To: feng wang <snailsoar at hotmail.com>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

On Wed, Sep 21, 2022 at 7:41 AM feng wang <snailsoar at hotmail.com> wrote:
Hello,

I am using SLEPc with a shell matrix. The sequential version seems to be working, and now I am trying to make it run in parallel.

The partition of the domain is done, but I am not sure how to do the halo exchange in the shell matrix in SLEPc. I have a parallel version of matrix-free GMRES in my code with PETSc. There I was using VecCreateGhostBlock to create a vector with ghost cells, and then VecGhostUpdateBegin/End for the halo exchange in the shell matrix. Would this be the same for SLEPc?
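[A minimal sketch of that setup, assuming placeholder partition data: bs unknowns per cell, nOwned owned cells, nGhost halo cells, and ghostBlocks[] holding the global block indices of the halo cells. The resulting xg can serve as the work vector inside the shell MatMult(), as in the sketch earlier in the thread:

Vec       xg;
PetscInt  bs = 5;          /* unknowns per cell (example value) */
PetscInt  nOwned, nGhost;  /* from the user's partition (placeholders) */
PetscInt *ghostBlocks;     /* global block indices of the halo cells */
/* ... fill nOwned, nGhost, ghostBlocks from the mesh partition ... */
PetscCall(VecCreateGhostBlock(PETSC_COMM_WORLD, bs, bs * nOwned, PETSC_DECIDE,
                              nGhost, ghostBlocks, &xg));
]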

That will be enough for the MatMult(). You would also have to use a SLEPc EPS type that needs only MatMult().
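[A minimal sketch of that combination, reusing the hypothetical MyMatMult and ShellCtx from the earlier sketch; the local size is a placeholder. For a standard eigenproblem with the default spectral transformation, the default Krylov-Schur solver applies the operator only through MatMult():

#include <slepceps.h>

Mat      A;
EPS      eps;
ShellCtx shellctx;  /* hypothetical context from the earlier sketch */
PetscInt nLocal;    /* locally owned unknowns, i.e. bs * nOwned */
PetscCall(MatCreateShell(PETSC_COMM_WORLD, nLocal, nLocal,
                         PETSC_DETERMINE, PETSC_DETERMINE, &shellctx, &A));
PetscCall(MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult));
PetscCall(EPSCreate(PETSC_COMM_WORLD, &eps));
PetscCall(EPSSetOperators(eps, A, NULL));  /* standard problem A x = lambda x */
PetscCall(EPSSetProblemType(eps, EPS_NHEP));
PetscCall(EPSSetFromOptions(eps));
PetscCall(EPSSolve(eps));
]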

  Thanks,

     Matt

Thanks,
Feng




--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/