[petsc-users] Efficient way to access and use entries of an MPI vector with ghost values
Marco Tiberga
M.Tiberga at tudelft.nl
Tue Dec 18 09:36:56 CST 2018
Dear all,
We are using PETSc in our (Fortran) CFD code, which we have recently parallelized.
We need to access the values stored in the solution vector to assemble the matrix of the linear system. In particular, we need the values on the local elements, plus those on the first-neighbor elements, which may be owned by other processes.
Hence, we defined our solution vector as an MPI Vec with ghost values.
At the moment, to access all the required values, at each time step we update the ghost values, then call VecGhostGetLocalForm, and finally VecGetArrayReadF90.
Since all these procedures are collective, we store the retrieved values in a local Fortran array (of length n_local + n_ghosts) and then proceed with the parallel matrix assembly.
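For clarity, here is a minimal, self-contained sketch of the workflow I am describing, loosely modeled on the PETSc ghosted-vector tutorials; the toy layout, sizes, and variable names are only illustrative assumptions, not our actual code:

  program ghost_demo
#include <petsc/finclude/petscvec.h>
    use petscvec
    implicit none

    Vec                  :: x, xlocal
    PetscInt             :: nlocal, nghost
    PetscInt             :: ghosts(2)
    PetscScalar          :: one
    PetscScalar, pointer :: xv(:)
    PetscMPIInt          :: rank, nproc
    PetscErrorCode       :: ierr

    call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
    call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
    call MPI_Comm_size(PETSC_COMM_WORLD, nproc, ierr)

    ! Toy layout (illustrative only): 4 owned entries per rank; each rank
    ! ghosts the first entry of its left and right neighbours (periodic).
    nlocal    = 4
    nghost    = 2
    ghosts(1) = mod(4*(rank+1),         4*nproc)
    ghosts(2) = mod(4*(rank-1)+4*nproc, 4*nproc)

    call VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, &
                        nghost, ghosts, x, ierr)
    one = 1.0
    call VecSet(x, one, ierr)

    ! Collective: scatter the owners' values into the ghost slots.
    call VecGhostUpdateBegin(x, INSERT_VALUES, SCATTER_FORWARD, ierr)
    call VecGhostUpdateEnd(x, INSERT_VALUES, SCATTER_FORWARD, ierr)

    ! The local form is a sequential Vec of length nlocal + nghost;
    ! its array holds the owned values followed by the ghost values.
    call VecGhostGetLocalForm(x, xlocal, ierr)
    call VecGetArrayReadF90(xlocal, xv, ierr)
    ! ... copy into / use xv(1:nlocal+nghost) during matrix assembly ...
    call VecRestoreArrayReadF90(xlocal, xv, ierr)
    call VecGhostRestoreLocalForm(x, xlocal, ierr)

    call VecDestroy(x, ierr)
    call PetscFinalize(ierr)
  end program ghost_demo

(Run with at least two MPI ranks, e.g. mpiexec -n 2, otherwise the toy ghost indices just point back at the rank's own entries.)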
The code works perfectly, so I believe we are doing things correctly.
However, I was wondering whether there is a more efficient way to access the local values of a ghosted vector; something non-collective, so that we could access the values on demand while assembling the matrix.
That way, we could avoid storing the values in a separate Fortran array, thus saving memory.
A less important issue, but one that puzzles me: why is VecGetArrayReadF90 collective, while its C counterpart VecGetArrayRead is not? (The same holds for VecRestoreArrayReadF90 and VecRestoreArrayRead.)
Thanks a lot for your time and help.
Best regards,
Marco Tiberga
PhD candidate
Delft University of Technology
Faculty of Applied Sciences
Radiation Science & Technology Department
Mekelweg 15, 2629 JB Delft, The Netherlands
E-Mail: m.tiberga at tudelft.nl
Website: http://www.nera.rst.tudelft.nl/