[petsc-users] Choosing VecScatter Method in Matrix-Vector Product
Jed Brown
jed at jedbrown.org
Wed Jan 22 11:36:02 CST 2020
Stefano Zampini <stefano.zampini at gmail.com> writes:
>> On Jan 22, 2020, at 6:11 PM, Felix Huber <st107539 at stud.uni-stuttgart.de> wrote:
>>
>> Hello,
>>
>> I am currently investigating why our code does not show the expected weak scaling behaviour in a CG solver. Therefore I wanted to try out different communication methods for the VecScatter in the matrix-vector product. However, it seems like PETSc (version 3.7.6) always chooses either MPI_Alltoallv or MPI_Alltoallw when I pass different options via the PETSC_OPTIONS environment variable. Does anybody know why this doesn't work as I expected?
>>
>> The matrix is an MPIAIJ matrix created by a finite element discretization of a 3D Laplacian, so it only communicates with 'neighboring' MPI ranks. Not sure if it helps, but the code is run on a Cray XC40.
>>
>> I tried the `ssend`, `rsend`, `sendfirst`, and `reproduce` options from https://www.mcs.anl.gov/petsc/petsc-3.7/docs/manualpages/Vec/VecScatterCreate.html, as well as no options at all, all of which result in MPI_Alltoallv being used. When combined with `nopack`, the communication uses MPI_Alltoallw.
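
For reference, those keywords presumably correspond to the -vecscatter_ssend,
-vecscatter_rsend, etc. options listed on that man page, which PETSc reads when
the scatter context is created. Below is a minimal standalone sketch (not taken
from the code discussed here; error checking omitted) that builds a scatter
which must communicate between ranks and then views it, so PETSc reports which
implementation the options actually selected:

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec        xpar, xloc;
  IS         ix;
  VecScatter ctx;
  PetscInt   n = 8, N, rstart, rend;

  PetscInitialize(&argc, &argv, NULL, NULL); /* also reads PETSC_OPTIONS */

  VecCreateMPI(PETSC_COMM_WORLD, n, PETSC_DECIDE, &xpar);
  VecSet(xpar, 1.0);
  VecGetSize(xpar, &N);
  VecGetOwnershipRange(xpar, &rstart, &rend);
  VecCreateSeq(PETSC_COMM_SELF, n, &xloc);

  /* each rank requests the block owned by the next rank (wrapping around),
     so the scatter has to communicate; the -vecscatter_* options are
     consulted when the scatter is created */
  ISCreateStride(PETSC_COMM_SELF, n, rend % N, 1, &ix);
  VecScatterCreate(xpar, ix, xloc, NULL, &ctx);

  VecScatterBegin(ctx, xpar, xloc, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(ctx, xpar, xloc, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterView(ctx, PETSC_VIEWER_STDOUT_WORLD); /* reports the chosen method */

  ISDestroy(&ix);
  VecScatterDestroy(&ctx);
  VecDestroy(&xloc);
  VecDestroy(&xpar);
  PetscFinalize();
  return 0;
}

Running this with, say, PETSC_OPTIONS="-vecscatter_ssend" and comparing the
VecScatterView output across option choices should show whether the options are
being picked up at all.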
>>
>> Best regards,
>> Felix
>>
>
> 3.7.6 is quite an old version. You should consider upgrading.
VecScatter has been greatly refactored (and the default implementation
is entirely new) since 3.7. Anyway, I'm curious about your
configuration and how you determined that MPI_Alltoallv/MPI_Alltoallw is
being used. This has never been a default code path, so I suspect
something in your environment or code is making this happen.
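
One way to check at the MPI level, independent of any PETSc internals, would be
a small PMPI interposer that counts calls to MPI_Alltoallv; a rough sketch
(assuming the MPI-3 const-qualified signature provided by Cray MPICH), to be
compiled into the application or into a library linked ahead of the MPI
library:

#include <mpi.h>
#include <stdio.h>

static long alltoallv_calls = 0;

/* intercept MPI_Alltoallv via the standard MPI profiling interface and
   forward to the real implementation */
int MPI_Alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[],
                  MPI_Datatype sendtype, void *recvbuf, const int recvcounts[],
                  const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm)
{
  alltoallv_calls++; /* count every invocation */
  return PMPI_Alltoallv(sendbuf, sendcounts, sdispls, sendtype,
                        recvbuf, recvcounts, rdispls, recvtype, comm);
}

/* report the per-rank total when MPI shuts down */
int MPI_Finalize(void)
{
  int rank;
  PMPI_Comm_rank(MPI_COMM_WORLD, &rank);
  printf("[rank %d] MPI_Alltoallv calls: %ld\n", rank, alltoallv_calls);
  return PMPI_Finalize();
}

If the counter stays at zero during the CG solve, the Alltoallv calls seen
earlier must come from somewhere other than the MatMult scatter.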