[petsc-users] Choosing VecScatter Method in Matrix-Vector Product

Dave May dave.mayhem23 at gmail.com
Wed Jan 22 09:29:58 CST 2020


On Wed 22. Jan 2020 at 16:12, Felix Huber <st107539 at stud.uni-stuttgart.de>
wrote:

> Hello,
>
> I am currently investigating why our code does not show the expected weak
> scaling behaviour in a CG solver.


Can you please send representative log files which characterize the lack of
scaling (include the full log_view)?

Are you using a KSP/PC configuration which should weak scale?

Thanks
Dave
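
For example, assuming the application binary is called ./app (a
placeholder name), a run along these lines should capture both the
solver configuration and the performance summary at every size of the
weak-scaling study:

  # -ksp_view reports the KSP/PC configuration actually used;
  # -log_view prints the performance summary at PetscFinalize()
  mpiexec -n 64 ./app -ksp_view -log_view > log_0064.txt 2>&1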


> Therefore I wanted to try out
> different communication methods for the VecScatter in the matrix-vector
> product. However, it seems like PETSc (version 3.7.6) always chooses
> either MPI_Alltoallv or MPI_Alltoallw when I pass different options via
> the PETSC_OPTIONS environment variable. Does anybody know why this
> doesn't work as I expected?
>
> The matrix is an MPIAIJ matrix created by a finite element
> discretization of a 3D Laplacian, so it only communicates with
> 'neighboring' MPI ranks. Not sure if it helps, but the code runs on a
> Cray XC40.
>
> I tried the `ssend`, `rsend`, `sendfirst`, and `reproduce` options (and
> no option at all) from
>
> https://www.mcs.anl.gov/petsc/petsc-3.7/docs/manualpages/Vec/VecScatterCreate.html
> all of which result in MPI_Alltoallv. When combined with `nopack`, the
> communication uses MPI_Alltoallw.
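
For what it's worth, I would expect those options to be picked up from
the environment with the -vecscatter_ prefix as listed on that manual
page. A minimal sketch (./app and the launcher are placeholders for
your setup; on an XC40 the launcher is typically aprun or srun):

  # ask the scatter used in MatMult to use synchronous sends
  export PETSC_OPTIONS="-vecscatter_ssend"
  aprun -n 64 ./app -log_view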
>
> Best regards,
> Felix
>
>