[petsc-users] Parallelizing a matrix-free code

Stefano Zampini stefano.zampini at gmail.com
Mon Oct 16 03:32:08 CDT 2017


2017-10-16 10:26 GMT+03:00 Michael Werner <michael.werner at dlr.de>:

> Hello,
>
> I'm having trouble parallelizing a matrix-free code with PETSc. In
> this code, I use an external CFD code to provide the matrix-vector product
> for an iterative solver in PETSc. To increase the convergence rate, I'm
> using an explicitly stored Jacobian matrix to precondition the solver. This
> works fine for serial runs. However, when I try to use multiple processes,
> I face the problem that PETSc decomposes the preconditioner matrix, and
> probably also the shell matrix, differently than the external CFD code
> decomposes the grid.
>
> The Jacobian matrix is built such that its rows and columns correspond
> to the global IDs of the individual points in my CFD mesh.
>
> The CFD code decomposes the domain based on the proximity of points to
> each other, so that the resulting subgrids are spatially coherent. However,
> since it's an unstructured grid, those subgrids are not necessarily made up
> of points with successive global IDs. This is a problem, since PETSc seems
> to partition the matrix into contiguous slices of rows.
>
> I'm not sure what the best approach to this problem might be. Is it
> possible to tell PETSc exactly which rows/columns it should assign to the
> individual processes?
>
>
If you are explicitly setting the values in your Jacobians via
MatSetValues(), you can create an ISLocalToGlobalMapping

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html

that maps the numbering you use for the Jacobians to its counterpart in
the CFD ordering, then call MatSetLocalToGlobalMapping on the matrix and
use MatSetValuesLocal with the same arguments you are passing to
MatSetValues now.
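
As a minimal sketch of that approach (the names nlocal, petsc_gid and the
wrapper function below are hypothetical, not from your code; petsc_gid[i]
is assumed to hold the PETSc global row that corresponds to the i-th point
in this rank's local CFD numbering):

  #include <petscmat.h>

  PetscErrorCode AttachMapping(Mat P, PetscInt nlocal, const PetscInt petsc_gid[])
  {
    ISLocalToGlobalMapping l2g;
    PetscErrorCode         ierr;

    /* build the local-to-global map, block size 1 */
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nlocal, petsc_gid,
                                        PETSC_COPY_VALUES, &l2g);CHKERRQ(ierr);
    /* use the same map for rows and columns of the preconditioner matrix */
    ierr = MatSetLocalToGlobalMapping(P, l2g, l2g);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingDestroy(&l2g);CHKERRQ(ierr);

    /* afterwards, insert entries with local indices, e.g.
       MatSetValuesLocal(P, 1, &irow_local, ncols, jcols_local, vals,
                         INSERT_VALUES); */
    return 0;
  }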

Otherwise, you can play with the application ordering (AO) routines
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html
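
A rough sketch of the AO route (again with hypothetical names; cfd_gid[]
holds the CFD global IDs of the nlocal points owned by this rank):

  #include <petscao.h>

  PetscErrorCode CfdToPetscIndices(PetscInt nlocal, const PetscInt cfd_gid[],
                                   PetscInt n, PetscInt idx[])
  {
    AO             ao;
    PetscErrorCode ierr;

    /* passing NULL for the PETSc ordering lets PETSc number the points
       contiguously, rank by rank */
    ierr = AOCreateBasic(PETSC_COMM_WORLD, nlocal, cfd_gid, NULL, &ao);CHKERRQ(ierr);

    /* translate CFD (application) indices to PETSc indices in place,
       before calling MatSetValues()/VecSetValues() */
    ierr = AOApplicationToPetsc(ao, n, idx);CHKERRQ(ierr);

    ierr = AODestroy(&ao);CHKERRQ(ierr);
    return 0;
  }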




-- 
Stefano

