[petsc-users] Using matrix-free with KSP
Barry Smith
bsmith at petsc.dev
Tue Sep 6 22:33:32 CDT 2022
> On Sep 6, 2022, at 11:00 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>
> I am using TS IMEX to solve a large DAE system. The DAE is obtained by applying the finite-volume method to the 3-D multi-species ion/neutral fluid equations together with the magnetic induction equation. The Jacobian for the stiff part is formed using MatSetValuesStencil(). The Jacobian matrix is very sparse, with no more than 10 nonzeros per row. MatSetValuesStencil() requires a local-to-global mapping (an ISLocalToGlobalMapping). Could you please instruct me how to use the local-to-global mapping?
DMDA automatically sets up the ISLocalToGlobalMapping, so you should not need to create it yourself.
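For example, with a DMDA the matrix returned by DMCreateMatrix() already has the mapping attached, so MatSetValuesStencil() can be called directly. A minimal sketch (the grid indices i, j, k, the component c, and the numerical values are placeholders standing in for your stencil):

    Mat         J;
    MatStencil  row, col[2];
    PetscScalar v[2];

    DMCreateMatrix(da, &J);                      /* mapping is attached internally */
    row.i = i; row.j = j; row.k = k; row.c = c;  /* global grid indices, not local */
    col[0] = row;                                /* diagonal entry */
    col[1] = row; col[1].i = i + 1;              /* east neighbor */
    v[0] = 2.0; v[1] = -1.0;
    MatSetValuesStencil(J, 1, &row, 2, col, v, INSERT_VALUES);
    MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);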
>
>
> I also tried using DMCreateMatrix() to create the Jacobian. While a local-to-global mapping is then not necessary, the matrix takes too much memory and requires 64-bit indices. I would prefer to take advantage of the sparsity of the Jacobian and preallocate the matrix to use as little memory as possible, so that the code can run on a multi-core desktop.
If using matrix-free does not work for you because the linear solves do not converge or converge too slowly, then you might be able to decrease the memory used by the matrix. DMCreateMatrix() does take advantage of sparsity and tries to preallocate only what it needs. Often what it preallocates is the best possible, but for multicomponent problems it has to assume full coupling among all the degrees of freedom that live at each grid point. How many components live at each grid point in your model, and are they all coupled to each other in the equations? If they are not fully coupled, you can use either DMDASetBlockFills() or DMDASetBlockFillsSparse() to indicate the reduced coupling that actually exists for your model; see the sketch below.
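As an illustration, here is a minimal sketch for a hypothetical dof = 3 field in which component 0 couples to all components while components 1 and 2 couple only to component 0 and themselves (the fill pattern is an assumption; adapt it to your equations):

    /* 1 = allocate the entry, 0 = omit it; row-major dof x dof arrays */
    PetscInt dfill[9] = {1, 1, 1,   /* coupling within one grid point     */
                         1, 1, 0,
                         1, 0, 1};
    PetscInt ofill[9] = {1, 0, 0,   /* coupling to neighboring grid points */
                         0, 1, 0,
                         0, 0, 1};
    DMDASetBlockFills(da, dfill, ofill);  /* must be called before DMCreateMatrix() */
    DMCreateMatrix(da, &J);

This only changes how much is preallocated and stored; the stencil-based assembly code stays the same.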
Barry
>
> Thank you very much for your advice.
>
> Jiannan