# [petsc-users] Using matrix-free with KSP

Tu, Jiannan Jiannan_Tu at uml.edu
Sun Sep 11 22:01:57 CDT 2022

Barry,

Creating the DMDA with periodic BC and using km = k-1; kp = k+1 even at the boundaries solves the "Argument out of range" problem for inserting elements at the boundaries in the k-index. Now the problem is due to wrapping around the north pole: when j = 0, the ghost index jm = -1 is reset to j = 0 on the other side of the pole, with the k-index replaced by kcm = (k + (kmax+1)/2) % (kmax+1). In my case, the "Argument out of range" error happens at global row/column (26, 5265026), exactly at the wrap from jm = -1 to j = 0 on the other side.

I am thinking about how to properly use ghost grid j = -1 by setting appropriate BC there and inserting at that location without wrapping.

Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Sunday, September 11, 2022 12:10 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP

CAUTION: This email was sent from outside the UMass Lowell network.

On Sep 11, 2022, at 9:56 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry,

Thank you.

Yes, the DMDA is created with periodic BC. But it may not be a simple periodic BC, since in the azimuthal direction the ghost grid point -1 is actually grid point Nmax, and Nmax+1 is grid point 0.

Consider the following: one has n points from 0 to n-1, where n = 4. With your definition of Nmax, Nmax is 3 and Nmax + 1 is 4:

-1   0  1   2  3  4
+    *   *   *   * +

Thus there is one ghost point at each end. The * represent regular grid points and the + the ghost locations. Is this your situation?

This is what we call periodic with a stencil width of 1 in DMDA, and DMCreateMatrix() should automatically provide the correct preallocation and nonzero structure.

Can you please do a MatView() on the result from DMCreateMatrix() for a very small grid, before you set any values? You should see the locations for the coupling across the periodic boundary.

It sounds like you have a more complicated cross-periodic coupling; we need to understand it exactly so we can determine how to proceed.

Barry

Still, the problem is how to properly preallocate the matrix. Otherwise there is either an "Argument out of range" error, or the code gets stuck inserting values if MatSetOption() is called.

Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Sunday, September 11, 2022 12:10 AM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP

Have you created the DMDA with the periodic boundary conditions arguments? Or are your boundary conditions not simply periodic?

Compare DMCreateMatrix_DA_3d_MPIAIJ() and DMCreateMatrix_DA_3d_MPIAIJ_Fill() in src/dm/impls/da/fdda.c I think they both attempt to handle the periodic case.

Barry

On Sep 10, 2022, at 10:14 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry,

I think I understand how to set values for the coupling within a grid point. But I am not clear about the way to set values for the coupling between grid points. When a component couples to itself at the above, below, right, left, front, and back grid points, the diagonal is set to 1 in the second array of DMDASetBlockFills(). What if more grid points are involved? How does the DMDA know where to allocate memory? In my case, a component couples to itself at the other end of the grid because of the periodic boundary condition. There is an error message "Argument out of range. Inserting a new nonzero at global row/column (26, 520526) into matrix" because of this far-away coupling. The column index 520526 corresponds to the other end of the grid in the azimuthal direction.

Thanks,
Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Saturday, September 10, 2022 1:10 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP

If you are using DMCreateMatrix(), then MatMPIAIJSetPreallocation is not needed; it is automatically handled by the DM.

It sounds like the values you set in DMDASetBlockFills() did not include all the coupling, so the preallocation was wrong, and hence the time to insert values was too long.

Barry

On Sep 10, 2022, at 8:16 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Just found that MatMPIAIJSetPreallocation is needed. Now Jacobian insertion and assembly are fast.

From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, September 8, 2022 9:53 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Subject: Re: [petsc-users] Using matrix-free with KSP

On Sep 8, 2022, at 8:34 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry,

Thank you.

I set up these two arrays according to the non-zero pattern of the Jacobian. DMDASetBlockFills() is then called after the call to DMCreateMatrix(da, &A).

Absolutely not! It must be called before DMCreateMatrix() so that the DM knows how to construct exactly the matrix you need.

Nevertheless, the program execution is still killed with exit code 9, which I assume is due to running out of memory. My desktop machine has 32 GB of RAM.

I ran the code with a smaller mesh, 100x45x90; the result is the same. With that number of grid points, 26 unknowns, and up to 10 non-zeros in each matrix row, the estimated memory to store 100x45x90x26x10 elements should be less than 1 GB (double precision). I am wondering why the matrix still takes too much memory. Maybe I have to use matrix-free?

Thank you,
Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, September 8, 2022 4:19 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Subject: Re: [petsc-users] Using matrix-free with KSP

On Sep 8, 2022, at 3:39 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry,

Thank you very much for the detailed description.

So the first array, dfill, is for the coupling within a grid point, and the second one, ofill, is for the coupling between neighboring grid points?

In the case of inter-grid coupling where a component is only coupled to itself (say x in your example is coupled to itself above, below, right, left, front, and back), how do I set values for the second array?

Then the off-diagonal one would just have values on its diagonal.

Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, September 8, 2022 2:12 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Using matrix-free with KSP

On Sep 8, 2022, at 1:47 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry,

Thank you very much.

DMDASetBlockFills() needs two input arrays specifying the sparsity pattern. In my case, I have 26 components at each grid point. Is it correct that I need two arrays of 26x26 elements, with value 1 for coupling and 0 for no coupling?

Exactly

Also how to determine when the value should be 1 or 0?

That is problem dependent. With paper and pencil you know, for your stencil, which components are coupled with other components at the same grid point and which components are coupled with components at neighboring grid points.

Note the second array contains information about all the neighboring grid points: above, below, right, left, front, and back. Normally the coupling is symmetric, so the nonzero structures of those 6 blocks are identical. If, for your particular problem, the nonzero structures of the six are not identical, use the union of the nonzero structures of all six.

An example: say you have x, y, z at each grid point. If, within a grid point, x is coupled to y and y is coupled to x, then the first array is

[1 1 0; 1 1 0; 0 0 1]; the zeros are because x is not coupled to z and y is not coupled to z.

For the coupling between neighboring grid points, say x is coupled to z and itself, z is coupled to x and itself, and y is not coupled to itself; then the second array is

[1 0 1; 0 0 0; 1 0 1]; the middle row is zero because y is not coupled to anything, and the zeros in the first and last rows are because x and z are not coupled to y.

Each component involves a 7-point stencil, (i-1, j, k), (i, j, k), (i+1, j, k), (i, j-1, k), (i, j+1, k), (i, j, k-1), and (i, j, k+1), and couples with several other components.

Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Wednesday, September 7, 2022 11:53 AM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Using matrix-free with KSP

DMDASetBlockFills() or DMDASetBlockFillsSparse() (they are just different ways of providing the same information) will help you here enormously; your type of case is exactly what they were designed for.

Barry

On Sep 7, 2022, at 10:47 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

Barry and Hong,

Thank you.

There are 26 components at each grid point, and they are not fully coupled in terms of the stiff functions. The mutual coupling involves only about 6 components.

I would prefer not to use matrix-free, since the Jacobian is not difficult to calculate and there are only up to 10 non-zeros in each row. I'll try DMDASetBlockFills() or DMDASetBlockFillsSparse() and see how much they can reduce the memory usage.

Jiannan

From: Barry Smith <bsmith at petsc.dev>
Sent: Tuesday, September 6, 2022 11:33 PM
To: Tu, Jiannan <Jiannan_Tu at uml.edu>
Cc: petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP

On Sep 6, 2022, at 11:00 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:

I am using TS IMEX to solve a large DAE system. The DAE is obtained by applying the finite volume (FV) method to the 3-D multi-species ion/neutral fluid equations with the magnetic induction equation. The Jacobian for the stiff part is formed by using MatSetValuesStencil(). The Jacobian matrix is very sparse, with no more than 10 non-zeros in each row. MatSetValuesStencil() requires a local-to-global mapping set up via an ISLocalToGlobalMapping. Could you please instruct me how to use the local-to-global mapping?

DMDA automatically sets up the ISLocalToGlobalMapping, so you should not need to create it yourself.

I also tried using DMCreateMatrix() to create the Jacobian. While the local-to-global mapping is then not necessary, the matrix takes too much memory and requires 64-bit indices. I would prefer to take advantage of the sparsity of the Jacobian and preallocate the matrix to use as little memory as possible, so that the code can run on a multi-core desktop.

If using matrix-free does not work for you because the linear solves do not converge or converge too slowly, then you might be able to decrease the memory used for the matrix. DMCreateMatrix() does take advantage of sparsity and tries to preallocate only what it needs. Often what it preallocates is the best possible, but for multicomponent problems it has to assume full coupling among all the degrees of freedom that live at each grid point. How many components live at each grid point in your model, and are they all coupled to each other in the equations? If they are not fully coupled, you can use either DMDASetBlockFills() or DMDASetBlockFillsSparse() to indicate the reduced coupling that actually exists for your model.

Barry

Thank you very much for your advice.

Jiannan
