[petsc-users] Using matrix-free with KSP

Barry Smith bsmith at petsc.dev
Mon Sep 12 15:17:46 CDT 2022


  I understand in principle, I just cannot wrap my head around a sphere :-)


> On Sep 12, 2022, at 3:45 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
> 
> The simulation domain is in spherical coordinates. Along the co-latitude direction, j = 0 lies on one side of the pole and j = -1 (in fact index 0) on the other side. If the longitude index k for j = 0 corresponds to, say, 0 degrees, then for j = -1 it corresponds to 180 degrees. kcm accounts for this change in k-index when crossing from one side of the pole to the other.
> 
> I'll see if I can solve the problem with the method you suggested. 
> 
> Thank you very much.
> Jiannan
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Monday, September 12, 2022 9:31 AM
> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
> Cc: petsc-users <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Using matrix-free with KSP
>  
>   I am not able to visualize the pattern you refer to, but I am pretty sure it is not a type of connectivity that DMDA handles by default, since DMDA is for relatively simple structured meshes. Here is what I suggest.
> 
>    The routine that does the preallocation for the DMDA you are using is in src/dm/impls/da/fdda.c, called DMCreateMatrix_DA_3d_MPIAIJ_Fill(). On each MPI rank it determines the connectivity of each local grid vertex (called the variable row in the routine) with all the other grid vertices (these are stored temporarily in the array cols[]).
> The call to MatPreallocateSetLocal() takes this coupling information and stores it temporarily. The calls to 
> 
>   PetscCall(MatSeqAIJSetPreallocation(J, 0, dnz));
>   PetscCall(MatMPIAIJSetPreallocation(J, 0, dnz, 0, onz));
>   MatPreallocateEnd(dnz, onz);
>   PetscCall(MatSetLocalToGlobalMapping(J, ltog, ltog));
> 
> at the end of the loop over the local grid vertices then provide this information to the matrix, so it has the correct preallocation and the correct local-to-global mapping needed by MatSetValuesStencil().
> 
> The same loop structure is then run again, inserting zero values into the matrix at all the correctly preallocated locations.
> 
> You can copy this routine and modify it slightly to include the "special coupling" that your problem requires. I do not understand your formula for kcm, so I cannot tell you exactly what you need to change, but it may correspond to remapping the kk values at the extremes of the loop for (kk = kstart; kk < kend + 1; kk++) { so that the
> values nc * (slot + ii + gnx * jj + gnx * gny * kk) correspond to the correct coupling location. I recommend drawing a picture of a case with a very small size for nx, ny, and nz so you can see exactly how the mapping takes place, and tracing through your modified code to check that it does what you need.
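> 
> As a minimal sketch of that remapping — assuming, from your description, that crossing the pole at j = 0 maps j = -1 back to j = 0 on the far side with k shifted by half the longitude circle (the helper and the name WrapAcrossPole are hypothetical, not code from fdda.c):
> 
>   /* Map a column index (cj, ck) that stepped across the north pole
>      (cj < 0) back onto the grid; Nk+1 is the number of longitude points
>      and ck is assumed already wrapped into [0, Nk]. This encodes the kcm
>      rule described above and is an assumption about your geometry. */
>   static void WrapAcrossPole(PetscInt Nk, PetscInt *cj, PetscInt *ck)
>   {
>     if (*cj < 0) {
>       *cj = -(*cj) - 1;                      /* j = -1 -> j = 0 on the far side */
>       *ck = (*ck + (Nk + 1) / 2) % (Nk + 1); /* 180-degree shift in longitude   */
>     }
>   }
> 
> In the copied loop you would apply this to the absolute indices (j + jj, k + kk) before forming the column value nc * (slot + ii + gnx * jj + gnx * gny * kk), recomputing the offsets from the wrapped indices.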
> 
> Barry
> 
>> On Sep 11, 2022, at 11:01 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>> 
>> Barry,
>>  
>> Creating the DMDA with periodic BCs and using km = k-1; kp = k+1 even at the boundaries solves the "Argument out of range" problem for inserting elements at the boundaries of the k-index. Now the problem is due to wrapping around the north pole: when j = 0, jm = -1 is reset to j = 0 but on the other side of the pole, with the k-index replaced by kcm = (k + (kmax+1)/2) % (kmax+1). In my case, the "Argument out of range" error happens at global row/column (26, 5265026), exactly at the wrap from jm = -1 to j = 0 on the other side.
>>  
>> I am thinking about how to properly use the ghost grid j = -1 by setting an appropriate BC there and inserting at that location without wrapping.
>>  
>> Jiannan
>>  
>> From: Barry Smith <bsmith at petsc.dev> 
>> Sent: Sunday, September 11, 2022 12:10 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users <petsc-users at mcs.anl.gov>
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> On Sep 11, 2022, at 9:56 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry,
>>  
>> Thank you.
>>  
>> Yes, the DMDA is created with periodic BC. But it might not be truly periodic, since in the azimuthal direction the ghost grid -1 is actually grid Nmax, and Nmax+1 is grid 0.
>>  
>>    Consider the following: one has n points from 0 to n-1, where n = 4. So I think with your definition of Nmax, Nmax is 3 and Nmax + 1 is 4:
>>  
>>  
>>                     -1  0  1  2  3  4
>>                      +  *  *  *  *  +
>>  
>> Thus there is one ghost point at each end. The * represent regular grid points and the + ghost locations. Is this your situation? 
>>  
>> This is what we call periodic with a stencil width of 1 in DMDA, and DMCreateMatrix() should automatically provide the correct preallocation and nonzero structure.
>>  
>> Can you please do a MatView() on the result of DMCreateMatrix() for a very small grid, before you set any values? You should see the locations for the coupling across the periodic boundary.
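>>  
>> A minimal sketch of that experiment, assuming a tiny made-up grid (4x4x4, 2 dof per point, periodic only in the third direction):
>>  
>>   #include <petscdmda.h>
>>   int main(int argc, char **argv)
>>   {
>>     DM  da;
>>     Mat A;
>>     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
>>     PetscCall(DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>>                            DM_BOUNDARY_PERIODIC, DMDA_STENCIL_STAR,
>>                            4, 4, 4,                                  /* tiny grid */
>>                            PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
>>                            2, 1,                                     /* dof, stencil width */
>>                            NULL, NULL, NULL, &da));
>>     PetscCall(DMSetUp(da));
>>     PetscCall(DMCreateMatrix(da, &A));
>>     /* zeros are already stored at every preallocated location, so the
>>        nonzero structure, including the periodic coupling, is visible */
>>     PetscCall(MatView(A, PETSC_VIEWER_STDOUT_WORLD));
>>     PetscCall(MatDestroy(&A));
>>     PetscCall(DMDestroy(&da));
>>     PetscCall(PetscFinalize());
>>     return 0;
>>   }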
>>  
>> It sounds like you have a more complicated cross-periodic coupling; we need to understand it exactly so we can determine how to proceed.
>>  
>>   Barry
>>  
>> Still the problem is how to properly preallocate the matrix. Otherwise I get either an "Argument out of range" error, or the code gets stuck inserting values if MatSetOption() is called.
>>  
>> Jiannan
>>  
>> From: Barry Smith <bsmith at petsc.dev> 
>> Sent: Sunday, September 11, 2022 12:10 AM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users <petsc-users at mcs.anl.gov>
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>>   Have you created the DMDA with the periodic boundary conditions arguments? Or are your boundary conditions not simply periodic? 
>>  
>>   Compare DMCreateMatrix_DA_3d_MPIAIJ() and DMCreateMatrix_DA_3d_MPIAIJ_Fill() in src/dm/impls/da/fdda.c; I think they both attempt to handle the periodic case.
>>  
>>   Barry
>>  
>>  
>> On Sep 10, 2022, at 10:14 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry,
>>  
>> I think I understand how to set values for the intra-grid coupling. But I am not clear about the way to set values for the inter-grid coupling. When a component couples to itself at the above, below, right, left, front, and back grid points, the diagonal is set to 1 in the second array of DMDASetBlockFills(). What if more grid points are involved? How does DMDA know where to allocate memory? In my case, a component couples to itself at the other end of the grid because of the periodic boundary condition. There is an error message "Argument out of range. Insert new nonzero at global row/column (26, 520526) into matrix" because of this far-away coupling. The column index 520526 is related to the other end of the grid in the azimuthal direction.
>>  
>> Thanks,
>> Jiannan
>>  
>> From: Barry Smith <bsmith at petsc.dev> 
>> Sent: Saturday, September 10, 2022 1:10 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users <petsc-users at mcs.anl.gov>
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>>   If you are using DMCreateMatrix(), then MatMPIAIJSetPreallocation() is not needed; it is automatically handled by the DM.
>>  
>>   It sounds like the values you set in DMDASetBlockFills() did not include all the coupling, hence the preallocation was wrong, hence the time to insert values was too long.
>>  
>>   Barry
>>  
>>  
>> On Sep 10, 2022, at 8:16 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> I just found that MatMPIAIJSetPreallocation() is needed. Now Jacobian insertion and assembly are fast.
>>  
>> From: Barry Smith <bsmith at petsc.dev> 
>> Sent: Thursday, September 8, 2022 9:53 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> On Sep 8, 2022, at 8:34 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry,
>>  
>> Thank you.
>>  
>> I set up these two arrays according to the non-zero pattern of the Jacobian. The DMDASetBlockFills() is then called after the call to DMCreateMatrix(da, &A).
>>  
>>   Absolutely not! It must be called before DMCreateMatrix() so that the DM knows how to construct exactly the matrix you need.
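>>  
>> In other words, the order is (a two-line sketch, with da and the dfill/ofill arrays assumed to exist already):
>>  
>>   PetscCall(DMDASetBlockFills(da, dfill, ofill)); /* first: describe the reduced coupling      */
>>   PetscCall(DMCreateMatrix(da, &A));              /* then: the matrix is preallocated to match */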
>>  
>>  
>> Nevertheless, the program execution is still killed with exit code 9, which I assume is due to running out of memory. My desktop machine has 32 GB RAM.
>>  
>> I ran the code with a smaller mesh, 100x45x90; the result is the same. With that number of grid points, 26 unknowns, and up to 10 non-zeros in each matrix row, the estimated memory to store 100x45x90x26x10 elements should be less than 1 GB (double precision). I am wondering why the matrix still takes too much memory. Maybe I have to use matrix-free?
>>  
>> Thank you,
>> Jiannan
>>  
>> From: Barry Smith <bsmith at petsc.dev> 
>> Sent: Thursday, September 8, 2022 4:19 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> On Sep 8, 2022, at 3:39 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry,
>>  
>> Thank you very much for the detailed description.
>>  
>> So the first array, dfill, is for the coupling within a grid point, and the second one, ofill, is for the coupling between grid points?
>>  
>> In the case of coupling between grid points, if a component is only coupled to itself — say x in your example is coupled to itself above, below, right, left, front, and back — how do I set values for the second array? 
>>  
>> Then the off-diagonal one would just have values on its diagonal.
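>> 
>> For example (a sketch for 3 components rather than your 26), that second array would be the identity pattern:
>> 
>>   PetscInt ofill[9] = {1, 0, 0,  /* each component couples only  */
>>                        0, 1, 0,  /* to itself at the neighboring */
>>                        0, 0, 1}; /* grid points                  */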
>>  
>> Jiannan
>>  
>> From: Barry Smith <bsmith at petsc.dev>
>> Sent: Thursday, September 8, 2022 2:12 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users at mcs.anl.gov
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> On Sep 8, 2022, at 1:47 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry,
>>  
>> Thank you very much.
>>  
>> DMDASetBlockFills() needs two input arrays specifying the sparsity pattern. In my case, I have 26 components at each grid point. Is it correct that I need two arrays of 26x26 elements, with value 1 for coupling and 0 for no coupling?
>>  
>>    Exactly
>>  
>> Also, how do I determine when the value should be 1 or 0? 
>>  
>>   That is problem dependent. With paper and pencil, you know for your stencil which components are coupled with other components at the same grid point and which components are coupled with components at neighboring grid points. 
>>  
>>   Note the second array contains information about all the neighboring grid points: above, below, right, left, front, and back. Normally the coupling is symmetric, so the nonzero structures of those 6 blocks are identical. If, for your particular problem, the nonzero structures of the six are not identical, use the union of the nonzero structures of all six of them.
>>  
>>  
>>   2d example: say you have x, y, z at each grid point, and within a grid point x is coupled to y and y is coupled to x (and each of x, y, z to itself). Then the first array is
>>  
>> [1 1 0; 1 1 0; 0 0 1]; the zeros are because x is not coupled to z and y is not coupled to z.
>>  
>> For the coupling between grid points, say x is coupled to z and itself, z is coupled to x and itself, and y is not coupled to anything. Then the second array is
>>  
>> [1 0 1; 0 0 0; 1 0 1]; the middle row is zero because y is not coupled to anything, and the zeros in the first and last rows are because x and z are not coupled to y.
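>>  
>> Written out as code, this example might look like the following sketch (the helper name SetExampleFills is made up; da is assumed to have dof = 3 and DMCreateMatrix() must not have been called yet):
>>  
>>   #include <petscdmda.h>
>>   /* Entry [row*3 + col] is 1 when component `row` couples to `col`. */
>>   static PetscErrorCode SetExampleFills(DM da)
>>   {
>>     PetscInt dfill[9] = {1, 1, 0,  /* within a point: x couples to x and y */
>>                          1, 1, 0,  /* y couples to x and y                 */
>>                          0, 0, 1}; /* z couples only to itself             */
>>     PetscInt ofill[9] = {1, 0, 1,  /* to neighbors: x couples to x and z   */
>>                          0, 0, 0,  /* y couples to nothing                 */
>>                          1, 0, 1}; /* z couples to x and z                 */
>>     PetscFunctionBeginUser;
>>     PetscCall(DMDASetBlockFills(da, dfill, ofill));
>>     PetscFunctionReturn(0);
>>   }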
>>  
>>  
>>  
>> Each component involves a 7-point stencil — (i-1, j, k), (i, j, k), (i+1, j, k), (i, j-1, k), (i, j+1, k), (i, j, k-1), and (i, j, k+1) — and couples with several other components.
>>  
>> Jiannan
>>  
>>  
>>  
>> From: Barry Smith <bsmith at petsc.dev>
>> Sent: Wednesday, September 7, 2022 11:53 AM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users at mcs.anl.gov
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> DMDASetBlockFills() or DMDASetBlockFillsSparse() (they are just different ways of providing the same information) will help you here enormously; your type of case is exactly what they were designed for.
>>  
>>  Barry
>>  
>> On Sep 7, 2022, at 10:47 AM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> Barry and Hong,
>>  
>> Thank you. 
>>  
>> There are 26 components at each grid point, and they are not fully coupled in terms of the stiff functions. Mutual coupling is among about 6 components. 
>>  
>> I would prefer not to use matrix-free, since the Jacobian is not difficult to calculate and has only up to 10 non-zeros in each row. I'll try DMDASetBlockFills() or DMDASetBlockFillsSparse() and see how much they can reduce the memory usage.
>>  
>> Jiannan
>> From: Barry Smith <bsmith at petsc.dev>
>> Sent: Tuesday, September 6, 2022 11:33 PM
>> To: Tu, Jiannan <Jiannan_Tu at uml.edu>
>> Cc: petsc-users at mcs.anl.gov
>> Subject: Re: [petsc-users] Using matrix-free with KSP
>>  
>> On Sep 6, 2022, at 11:00 PM, Tu, Jiannan <Jiannan_Tu at uml.edu> wrote:
>>  
>> I am using TS IMEX to solve a large DAE system. The DAE is obtained by applying the finite volume (FV) method to the 3-D multi-species ion/neutral fluid equations together with the magnetic induction equation. The Jacobian for the stiff part is formed using MatSetValuesStencil(). The Jacobian matrix is very sparse, with no more than 10 non-zeros in each row. MatSetValuesStencil() requires a local-to-global mapping (an ISLocalToGlobalMapping). Could you please instruct me how to use the local-to-global mapping? 
>>  
>>    The DMDA automatically sets up the ISLocalToGlobalMapping, so you should not need to.
>>  
>> I also tried using DMCreateMatrix() to create the Jacobian. While the local-to-global mapping is then not necessary, the matrix takes too much memory and requires 64-bit indices. I would prefer to take advantage of the sparsity of the Jacobian and preallocate the matrix to use as little memory as possible, so that the code can run on a multi-core desktop.
>>  
>>   If using matrix-free does not work for you because the linear solves do not converge or converge too slowly, then you might be able to decrease the memory used in the matrix. DMCreateMatrix() does take advantage of sparsity and tries to preallocate only what it needs. Often what it preallocates is the best possible, but for multicomponent problems it has to assume there is full coupling among all the degrees of freedom that live at each grid point. How many components live at each grid point in your model, and are they not all coupled to each other in the equations? If they are not fully coupled, you can use either DMDASetBlockFills() or DMDASetBlockFillsSparse() to indicate the reduced coupling that actually exists for your model.
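>> 
>> For scale: with full coupling, a 7-point stencil and 26 components preallocates 7 x 26 = 182 nonzeros per row, versus the roughly 10 you actually have. A hedged sketch of building reduced fill arrays for 26 components (the specific couplings marked below are placeholders, not your real equations):
>> 
>>   #include <petscdmda.h>
>>   #define NC 26
>>   static PetscErrorCode SetReducedCoupling(DM da)
>>   {
>>     PetscInt dfill[NC * NC], ofill[NC * NC], c;
>>     PetscFunctionBeginUser;
>>     PetscCall(PetscMemzero(dfill, sizeof(dfill)));
>>     PetscCall(PetscMemzero(ofill, sizeof(ofill)));
>>     for (c = 0; c < NC; c++) dfill[c * NC + c] = 1; /* every component to itself        */
>>     dfill[0 * NC + 5] = dfill[5 * NC + 0] = 1;      /* placeholder: 0 <-> 5 coupled     */
>>     ofill[0 * NC + 0] = 1;                          /* placeholder: 0 to itself at the  */
>>                                                     /* neighboring grid points          */
>>     PetscCall(DMDASetBlockFills(da, dfill, ofill));
>>     PetscFunctionReturn(0);
>>   }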
>>  
>>   Barry
>>  
>> Thank you very much for your advice.
>>  
>> Jiannan
