[petsc-users] DMCreateMatrix with some dense row

Sang pham van pvsang002 at gmail.com
Tue Oct 27 20:55:21 CDT 2015


Hi Matt,

Can you please answer my second question, about how PETSc interprets the
column indices? Here it is again:

One more question: when preallocating for a node, as you can see in my code
below (based on the original PETSc code), since I have 3 equations I
preallocate 3 rows (stored in the rows[] array), and each row has a number of
column indices; the column indices of all three rows are stored together in
the cols[] array. At this point I don't understand how PETSc knows which
column indices in cols[] belong to the first (second/third) row. Please help
me understand this.
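
To make the question concrete, this is the pattern, trimmed and simplified from
the code further down (sizes assume nc = 3 and the 27-point box stencil):

    PetscInt rows[3];       /* the 3 local row indices for this node          */
    PetscInt cols[3*3*27];  /* column indices of ALL three rows, in one array */
    PetscInt cnt = 0;       /* total number of entries placed into cols[]     */
    /* ... fill rows[0..2] and cols[0..cnt-1] for this node ... */
    ierr = MatPreallocateSetLocal(ltog,nc,rows,ltog,cnt,cols,dnz,onz);CHKERRQ(ierr);

My doubt is whether PETSc somehow splits cols[] among the nc rows, or simply
counts every column for every row.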

Thank you.

Pham


On Wed, Oct 28, 2015 at 8:48 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Oct 27, 2015 at 8:18 PM, Sang pham van <pvsang002 at gmail.com>
> wrote:
>
>> Hi Barry,
>>
>> I made my function for preallocating the DM matrix. I am using immersed
>> boundary method to solve a problem of solid mechanics (3D, linear
>> elasticity).
>> At every solid nodes, I have 3 unknowns for displacement in 3 directions
>> (nc=3), my DM has box stencil. Thus, at a node I have 3 equations, for each
>> equation I preallocate 27*3  = 81 entries.
>> When running the code, I got many errors like the following when setting
>> values into the preallocated matrix:
>>
>
> You might have memory corruption. Run with valgrind.
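>
> For example, something along these lines (adjust the options to your setup):
>
>   mpiexec -n 1 valgrind --tool=memcheck -q --num-callers=20 ./sasMAIN <your options>
>
> and look for "Invalid read"/"Invalid write" reports that appear before the PETSc error.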
>
>   Thanks,
>
>     Matt
>
>
>> [0]PETSC ERROR: --------------------- Error Message
>> ------------------------------------
>> [0]PETSC ERROR: Argument out of range!
>> [0]PETSC ERROR: Local index 1496271904 too large 121500 (max) at 0!
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: Petsc Release Version 3.4.5, Jun, 29, 2014
>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [0]PETSC ERROR: See docs/index.html for manual pages.
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: ./sasMAIN on a arch-linux-cxx-opt named sang by sang Wed
>> Oct 28 07:34:37 2015
>> [0]PETSC ERROR: Libraries linked from
>> /home/sang/petsc/petsc-3.4.5/arch-linux-cxx-opt/lib
>> [0]PETSC ERROR: Configure run at Thu Sep  3 23:04:15 2015
>> [0]PETSC ERROR: Configure options --download-mpich=1
>> --with-scalar-type=real --with-clanguage=cxx --download-mumps=1
>> --download-blacs=1 --download-parmetis=1 --download-scalapack=1
>> --with-debugging=0 --download-hypre=1 --with-fc=gfortran --download-metis=1
>> -download-cmake=1 --download-f-blas-lapack=1
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: ISLocalToGlobalMappingApply() line 444 in
>> /home/sang/petsc/petsc-3.4.5/src/vec/is/utils/isltog.c
>> [0]PETSC ERROR: MatSetValuesLocal() line 1968 in
>> /home/sang/petsc/petsc-3.4.5/src/mat/interface/matrix.c
>> [0]PETSC ERROR: MatSetValuesStencil() line 1339 in
>> /home/sang/petsc/petsc-3.4.5/src/mat/interface/matrix.c
>>
>> Can you please explain what the possible reason for this error is? It
>> looks like there is a problem with the local-to-global index mapping, but
>> I just copied that part from the original code.
>> One more question: when preallocating for a node, as you can see in my
>> code below (based on the original PETSc code), since I have 3 equations I
>> preallocate 3 rows (stored in the rows[] array), and each row has a number
>> of column indices; the column indices of all three rows are stored together
>> in the cols[] array. At this point I don't understand how PETSc knows which
>> column indices in cols[] belong to the first (second/third) row. Please
>> help me understand this.
>>
>> Thank you very much.
>> Pham
>>
>> Below is my code for preallocating the DM matrix:
>>
>> PetscErrorCode sasMatVecPetsc::DMCreateMatrix_DA_3d_MPIAIJ_pvs(DM da, Mat pMat, sasSmesh* mesh,
>>                        sasVector<int>& solidcells, sasVector<int>& isolidcellnearBI,
>>                        sasVector<int>& solidcellnearBI, sasVector<int>& forcing_or_ghost_cells)
>> {
>>   PetscErrorCode         ierr;
>>   PetscInt               xs,ys,nx,ny,i,j,slot,gxs,gys,gnx,gny;
>>   PetscInt               m,n,dim,s,*cols = NULL,k,nc,*rows = NULL,col,cnt,l,p,*dnz = NULL,*onz = NULL;
>>   PetscInt               istart,iend,jstart,jend,kstart,kend,zs,nz,gzs,gnz,ii,jj,kk;
>>   MPI_Comm               comm;
>>   PetscScalar            *values;
>>   DMDABoundaryType       bx,by,bz;
>>   ISLocalToGlobalMapping ltog,ltogb;
>>   DMDAStencilType        st;
>>
>>   PetscFunctionBegin;
>>
>>   ierr = DMDAGetInfo(da,&dim,&m,&n,&p,0,0,0,&nc,&s,&bx,&by,&bz,&st);CHKERRQ(ierr); // nc = 3
>>   col  = 2*s + 1;
>>   ierr = DMDAGetCorners(da,&xs,&ys,&zs,&nx,&ny,&nz);CHKERRQ(ierr);
>>   ierr = DMDAGetGhostCorners(da,&gxs,&gys,&gzs,&gnx,&gny,&gnz);CHKERRQ(ierr);
>>   ierr = PetscObjectGetComm((PetscObject)da,&comm);CHKERRQ(ierr);
>>   ierr = PetscMalloc2(nc,PetscInt,&rows,col*col*col*nc*nc,PetscInt,&cols);CHKERRQ(ierr);
>>   ierr = DMGetLocalToGlobalMapping(da,&ltog);CHKERRQ(ierr);
>>   ierr = MatPreallocateInitialize(comm,nc*nx*ny*nz,nc*nx*ny*nz,dnz,onz);CHKERRQ(ierr);
>>   nCells = mesh->nCells;
>>   sasVector<int> cell_type(mesh->nCells,0);
>>   for(int i=0;i<solidcells.N;i++)  cell_type[solidcells[i]] = 1;
>>   for(int i=0;i<forcing_or_ghost_cells.N;i++)  cell_type[forcing_or_ghost_cells[i]] = 2;
>>   for(i = 0;i<nCells;i++)
>>   {
>>     slot = mesh->Cells_ijk[i].i - gxs + gnx*(mesh->Cells_ijk[i].j - gys) + gnx*gny*(mesh->Cells_ijk[i].k - gzs);
>>     cnt = 0;
>>     if(cell_type[i]==0) /// fluid cells
>>     {
>>       for (l=0; l<nc; l++)
>>       {
>>         cols[cnt++] = l + nc*slot;
>>         rows[l] = l + nc*slot;
>>       }
>>     }
>>
>>     if(cell_type[i]==1) /// solid cells:
>>     {
>>       for (l=0; l<nc; l++)
>>       {
>>         for (int ll=0; ll<nc; ll++)
>>           for (ii=-1; ii<2; ii++) {
>>             for (jj=-1; jj<2; jj++) {
>>               for (kk=-1; kk<2; kk++) {
>>                 cols[cnt++] = ll + nc*(slot + ii + gnx*jj + gnx*gny*kk);
>>               }
>>             }
>>           }
>>         rows[l] = l + nc*(slot);
>>       }
>>     }
>>     if(cell_type[i]!=2)
>>       ierr = MatPreallocateSetLocal(ltog,nc,rows,ltog,cnt,cols,dnz,onz);CHKERRQ(ierr);
>>   }
>>   MatCreate(comm,&pMat);
>>   MatSetSizes(pMat,nc*gnx*gny*gnz,nc*gnx*gny*gnz,nc*gnx*gny*gnz,nc*gnx*gny*gnz);
>>   MatSetType(pMat,MATAIJ);
>>   ierr = MatSetBlockSize(pMat,nc);CHKERRQ(ierr);
>>   ierr = MatSeqAIJSetPreallocation(pMat,0,dnz);CHKERRQ(ierr);
>>   ierr = MatMPIAIJSetPreallocation(pMat,0,dnz,0,onz);CHKERRQ(ierr);
>>   ierr = MatPreallocateFinalize(dnz,onz);CHKERRQ(ierr);
>>   ierr = MatSetLocalToGlobalMapping(pMat,ltog,ltog);CHKERRQ(ierr);
>>
>>   ierr = PetscFree2(rows,cols);CHKERRQ(ierr);
>>
>>   PetscFunctionReturn(0);
>> }
>>
>> On Sun, Oct 25, 2015 at 12:46 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>> > On Oct 24, 2015, at 1:43 AM, Sang pham van <pvsang002 at gmail.com>
>>> wrote:
>>> >
>>> > Thank you, Dave,
>>> >
>>> > Can I just call My_DMCreateMatrix_DA_3d_MPIAIJ(DM da, Mat J, void*
>>> my_context) when I need to create my matrix, and forget about
>>> DMDASetGetMatrix()?
>>>
>>>   You can do that if that is all you need. In some contexts when
>>> something else in PETSc needs to create a specific matrix (like with
>>> geometric multigrid) then that code calls DMCreateMatrix() which would then
>>> use your creation routine.
>>>
>>>    So, in summary, you can start by just calling your own routine and
>>> convert it to use DMDASetGetMatrix() later if you need to.
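>>>
>>>    For now that can be as simple as calling it directly wherever you need
>>> the matrix (a sketch, using the three-argument form you described; J and
>>> my_context are whatever your code already has in hand):
>>>
>>>      ierr = My_DMCreateMatrix_DA_3d_MPIAIJ(da, J, (void*)my_context);CHKERRQ(ierr);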
>>>
>>>   Barry
>>>
>>> >
>>> > Thank you.
>>> >
>>> > Pham
>>> >
>>> > On Sat, Oct 24, 2015 at 1:33 PM, Dave May <dave.mayhem23 at gmail.com>
>>> wrote:
>>> > If you need to access a user-defined context from within your
>>> > CreateMatrix function, you can attach it to the DM via PetscObjectCompose:
>>> >
>>> >
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscObjectCompose.html
>>> >
>>> > If your context is not a PETSc object, you can use
>>> > PetscContainerCreate():
>>> >
>>> >
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscContainerCreate.html#PetscContainerCreate
>>> >
>>> > You would then set your user context pointer inside the container and
>>> then use PetscObjectCompose() to attach the container to the DM
>>> >
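>>> > Very roughly, something like this (untested sketch; the composed name
>>> > "MyMatCtx" and the MyCtx type are placeholders for your own):
>>> >
>>> >   MyCtx          *my_context = ...;  /* your non-PETSc context */
>>> >   PetscContainer container;
>>> >   MPI_Comm       comm;
>>> >   ierr = PetscObjectGetComm((PetscObject)da,&comm);CHKERRQ(ierr);
>>> >   ierr = PetscContainerCreate(comm,&container);CHKERRQ(ierr);
>>> >   ierr = PetscContainerSetPointer(container,(void*)my_context);CHKERRQ(ierr);
>>> >   ierr = PetscObjectCompose((PetscObject)da,"MyMatCtx",(PetscObject)container);CHKERRQ(ierr);
>>> >   ierr = PetscContainerDestroy(&container);CHKERRQ(ierr); /* the DM keeps its own reference */
>>> >
>>> > and inside your matrix-creation routine:
>>> >
>>> >   PetscContainer container;
>>> >   void           *ptr;
>>> >   ierr = PetscObjectQuery((PetscObject)da,"MyMatCtx",(PetscObject*)&container);CHKERRQ(ierr);
>>> >   ierr = PetscContainerGetPointer(container,&ptr);CHKERRQ(ierr);
>>> >   /* ptr is the my_context pointer stored above */
>>> >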
>>> > Thanks,
>>> >   Dave
>>> >
>>> >
>>> > On 24 October 2015 at 06:04, Sang pham van <pvsang002 at gmail.com>
>>> wrote:
>>> > Hi Barry,
>>> >
>>> > The routine DMCreateMatrix_DA_3d_MPIAIJ has two input arguments (DM da,
>>> > Mat J), and the function pointer passed to DMDASetGetMatrix() only accepts
>>> > a function with those two arguments.
>>> > As you suggested, I am writing a routine (based on
>>> > DMCreateMatrix_DA_3d_MPIAIJ()) to preallocate the matrix in the way I wish;
>>> > to do that, I think it needs one more input argument:
>>> > My_DMCreateMatrix_DA_3d_MPIAIJ(DM da, Mat J, void* my_context). But
>>> > DMDASetGetMatrix() will not accept a pointer to this function. Please
>>> > suggest a way to overcome this.
>>> >
>>> > Thank you.
>>> >
>>> > Pham
>>> >
>>> >
>>> > On Fri, Sep 4, 2015 at 1:24 AM, Barry Smith <bsmith at mcs.anl.gov>
>>> wrote:
>>> >
>>> >     Look at DMCreateMatrix_DA_2d_MPIAIJ (or the 3d version if working
>>> > in 3d). You need to copy this routine and add whatever additional
>>> > preallocation information you need. Then call DMDASetGetMatrix() so that
>>> > the DM will use your routine to create the matrix for you.
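>>> >
>>> >    In outline (an untested sketch; "MyDMCreateMatrix" is a placeholder name,
>>> > and you should check the exact function-pointer signature DMDASetGetMatrix()
>>> > expects in your PETSc version):
>>> >
>>> >      PetscErrorCode MyDMCreateMatrix(DM da, Mat J)
>>> >      {
>>> >        PetscFunctionBegin;
>>> >        /* copy of DMCreateMatrix_DA_3d_MPIAIJ, plus your extra preallocation */
>>> >        PetscFunctionReturn(0);
>>> >      }
>>> >
>>> >      ierr = DMDASetGetMatrix(da, MyDMCreateMatrix);CHKERRQ(ierr);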
>>> >
>>> >    Barry
>>> >
>>> >
>>> >
>>> >
>>> > > On Sep 3, 2015, at 11:28 AM, Sang pham van <pvsang002 at gmail.com>
>>> wrote:
>>> > >
>>> > > Hi,
>>> > >
>>> > > I am using DMCreateMatrix to create a matrix from an existing DM object
>>> > > and a defined stencil.
>>> > > In my code, boundary nodes involve many inner nodes, so the matrix rows
>>> > > corresponding to boundary nodes are almost dense. How can I tell PETSc
>>> > > that those rows need to be preallocated with more entries? I don't want
>>> > > to use MatMPIAIJSetPreallocation() directly, since the advantages of the
>>> > > DM might be lost.
>>> > >
>>> > > Many thanks.
>>> > >
>>> > > Sam.
>>> > >
>>> > >
>>> >
>>> >
>>> >
>>> >
>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>

