[petsc-users] MatSetValues error with ViennaCL types

Matthew Knepley knepley at gmail.com
Wed Aug 15 16:32:24 CDT 2018


On Wed, Aug 15, 2018 at 5:20 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:

> It seems to boil down to this: I do not know how to preallocate a DM
> matrix correctly.
>

There is only one matrix type, Mat. There are no separate DM matrices. A DM
can create a matrix for you
using DMCreateMatrix(), but that is a Mat and it is preallocated correctly.
I am not sure what you are doing.
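
A minimal sketch of that flow, using the daDummy DMDA from the code below:

  call DMSetMatType(daDummy,MATMPIAIJ,ierr)
  call DMCreateMatrix(daDummy,A,ierr)
  ! A is an ordinary Mat, already preallocated for the DMDA stencil;
  ! insert values with MatSetValues and assemble as usual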

  Thanks,

    Matt


> The interesting part is that it only breaks when I need to populate a GPU
> matrix from MPI, so kudos on that, but it seems I need to do better in my
> code to get this setup working.
>
> Any help would be appreciated,
>
> Thanks,
>
>
>
> On Wed, Aug 15, 2018 at 2:15 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Wed, Aug 15, 2018 at 4:53 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:
>>
>>> Thanks Matthew,
>>>
>>> I try to do that when calling:
>>>
>>> call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
>>>
>>> But I am not aware of how to do this for the DM, if it needs something
>>> more specific or different.
>>>
>>
>> The error says that your preallocation is wrong for the values you are
>> putting in. The DM does not control either,
>> so I do not understand your email.
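
If you do preallocate manually, note that the scalar form above reserves 19
slots per row in both the diagonal and off-diagonal blocks; per-row arrays
let the reservation match the actual stencil. A minimal sketch, assuming
nloc is the local row count already computed from Tmpnat:

  PetscInt, allocatable :: d_nnz(:), o_nnz(:)

  allocate(d_nnz(nloc), o_nnz(nloc))
  ! d_nnz(i): nonzeros of local row i in columns owned by this rank;
  ! o_nnz(i): nonzeros in columns owned by other ranks. Constant 19 is
  ! the worst case for a 19-point stencil; tighter counts are better.
  d_nnz = 19
  o_nnz = 19
  call MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz,ierr)
  deallocate(d_nnz, o_nnz)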
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks,
>>>
>>> On Wed, Aug 15, 2018 at 1:51 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Wed, Aug 15, 2018 at 4:39 PM Manuel Valera <mvalera-w at sdsu.edu>
>>>> wrote:
>>>>
>>>>> Hello PETSc devs,
>>>>>
>>>>> I am running into an error when trying to use the MATMPIAIJVIENNACL
>>>>> matrix type in MPI runs; the same code works with the MATSEQAIJVIENNACL
>>>>> type on one processor. The error happens at each MatSetValues call in
>>>>> this specific configuration. It does not occur when using only the MPI
>>>>> DM matrix types.
>>>>>
>>>>
>>>> The DM properly preallocates the matrix. I am assuming you do not here.
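
If the goal is to keep the DM's preallocation with the ViennaCL types, one
approach (a sketch, assuming that calling MatSetType on A after
DMCreateMatrix rebuilds the matrix and discards the preallocation it came
with) is to set the types on the DM first and not touch the type of A
afterwards:

  call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
  call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
  call DMCreateMatrix(daDummy,A,ierr)
  ! A already has the ViennaCL type and the DMDA stencil preallocation;
  ! no MatSetType or MatMPIAIJSetPreallocation calls are needed here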
>>>>
>>>>    Matt
>>>>
>>>>
>>>>> Any help will be appreciated,
>>>>>
>>>>> Thanks,
>>>>>
>>>>>
>>>>>
>>>>> My program call:
>>>>>
>>>>> mpirun -n 2 ./gcmLEP.GPU tc=TestCases/LockRelease/LE_6x6x6/
>>>>> jid=tiny_cuda_test_n2 -ksp_type cg -dm_vec_type viennacl -dm_mat_type
>>>>> aijviennacl -pc_type saviennacl -log_view
>>>>>
>>>>>
>>>>> The error (repeats after each MatSetValues call):
>>>>>
>>>>> [1]PETSC ERROR: --------------------- Error Message
>>>>> --------------------------------------------------------------
>>>>> [1]PETSC ERROR: Argument out of range
>>>>> [1]PETSC ERROR: Inserting a new nonzero at global row/column (75, 50)
>>>>> into matrix
>>>>> [1]PETSC ERROR: See
>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>>>>> shooting.
>>>>> [1]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53
>>>>> GIT Date: 2018-05-31 17:31:13 +0300
>>>>> [1]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera
>>>>> Wed Aug 15 13:10:44 2018
>>>>> [1]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug
>>>>> --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2
>>>>> --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1
>>>>> --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
>>>>> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 608 in
>>>>> /home/valera/petsc/src/mat/impls/aij/mpi/mpiaij.c
>>>>> [1]PETSC ERROR: #2 MatSetValues() line 1339 in
>>>>> /home/valera/petsc/src/mat/interface/matrix.c
>>>>>
>>>>>
>>>>> My Code structure:
>>>>>
>>>>> call DMCreateMatrix(daDummy,A,ierr)
>>>>>> call MatSetFromOptions(A,ierr)
>>>>>> call MPI_Comm_size(PETSC_COMM_WORLD, numprocs, ierr)
>>>>>> if (numprocs > 1) then  ! set matrix type parallel
>>>>>>     ! Get local size
>>>>>>     call DMDACreateNaturalVector(daDummy,Tmpnat,ierr)
>>>>>>     call VecGetLocalSize(Tmpnat,locsize,ierr)
>>>>>>     call VecDestroy(Tmpnat,ierr)
>>>>>>     ! Set matrix
>>>>>> #ifdef GPU
>>>>>>     call MatSetType(A,MATAIJVIENNACL,ierr)
>>>>>>     call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
>>>>>>     call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
>>>>>>     print*,'SETTING GPU TYPES'
>>>>>> #else
>>>>>>     call DMSetMatType(daDummy,MATMPIAIJ,ierr)
>>>>>>     call DMSetVecType(daDummy,VECMPI,ierr)
>>>>>>     call MatSetType(A,MATMPIAIJ,ierr)
>>>>>> #endif
>>>>>>     call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
>>>>>> else                    ! set matrix type sequential
>>>>>> #ifdef GPU
>>>>>>     call DMSetMatType(daDummy,MATSEQAIJVIENNACL,ierr)
>>>>>>     call DMSetVecType(daDummy,VECSEQVIENNACL,ierr)
>>>>>>     call MatSetType(A,MATSEQAIJVIENNACL,ierr)
>>>>>>     print*,'SETTING GPU TYPES'
>>>>>> #else
>>>>>>     call DMSetMatType(daDummy,MATSEQAIJ,ierr)
>>>>>>     call DMSetVecType(daDummy,VECSEQ,ierr)
>>>>>>     call MatSetType(A,MATSEQAIJ,ierr)
>>>>>> #endif
>>>>>> end if
>>>>>> call MatSetUp(A,ierr)
>>>>>> call getCenterInfo(daGrid,xstart,ystart,zstart,xend,yend,zend)
>>>>>>
>>>>>
>>>>>
>>>>>> do k=zstart,zend-1
>>>>>>     do j=ystart,yend-1
>>>>>>         do i=xstart,xend-1
>>>>>> [..]
>>>>>>            call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
>>>>>> [..]
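
As a diagnostic (not a fix), MatSetOption can tell PETSc to allocate new
nonzeros on the fly instead of raising this error; it is slow, but if the
run then completes it confirms the problem is the preallocation rather than
the values themselves:

  ! after the matrix type and preallocation are set, before the loop
  call MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE,ierr)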
>>>>>
>>>>
>>>
>>>
>>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/