[petsc-users] MatSetValues error with ViennaCL types

Matthew Knepley knepley at gmail.com
Wed Aug 15 19:30:04 CDT 2018


On Wed, Aug 15, 2018 at 5:45 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:

> OK, thanks for clarifying that; I wasn't sure if there were different types.
>
> Here is a stripped-down version of my code. It seems the preallocation
> is working now, since the matrix population part runs without problems,
> but here it is for illustration purposes:
>

Good.


> call DMSetMatrixPreallocateOnly(daDummy,PETSC_TRUE,ierr)
>> call DMCreateMatrix(daDummy,A,ierr)
>> call MatSetFromOptions(A,ierr)
>> call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
>> call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
>>
>
The two statements above should be unnecessary with the command-line
options you have.
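
For illustration, a minimal sketch of what I mean, assuming daDummy has
already been created and DMSetFromOptions() is called on it so that the
-dm_mat_type/-dm_vec_type options are honored:

    call DMSetFromOptions(daDummy,ierr)  ! picks up -dm_mat_type/-dm_vec_type
    call DMCreateMatrix(daDummy,A,ierr)  ! A comes back with the requested type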


>> call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
>>
>
This should not be necessary since the matrix you get from DMCreateMatrix()
should be preallocated.
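
For example, a minimal sketch of the intended sequence, assuming daDummy
has already been created and set up (the type names here are the generic
ViennaCL ones):

    call DMSetMatType(daDummy,MATAIJVIENNACL,ierr)
    call DMSetVecType(daDummy,VECVIENNACL,ierr)
    call DMCreateMatrix(daDummy,A,ierr)
    ! A is typed and preallocated; go straight to MatSetValues(),
    ! then MatAssemblyBegin()/MatAssemblyEnd()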


> call MatSetUp(A,ierr)
>> [...]
>>             call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
>> [...]
>> call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>> call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>
>
> Adding the first line there did the trick.
>
> Now the problem seems to be that the program does not recognize the
> matrix as a ViennaCL type when I run with more than one processor; I
> now get:
>

This is a bad error message, since it does not tell us the type of the
matrix. You could look in the debugger to see what type the matrix
actually is. I would get rid of all the extraneous statements above so
we can figure out what is happening.
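
If attaching the debugger is inconvenient, a quick alternative is to
print the matrix type on each rank just before the solve; a minimal
sketch (mtype is an illustrative local variable):

    character(len=80) :: mtype   ! declare with your other locals
    call MatGetType(A,mtype,ierr)
    print*,'matrix type: ',mtype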

   Matt


> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Currently only handles ViennaCL matrices
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53  GIT
> Date: 2018-05-31 17:31:13 +0300
> [0]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera Wed
> Aug 15 14:44:22 2018
> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug
> --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2
> --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1
> --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
> [0]PETSC ERROR: #1 PCSetUp_SAVIENNACL() line 47 in
> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
> [0]PETSC ERROR: #2 PCSetUp() line 932 in
> /home/valera/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #3 KSPSetUp() line 381 in
> /home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>
> When running with:
>
> mpirun -n 1 ./gcmLEP.GPU tc=TestCases/LockRelease/LE_6x6x6/
> jid=tiny_cuda_test_n2 -ksp_type cg -dm_vec_type viennacl -dm_mat_type
> aijviennacl -pc_type saviennacl -log_view
>
>
> Thanks,
>
> On Wed, Aug 15, 2018 at 2:32 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Wed, Aug 15, 2018 at 5:20 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:
>>
>>> It boils down to this: I do not know how to preallocate a DM matrix
>>> correctly.
>>>
>>
>> There is only one matrix type, Mat. There are no separate DM matrices. A
>> DM can create a matrix for you
>> using DMCreateMatrix(), but that is a Mat and it is preallocated
>> correctly. I am not sure what you are doing.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> The interesting part is that it only breaks when I need to populate a
>>> GPU matrix from MPI, so kudos on that, but it seems I need to do
>>> better in my code to get this setup working.
>>>
>>> Any help would be appreciated,
>>>
>>> Thanks,
>>>
>>>
>>>
>>> On Wed, Aug 15, 2018 at 2:15 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Wed, Aug 15, 2018 at 4:53 PM Manuel Valera <mvalera-w at sdsu.edu>
>>>> wrote:
>>>>
>>>>> Thanks Matthew,
>>>>>
>>>>> I try to do that when calling:
>>>>>
>>>>> call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
>>>>>
>>>>> But I am not aware of how to do this for the DM, if it needs
>>>>> something more specific or different.
>>>>>
>>>>
>>>> The error says that your preallocation is wrong for the values you
>>>> are putting in. The DM controls neither of those, so I do not
>>>> understand your email.
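>>>>
>>>> As a temporary diagnostic (not a fix), a minimal sketch: you can turn
>>>> off the error on unpreallocated entries and see how far assembly gets,
>>>>
>>>>     call MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE,ierr)
>>>>
>>>> but the real fix is to get the preallocation right.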
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> Thanks,
>>>>>
>>>>> On Wed, Aug 15, 2018 at 1:51 PM, Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Wed, Aug 15, 2018 at 4:39 PM Manuel Valera <mvalera-w at sdsu.edu>
>>>>>> wrote:
>>>>>>
>>>>>>> Hello PETSc devs,
>>>>>>>
>>>>>>> I am running into an error when trying to use the MATMPIAIJVIENNACL
>>>>>>> matrix type in MPI calls; the same code runs with the
>>>>>>> MATSEQAIJVIENNACL type on one processor. The error happens when
>>>>>>> calling MatSetValues for this specific configuration. It does not
>>>>>>> occur when using the plain MPI DM matrix types.
>>>>>>>
>>>>>>
>>>>>> The DM properly preallocates the matrix. I am assuming you do not
>>>>>> do that here.
>>>>>>
>>>>>>    Matt
>>>>>>
>>>>>>
>>>>>>> Any help will be appreciated,
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> My program call:
>>>>>>>
>>>>>>> mpirun -n 2 ./gcmLEP.GPU tc=TestCases/LockRelease/LE_6x6x6/
>>>>>>> jid=tiny_cuda_test_n2 -ksp_type cg -dm_vec_type viennacl -dm_mat_type
>>>>>>> aijviennacl -pc_type saviennacl -log_view
>>>>>>>
>>>>>>>
>>>>>>> The error (repeats after each MatSetValues call):
>>>>>>>
>>>>>>> [1]PETSC ERROR: --------------------- Error Message
>>>>>>> --------------------------------------------------------------
>>>>>>> [1]PETSC ERROR: Argument out of range
>>>>>>> [1]PETSC ERROR: Inserting a new nonzero at global row/column (75,
>>>>>>> 50) into matrix
>>>>>>> [1]PETSC ERROR: See
>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>>>>>>> shooting.
>>>>>>> [1]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53
>>>>>>> GIT Date: 2018-05-31 17:31:13 +0300
>>>>>>> [1]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera
>>>>>>> Wed Aug 15 13:10:44 2018
>>>>>>> [1]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug
>>>>>>> --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2
>>>>>>> --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1
>>>>>>> --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
>>>>>>> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 608 in
>>>>>>> /home/valera/petsc/src/mat/impls/aij/mpi/mpiaij.c
>>>>>>> [1]PETSC ERROR: #2 MatSetValues() line 1339 in
>>>>>>> /home/valera/petsc/src/mat/interface/matrix.c
>>>>>>>
>>>>>>>
>>>>>>> My Code structure:
>>>>>>>
>>>>>>> call DMCreateMatrix(daDummy,A,ierr)
>>>>>>>> call MatSetFromOptions(A,ierr)
>>>>>>>> call MPI_Comm_size(PETSC_COMM_WORLD, numprocs, ierr)
>>>>>>>> if (numprocs > 1) then  ! set matrix type parallel
>>>>>>>>     ! Get local size
>>>>>>>>     call DMDACreateNaturalVector(daDummy,Tmpnat,ierr)
>>>>>>>>     call VecGetLocalSize(Tmpnat,locsize,ierr)
>>>>>>>>     call VecDestroy(Tmpnat,ierr)
>>>>>>>>     ! Set matrix
>>>>>>>> #ifdef GPU
>>>>>>>>     call MatSetType(A,MATAIJVIENNACL,ierr)
>>>>>>>>     call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
>>>>>>>>     call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
>>>>>>>>     print*,'SETTING GPU TYPES'
>>>>>>>> #else
>>>>>>>>     call DMSetMatType(daDummy,MATMPIAIJ,ierr)
>>>>>>>>     call DMSetVecType(daDummy,VECMPI,ierr)
>>>>>>>>     call MatSetType(A,MATMPIAIJ,ierr)
>>>>>>>> #endif
>>>>>>>>     call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
>>>>>>>> else                    ! set matrix type sequential
>>>>>>>> #ifdef GPU
>>>>>>>>     call DMSetMatType(daDummy,MATSEQAIJVIENNACL,ierr)
>>>>>>>>     call DMSetVecType(daDummy,VECSEQVIENNACL,ierr)
>>>>>>>>     call MatSetType(A,MATSEQAIJVIENNACL,ierr)
>>>>>>>>     print*,'SETTING GPU TYPES'
>>>>>>>> #else
>>>>>>>>     call DMSetMatType(daDummy,MATSEQAIJ,ierr)
>>>>>>>>     call DMSetVecType(daDummy,VECSEQ,ierr)
>>>>>>>>     call MatSetType(A,MATSEQAIJ,ierr)
>>>>>>>> #endif
>>>>>>>> end if
>>>>>>>> call MatSetUp(A,ierr)
>>>>>>>> call getCenterInfo(daGrid,xstart,ystart,zstart,xend,yend,zend)
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>> do k=zstart,zend-1
>>>>>>>>     do j=ystart,yend-1
>>>>>>>>         do i=xstart,xend-1
>>>>>>>> [..]
>>>>>>>>            call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
>>>>>>>> [..]
>>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their
>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>> experiments lead.
>>>>>> -- Norbert Wiener
>>>>>>
>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>> <http://www.caam.rice.edu/~mk51/>
>>>>>>
>>>>>
>>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>
>>>>
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>
>>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>