[petsc-users] MatSetValues error with ViennaCL types

Manuel Valera mvalera-w at sdsu.edu
Tue Aug 28 18:21:28 CDT 2018


Ok, I found the culprit and we can close this thread.

The problem was a missing variable for setting the maximum number of
columns, which I deleted at some point without realizing it. The error
message was too ambiguous to catch this, so I had to compare the
arguments of MatSetValues against a previous working version; the
mistake was evident then.
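
For anyone who finds this thread later, here is a minimal sketch of the
call that bit me (variable names are illustrative, not my actual code);
the fourth argument, the number of columns, is the one I had accidentally
deleted:

    Mat            A
    PetscInt       row(1), cols(3)
    PetscScalar    vals(3)
    PetscErrorCode ierr

    row(1) = 0
    cols = (/ 0, 1, 2 /)
    vals = (/ 1.0, 2.0, 3.0 /)
    ! MatSetValues(Mat, nrows, rows, ncols, cols, values, mode, ierr)
    call MatSetValues(A, 1, row, 3, cols, vals, INSERT_VALUES, ierr)

Dropping the ncols argument shifts every argument after it, so the column
indices get read from the wrong place, which would explain the misleading
'Column too large' values.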

The good news is that I can now set the values with the ViennaCL types too.

Thanks for your kind help,

Manuel

On Tue, Aug 28, 2018 at 11:25 AM, Smith, Barry F. <bsmith at mcs.anl.gov>
wrote:

>
>    1) PetscMalloc() is never valid or needed in Fortran
>
>     2) there is no reason to use DMSetMatrixPreallocateOnly(); just use
> DMCreateMatrix(), assuming that using a DM (DMDA, DMPLEX, etc.) is
> suitable for your problem.
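>
>     For instance (a minimal sketch, not your code; the array name is
> illustrative): the Fortran replacement for a PetscMalloc() is a plain
> allocate, and the DM hands back a matrix that is already of the correct
> type and preallocated:
>
>     PetscInt, allocatable :: cols(:)
>     allocate(cols(19))                  ! instead of PetscMalloc()
>     call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
>     call DMCreateMatrix(daDummy,A,ierr) ! correct type, preallocated
>     deallocate(cols)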
>
>     At this end we are totally guessing at what you are doing, so there
> is little help we can provide. A simple, non-working code that tries to
> do what you need would help us a great deal in understanding what you
> are trying to do.
>
>      Barry
>
>
>
>
>
> > On Aug 28, 2018, at 1:18 PM, Manuel Valera <mvalera-w at sdsu.edu> wrote:
> >
> > Matthew, PetscMalloc gives the same error,
> >
> > Barry, it would be very hard for me to reduce the code to a minimal
> working example. I guess all I need to understand is how to set up a DM
> matrix with DMSetMatrixPreallocateOnly() instead of
> MatMPIAIJSetPreallocation() as we were doing before; is there a simple
> example that does this in Fortran?
> >
> > Is the PetscMalloc call needed? Is 'call PetscMalloc(1,row,ierr)' a
> valid, compilable call to PetscMalloc? What other reason could there be
> for this error?
> >
> > Just as a reminder: trying to set up the matrix with
> MatAIJSetPreallocation() raises an error about the ViennaCL datatypes,
> and that is why I am trying to make this change on your recommendation.
> >
> > Thanks for your help,
> >
> >
> >
> >
> >
> > On Mon, Aug 27, 2018 at 7:35 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> >
> >    Send your code in a way we can compile and run it; it must be some
> simple issue that is hard to communicate in email.
> >
> >    Barry
> >
> >
> > > On Aug 27, 2018, at 5:51 PM, Manuel Valera <mvalera-w at sdsu.edu> wrote:
> > >
> > > Hello everyone,
> > >
> > > I just had time to work on this again and checked the code for errors
> in the matrix entries. This is the exact code I was using to create the
> matrix without DMSetMatrixPreallocateOnly, using
> MatMPIAIJSetPreallocation, and it worked that way; but trying it this way
> I get the same 'Column too large' error for any number at the column
> position of MatSetValues.
> > >
> > > I have set up my code to print the column argument (n) of
> MatSetValues, and in this case it is 7 (lower than 124); it still gives
> the error. Even entering a specific number in the MatSetValues column
> argument position gives the same error.
> > >
> > > So next I went back to ex50 here:
> http://www.mcs.anl.gov/petsc/petsc-current/src/ts/examples/tutorials/ex50.c.html
> and it has a very similar structure except for the PetscMalloc1() call,
> so I tried adding that and got:
> > >
> > >  /home/valera/ParGCCOM/Src/DMDALaplacian.f90:114: undefined reference to `petscmalloc1_'
> > >
> > > Any ideas on this behaviour?
> > >
> > > Thanks so much,
> > >
> > >
> > >
> > >
> > >
> > >
> > > On Thu, Aug 16, 2018 at 11:20 AM, Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> > >
> > > Column too large: col 10980 max 124
> > >
> > >    You need to check the code that is generating the matrix entries.
> The matrix has 124 columns but you are attempting to put a value at
> column 10980.
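> > >
> > >    A quick way to catch this (a sketch, not from your code; it reuses
> the pos/iter names from your MatSetValues call) is to check the indices
> against the global size before inserting:
> > >
> > >    PetscInt Mrows, Ncols
> > >    call MatGetSize(A, Mrows, Ncols, ierr)
> > >    if (any(pos(0:iter-1) < 0) .or. any(pos(0:iter-1) >= Ncols)) then
> > >       print *, 'bad column index; max allowed is ', Ncols - 1
> > >    end if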
> > >
> > >    Barry
> > >
> > >
> > > > On Aug 15, 2018, at 9:44 PM, Manuel Valera <mvalera-w at sdsu.edu>
> wrote:
> > > >
> > > > Thanks Matthew and Barry,
> > > >
> > > > Now my code looks like:
> > > >
> > > > call DMSetMatrixPreallocateOnly(daDummy,PETSC_TRUE,ierr)
> > > > call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
> > > > call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
> > > > call DMCreateMatrix(daDummy,A,ierr)
> > > > call MatSetFromOptions(A,ierr)
> > > > call MatSetUp(A,ierr)
> > > > [...]
> > > >             call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
> > > > [...]
> > > > call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
> > > > call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
> > > >
> > > > And I get a different error; now it is:
> > > >
> > > > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > [0]PETSC ERROR: Argument out of range
> > > > [0]PETSC ERROR: Column too large: col 10980 max 124
> > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53  GIT Date: 2018-05-31 17:31:13 +0300
> > > > [0]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera Wed Aug 15 19:40:00 2018
> > > > [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
> > > > [0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 442 in /home/valera/petsc/src/mat/impls/aij/seq/aij.c
> > > > [0]PETSC ERROR: #2 MatSetValues() line 1339 in /home/valera/petsc/src/mat/interface/matrix.c
> > > >
> > > >
> > > > Thanks again,
> > > >
> > > >
> > > >
> > > >
> > > >
> > > >
> > > >
> > > >
> > > > On Wed, Aug 15, 2018 at 7:02 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> > > >
> > > >   Should be
> > > >
> > > > call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
> > > > call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
> > > > call DMCreateMatrix(daDummy,A,ierr)
> > > >
> > > >   and remove the rest. You need to set the type of Mat you want the
> DM to return BEFORE you create the matrix.
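> > > >
> > > >   In other words (the same three calls, with comments added for
> clarity; this is a sketch, not code from the thread):
> > > >
> > > > ! 1) tell the DM what kind of Mat and Vec it should hand back
> > > > call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
> > > > call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
> > > > ! 2) only now create the matrix; it comes back with the requested
> > > > !    type and is already correctly preallocated
> > > > call DMCreateMatrix(daDummy,A,ierr)
> > > > ! no MatSetType(), MatMPIAIJSetPreallocation(), or MatSetUp() needed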
> > > >
> > > >   Barry
> > > >
> > > >
> > > >
> > > > > On Aug 15, 2018, at 4:45 PM, Manuel Valera <mvalera-w at sdsu.edu>
> wrote:
> > > > >
> > > > > Ok, thanks for clarifying that; I wasn't sure if there were
> different types.
> > > > >
> > > > > Here is a stripped-down version of my code. It seems like the
> preallocation is working now, since the matrix population part is
> working without problems, but here it is for illustration purposes:
> > > > >
> > > > > call DMSetMatrixPreallocateOnly(daDummy,PETSC_TRUE,ierr)
> > > > > call DMCreateMatrix(daDummy,A,ierr)
> > > > > call MatSetFromOptions(A,ierr)
> > > > > call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
> > > > > call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
> > > > > call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
> > > > > call MatSetUp(A,ierr)
> > > > > [...]
> > > > >             call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
> > > > > [...]
> > > > > call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
> > > > > call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
> > > > >
> > > > > Adding the first line there did the trick,
> > > > >
> > > > > Now the problem seems to be that the program is not recognizing
> the matrix as a ViennaCL type when I try with more than one processor; I
> now get:
> > > > >
> > > > > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > > [0]PETSC ERROR: No support for this operation for this object type
> > > > > [0]PETSC ERROR: Currently only handles ViennaCL matrices
> > > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [0]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53  GIT Date: 2018-05-31 17:31:13 +0300
> > > > > [0]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera Wed Aug 15 14:44:22 2018
> > > > > [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
> > > > > [0]PETSC ERROR: #1 PCSetUp_SAVIENNACL() line 47 in /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
> > > > > [0]PETSC ERROR: #2 PCSetUp() line 932 in /home/valera/petsc/src/ksp/pc/interface/precon.c
> > > > > [0]PETSC ERROR: #3 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> > > > >
> > > > > When running with:
> > > > >
> > > > > mpirun -n 1 ./gcmLEP.GPU tc=TestCases/LockRelease/LE_6x6x6/ jid=tiny_cuda_test_n2 -ksp_type cg -dm_vec_type viennacl -dm_mat_type aijviennacl -pc_type saviennacl -log_view
> > > > >
> > > > >
> > > > > Thanks,
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > On Wed, Aug 15, 2018 at 2:32 PM, Matthew Knepley <
> knepley at gmail.com> wrote:
> > > > > On Wed, Aug 15, 2018 at 5:20 PM Manuel Valera <mvalera-w at sdsu.edu>
> wrote:
> > > > > It can be summed up as: I do not know how to preallocate a DM
> matrix correctly.
> > > > >
> > > > > There is only one matrix type, Mat. There are no separate DM
> matrices. A DM can create a matrix for you
> > > > > using DMCreateMatrix(), but that is a Mat and it is preallocated
> correctly. I am not sure what you are doing.
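> > > > >
> > > > > For example (a minimal sketch, not your code; 'mtype' is an
> illustrative name), the usual pattern is to let the options database pick
> the types and then ask the DM for the matrix:
> > > > >
> > > > > MatType mtype
> > > > > call DMSetFromOptions(daDummy,ierr)  ! honors -dm_mat_type / -dm_vec_type
> > > > > call DMCreateMatrix(daDummy,A,ierr)  ! preallocated Mat of that type
> > > > > call MatGetType(A,mtype,ierr)        ! sanity check the resulting type
> > > > > print *, 'matrix type: ', mtype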
> > > > >
> > > > >   Thanks,
> > > > >
> > > > >     Matt
> > > > >
> > > > > The interesting part is that it only breaks when I need to
> populate a GPU matrix from MPI, so kudos on that, but it seems I need to
> do better on my code to get this setup working.
> > > > >
> > > > > Any help would be appreciated,
> > > > >
> > > > > Thanks,
> > > > >
> > > > >
> > > > >
> > > > > On Wed, Aug 15, 2018 at 2:15 PM, Matthew Knepley <
> knepley at gmail.com> wrote:
> > > > > On Wed, Aug 15, 2018 at 4:53 PM Manuel Valera <mvalera-w at sdsu.edu>
> wrote:
> > > > > Thanks Matthew,
> > > > >
> > > > > I try to do that by calling:
> > > > >
> > > > > call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
> > > > >
> > > > > But I am not aware of how to do this for the DM, if it needs
> something more specific or different.
> > > > >
> > > > > The error says that your preallocation is wrong for the values
> you are putting in. The DM does not control either of those, so I do not
> understand your email.
> > > > >
> > > > >   Thanks,
> > > > >
> > > > >      Matt
> > > > >
> > > > > Thanks,
> > > > >
> > > > > On Wed, Aug 15, 2018 at 1:51 PM, Matthew Knepley <
> knepley at gmail.com> wrote:
> > > > > On Wed, Aug 15, 2018 at 4:39 PM Manuel Valera <mvalera-w at sdsu.edu>
> wrote:
> > > > > Hello PETSc devs,
> > > > >
> > > > > I am running into an error when trying to use the
> MATMPIAIJVIENNACL matrix type in MPI calls; the same code runs with the
> MATSEQAIJVIENNACL type on one processor. The error happens when calling
> MatSetValues for this specific configuration. It does not occur when
> using the MPI DM matrix types only.
> > > > >
> > > > > The DM properly preallocates the matrix. I am assuming you do
> not do so here.
> > > > >
> > > > >    Matt
> > > > >
> > > > > Any help will be appreciated,
> > > > >
> > > > > Thanks,
> > > > >
> > > > >
> > > > >
> > > > > My program call:
> > > > >
> > > > > mpirun -n 2 ./gcmLEP.GPU tc=TestCases/LockRelease/LE_6x6x6/
> jid=tiny_cuda_test_n2 -ksp_type cg -dm_vec_type viennacl -dm_mat_type
> aijviennacl -pc_type saviennacl -log_view
> > > > >
> > > > >
> > > > > The error (repeats after each MatSetValues call):
> > > > >
> > > > > [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > > [1]PETSC ERROR: Argument out of range
> > > > > [1]PETSC ERROR: Inserting a new nonzero at global row/column (75, 50) into matrix
> > > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > > [1]PETSC ERROR: Petsc Development GIT revision: v3.9.2-549-g779ab53  GIT Date: 2018-05-31 17:31:13 +0300
> > > > > [1]PETSC ERROR: ./gcmLEP.GPU on a cuda-debug named node50 by valera Wed Aug 15 13:10:44 2018
> > > > > [1]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --with-mpi-dir=/usr/lib64/openmpi --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --with-blaslapack-dir=/usr/lib64 --download-viennacl
> > > > > [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 608 in /home/valera/petsc/src/mat/impls/aij/mpi/mpiaij.c
> > > > > [1]PETSC ERROR: #2 MatSetValues() line 1339 in /home/valera/petsc/src/mat/interface/matrix.c
> > > > >
> > > > >
> > > > > My Code structure:
> > > > >
> > > > > call DMCreateMatrix(daDummy,A,ierr)
> > > > > call MatSetFromOptions(A,ierr)
> > > > > call MPI_Comm_size(PETSC_COMM_WORLD, numprocs, ierr)
> > > > > if (numprocs > 1) then  ! set matrix type parallel
> > > > >     ! Get local size
> > > > >     call DMDACreateNaturalVector(daDummy,Tmpnat,ierr)
> > > > >     call VecGetLocalSize(Tmpnat,locsize,ierr)
> > > > >     call VecDestroy(Tmpnat,ierr)
> > > > >     ! Set matrix
> > > > > #ifdef GPU
> > > > >     call MatSetType(A,MATAIJVIENNACL,ierr)
> > > > >     call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr)
> > > > >     call DMSetVecType(daDummy,VECMPIVIENNACL,ierr)
> > > > >     print*,'SETTING GPU TYPES'
> > > > > #else
> > > > >     call DMSetMatType(daDummy,MATMPIAIJ,ierr)
> > > > >     call DMSetVecType(daDummy,VECMPI,ierr)
> > > > >     call MatSetType(A,MATMPIAIJ,ierr)
> > > > > #endif
> > > > >     call MatMPIAIJSetPreallocation(A,19,PETSC_NULL_INTEGER,19,PETSC_NULL_INTEGER,ierr)
> > > > > else                    ! set matrix type sequential
> > > > > #ifdef GPU
> > > > >     call DMSetMatType(daDummy,MATSEQAIJVIENNACL,ierr)
> > > > >     call DMSetVecType(daDummy,VECSEQVIENNACL,ierr)
> > > > >     call MatSetType(A,MATSEQAIJVIENNACL,ierr)
> > > > >     print*,'SETTING GPU TYPES'
> > > > > #else
> > > > >     call DMSetMatType(daDummy,MATSEQAIJ,ierr)
> > > > >     call DMSetVecType(daDummy,VECSEQ,ierr)
> > > > >     call MatSetType(A,MATSEQAIJ,ierr)
> > > > > #endif
> > > > > end if
> > > > > call MatSetUp(A,ierr)
> > > > > call getCenterInfo(daGrid,xstart,ystart,zstart,xend,yend,zend)
> > > > >
> > > > > do k=zstart,zend-1
> > > > >     do j=ystart,yend-1
> > > > >         do i=xstart,xend-1
> > > > > [..]
> > > > >            call MatSetValues(A,1,row,sumpos,pos(0:iter-1),vals(0:iter-1),INSERT_VALUES,ierr)
> > > > > [..]
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > --
> > > > > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > > > > -- Norbert Wiener
> > > > >
> > > > > https://www.cse.buffalo.edu/~knepley/
> > > > >
> > > >
> > > >
> > >
> > >
> >
> >
>
>