[petsc-users] Set matrix column to vector

Florian Lindner mailinglists at xgm.de
Mon Sep 1 08:48:27 CDT 2014


On 01.09.2014 15:44, Florian Lindner wrote:
> On 01.09.2014 12:45, Matthew Knepley wrote:
>> On Mon, Sep 1, 2014 at 4:10 AM, Florian Lindner <mailinglists at xgm.de> 
>> wrote:
>> 
>>> Hello,
>>> 
>>> I want to set the entire column of an N x M matrix to an N vector.
>>> What is the best way to do that?
>>> 
>>> My first guess would be to VecGetArray and use that array for
>>> MatSetValuesLocal with nrow = VecGetLocalSize. What is the best way
>>> to tell MatSetValuesLocal that I want to set all rows contiguously
>>> (same as passing irow = [0, 1, ..., VecGetLocalSize-1])?
>>> 
>>> Any better way?
>>> 
>> 
>> You are assuming dense storage above, so you can use
>> 
>> 
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatDenseGetArray.html
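
A minimal sketch of what that suggestion could look like, assuming a
MATSEQDENSE matrix whose array is stored column-major with the leading
dimension equal to the local row count (the function name is
illustrative, not from the thread):

    /* Sketch: copy the local entries of v into column `col` of the
       dense matrix A.  Assumes MATSEQDENSE storage, column-major,
       leading dimension equal to the local row count. */
    PetscErrorCode set_column_dense(Mat A, Vec v, PetscInt col)
    {
      PetscErrorCode     ierr;
      PetscScalar       *a;
      const PetscScalar *varr;
      PetscInt           m, n;

      PetscFunctionBeginUser;
      ierr = MatGetLocalSize(A, &m, &n);CHKERRQ(ierr);
      ierr = MatDenseGetArray(A, &a);CHKERRQ(ierr);
      ierr = VecGetArrayRead(v, &varr);CHKERRQ(ierr);
      ierr = PetscMemcpy(a + col*m, varr, m*sizeof(PetscScalar));CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(v, &varr);CHKERRQ(ierr);
      ierr = MatDenseRestoreArray(A, &a);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }
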
> 
> How can you tell that I'm assuming dense storage? My matrix is
> actually dense, but I try to write my code as generically as possible
> (being a very PETSc newbie). I have this code, which crashes at the
> moment:
> 
>     void set_column_vector(Vector v, int col)
>     {
>       PetscErrorCode ierr = 0;
>       const PetscScalar *vec;
>       PetscInt size, mat_rows, mat_cols;
>       VecGetLocalSize(v.vector, &size);
>       cout << "Vector Size = " << size << endl;
> 
>       MatGetSize(matrix, &mat_rows, &mat_cols);
>       cout << "Matrix Rows  = " << mat_rows << " Columns = " << mat_cols << endl;
>       PetscInt irow[size];
>       for (int i = 0; i < size; i++) {
>         irow[i] = i;
>       }
> 
>       ierr = VecGetArrayRead(v.vector, &vec); CHKERRV(ierr);
>       ierr = MatSetValuesLocal(matrix, size-1, irow, 1, &col, vec, INSERT_VALUES); CHKERRV(ierr);

Correction:

ierr = MatSetValuesLocal(matrix, size, irow, 1, &col, vec, INSERT_VALUES);

size-1 was just one of my debugging experiments.


>       ierr = VecRestoreArrayRead(v.vector, &vec); CHKERRV(ierr);
>       ierr = MatAssemblyBegin(matrix, MAT_FINAL_ASSEMBLY); 
> CHKERRV(ierr);
>       ierr = MatAssemblyEnd(matrix, MAT_FINAL_ASSEMBLY); CHKERRV(ierr);
>     }
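
A generic alternative (a sketch, not code from the thread) is to avoid
local indexing altogether and call MatSetValues with global row indices
obtained from MatGetOwnershipRange; this works for any matrix type,
assuming the vector and the matrix have the same row layout:

    /* Sketch (illustrative name): insert the local part of v into global
       column `col` of A using global row indices. */
    PetscErrorCode set_column(Mat A, Vec v, PetscInt col)
    {
      PetscErrorCode     ierr;
      const PetscScalar *varr;
      PetscInt           rstart, rend, nlocal, i, *irow;

      PetscFunctionBeginUser;
      ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
      nlocal = rend - rstart;
      ierr = PetscMalloc1(nlocal, &irow);CHKERRQ(ierr);
      for (i = 0; i < nlocal; i++) irow[i] = rstart + i;   /* global row indices */

      ierr = VecGetArrayRead(v, &varr);CHKERRQ(ierr);
      ierr = MatSetValues(A, nlocal, irow, 1, &col, varr, INSERT_VALUES);CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(v, &varr);CHKERRQ(ierr);
      ierr = PetscFree(irow);CHKERRQ(ierr);

      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }
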
> 
> v.vector is a Vec, matrix is a Mat, and col = 1. It's compiled with
> mpic++ but started without mpirun, just ./a.out. The output is:
> 
> Vector Size = 20
> Matrix Rows  = 20 Columns = 5
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or 
> -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
> to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not 
> available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the 
> function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] MatSetValuesLocal line 1950
> /home/florian/software/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> shooting.
> [0]PETSC ERROR: Petsc Release Version 3.5.1, unknown
> [0]PETSC ERROR: ./a.out on a arch-linux2-c-debug named asaru by
> florian Mon Sep  1 15:37:32 2014
> [0]PETSC ERROR: Configure options --with-c2html=0
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> 
> 
> Any idea what the problem is?
> 
> Thanks!
> Florian
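
One hedged guess at the crash (not confirmed in this thread):
MatSetValuesLocal translates local indices through a local-to-global
mapping that must be attached to the matrix beforehand with
MatSetLocalToGlobalMapping; if no mapping has been set, a failure inside
MatSetValuesLocal would not be surprising. A sketch of attaching an
identity mapping for the sequential case (a fragment with illustrative
variable names; the ISLocalToGlobalMappingCreate call is written with
the block-size argument used by PETSc 3.5):

    /* Fragment (illustrative): give `matrix` an identity local-to-global
       mapping so MatSetValuesLocal can be used with local row indices
       0..m-1 and local column indices 0..n-1 on a single process. */
    ISLocalToGlobalMapping rmap, cmap;
    PetscErrorCode         ierr;
    PetscInt               m, n, i, *ridx, *cidx;

    ierr = MatGetLocalSize(matrix, &m, &n);CHKERRQ(ierr);
    ierr = PetscMalloc1(m, &ridx);CHKERRQ(ierr);
    ierr = PetscMalloc1(n, &cidx);CHKERRQ(ierr);
    for (i = 0; i < m; i++) ridx[i] = i;
    for (i = 0; i < n; i++) cidx[i] = i;
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, m, ridx, PETSC_COPY_VALUES, &rmap);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, n, cidx, PETSC_COPY_VALUES, &cmap);CHKERRQ(ierr);
    ierr = MatSetLocalToGlobalMapping(matrix, rmap, cmap);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingDestroy(&rmap);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingDestroy(&cmap);CHKERRQ(ierr);
    ierr = PetscFree(ridx);CHKERRQ(ierr);
    ierr = PetscFree(cidx);CHKERRQ(ierr);
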

