CSR Sparse Matrix Query
Dr. Timothy Stitt
timothy.stitt at ichec.ie
Mon Apr 23 03:48:47 CDT 2007
Many thanks to all who replied to my original posting. I believe I now have
enough excellent advice to solve the problem. Thanks also to Mads for the
final tip, which could easily have been overlooked.
Thanks all.
Mads Fredrik Skoge Hoel wrote:
> I was just dealing with this a few moments ago. Remember to make sure the
> row and col arrays start at 0 (and end at nnz and n-1 respectively). I
> just made that mistake.
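>
> If the arrays were built 1-based (as is usual in Fortran), a quick shift
> before handing them to PETSc does the trick. A sketch, where ia, ja and
> nnz are placeholder names for the row-pointer array, column-index array
> and number of nonzeros:
>
>    do k=1,n+1
>       ia(k)=ia(k)-1      ! row pointers become 0 .. nnz
>    end do
>    do k=1,nnz
>       ja(k)=ja(k)-1      ! column indices become 0 .. n-1
>    end do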
>
> Regards,
> Mads
>
>
>> You can also use MatCreateSeqAIJWithArrays() and in parallel
>> MatCreateMPIAIJWithArrays(). These are intended for those who already
>> form the matrix in CSR format.
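>>
>> For a serial matrix, a minimal Fortran sketch might look like the
>> following (ia, ja and a are just placeholder names for your own 0-based
>> CSR arrays; check the MatCreateSeqAIJWithArrays man page for the exact
>> calling sequence). Note that PETSc uses the arrays directly rather than
>> copying them, so they must stay allocated while the Mat is in use:
>>
>>    ! n  - number of rows (= number of columns here)
>>    ! ia - row pointers, 0-based, length n+1, ia(1)=0 and ia(n+1)=nnz
>>    ! ja - column indices, 0-based, length nnz
>>    ! a  - nonzero values, length nnz
>>    call MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,n,n,ia,ja,a,A,ierr)
>>
>>    ! ... hand A to the solver (e.g. SLEPc's EPS) ...
>>
>>    call MatDestroy(A,ierr)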
>>
>> Barry
>>
>>
>> On Sat, 21 Apr 2007, Randall Mackie wrote:
>>
>>
>>> Hi Tim,
>>>
>>> There's no need to generate a dense matrix if you're dealing with sparse
>>> matrices. There are many approaches, but if you just want to get your
>>> existing matrix into PETSc, you can use something like the following
>>> (written in Fortran, but almost the same in C):
>>>
>>> Assume the global vector length is np:
>>>
>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,np,b,ierr)
>>> call VecDuplicate(b,xsol,ierr)
>>> call VecGetLocalSize(b,mloc,ierr)
>>> call VecGetOwnershipRange(b,Istart,Iend,ierr)
>>>
>>> do i=Istart+1,Iend
>>>    loc(i)=i-1
>>> end do
>>>
>>>
>>> ALLOCATE (d_nnz(mloc), o_nnz(mloc), STAT=istat)
>>>
>>> Note: the d_nnz and o_nnz arrays are used to set the pre-allocation of
>>> the global matrix, as described in the PETSc manual (basically just
>>> looping through the local rows and counting which elements fall in the
>>> diagonal block (d_nnz) and which in the off-diagonal block (o_nnz)).
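>>>
>>> A rough sketch of that counting step, assuming the locally owned rows
>>> are held in 0-based CSR arrays ia and ja (placeholder names), might be:
>>>
>>>    do i=1,mloc
>>>       d_nnz(i)=0
>>>       o_nnz(i)=0
>>>       do k=ia(i)+1,ia(i+1)   ! entries of local row i (ia is 0-based)
>>>          if (ja(k).ge.Istart .and. ja(k).lt.Iend) then
>>>             d_nnz(i)=d_nnz(i)+1   ! column owned here: diagonal block
>>>          else
>>>             o_nnz(i)=o_nnz(i)+1   ! column owned elsewhere: off-diagonal
>>>          end if
>>>       end do
>>>    end do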
>>>
>>> Then you can create the matrix:
>>>
>>>
>>> call MatCreateMPIAIJ(PETSC_COMM_WORLD,mloc,mloc,np,np,
>>> .      PETSC_NULL_INTEGER, d_nnz, PETSC_NULL_INTEGER,
>>> .      o_nnz,A,ierr)
>>>
>>>
>>> And then I had a subroutine to fill it like this (the critical part is
>>> the check whether that part of the matrix is owned by the current
>>> processor):
>>>
>>> jj=0
>>>
>>> do i=1,l
>>>    do k=2,n
>>>       do j=2,m
>>>
>>>          jj=jj+1
>>>          row = jj-1
>>>
>>>          IF (jj >= Istart+1 .and. jj <= Iend) THEN
>>>
>>>             if(k.ne.2) then
>>>                fact=-rijcy*dx*dy/z(k-1) ! Hxijc
>>>                ic=ic+1
>>>                v(ic)=fact
>>>                col(ic)=ijkhx(i,j,k-1)-1
>>>             end if
>>>
>>>             [snip code]
>>>
>>>             call MatSetValues(A,i1,row,ic,col,v,INSERT_VALUES,
>>>    .             ierr)
>>>
>>>
>>> [and finally at the end you assemble the matrix]
>>>
>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>>
>>>
>>> Since these early days, I've switched over to using Distributed Arrays,
>>> which in my opinion are much better and much easier to use.
>>>
>>> Hope this helps,
>>>
>>> Randy
>>>
>>>
>>> Dr. Timothy Stitt wrote:
>>>
>>>> Hi all,
>>>>
>>>> I have just begun to use PETSc and I have already run into a problem.
>>>> Apologies if this has been dealt with before.
>>>>
>>>> My code currently generates its own non-distributed Hamiltonian sparse
>>>> matrix in CSR format (due to the large size of the dense representation).
>>>> I want to exploit PETSc's (actually SLEPc's) eigensolver routines and
>>>> hence I need to convert my CSR representation into PETSc's CSR
>>>> representation for use by the eigensolver.
>>>>
>>>> From reading the documentation I am under the impression that you
>>>> initially supply the dense matrix to the PETSc matrix routines for
>>>> conversion to PETSc CSR representation. In my case I do not generate the
>>>> dense matrix. I would be grateful if someone could guide me on how to
>>>> create a PETSc sparse matrix from my own in-code generated sparse matrix.
>>>>
>>>> Thanks in advance for any assistance that can be provided.
>>>>
>>>> Regards,
>>>>
>>>> Tim.
>>>>
>>>>
>>>
>>
>
>
>
--
Dr. Timothy Stitt <timothy_dot_stitt_at_ichec.ie>
HPC Application Consultant - ICHEC (www.ichec.ie)
Dublin Institute for Advanced Studies
5 Merrion Square - Dublin 2 - Ireland
+353-1-6621333 (tel) / +353-1-6621477 (fax) / +353-874195427 (mobile)