CSR Sparse Matrix Query

Matthew Knepley knepley at gmail.com
Tue Apr 24 07:09:50 CDT 2007


On 4/24/07, Dr. Timothy Stitt <timothy.stitt at ichec.ie> wrote:
> Hi everyone...again.
>
> Thanks to the great support I have now got my solver working (in
> Fortran) albeit in a sequential non-distributed version. My problem is
> that I planned to substitute the call to MatCreateSeqAIJWithArrays()
> with a call to MatCreateMPIAIJWithArrays() (along with suitable
> modification to the arguments) to obtain a parallel version.
>
> Anyhow, I have come upon a few problems, unfortunately. Firstly, here
> are the code segments I am trying to update (my sparse matrix is square
> with global dimensions [rows,rows]; size is returned by MPI_COMM_SIZE()):
>
> ---------------------------------------------------------------------------------------------------------------
> call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,rows,rows,&
>         csrRowPointers,csrColumnIndices,csrElements,A,ierr)
> ----------------------------------------------------------------------------------------------------------------
>
> TO
>
> ---------------------------------------------------------------------------------------------------------------
> call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,rows/size,rows/size,&
>         rows,rows,csrRowPointers,csrColumnIndices,csrElements,A,ierr)
> ----------------------------------------------------------------------------------------------------------------
>
> 1. Firstly, when I compile the code I keep getting:
>
>  solver.o(.text+0x202): In function `MAIN__':
> : undefined reference to `matcreatempiaijwitharrays_'

There is no Fortran interface for this function (since no one had ever asked).

> 2. Do the arguments to the routine look correct ? I would prefer for
> PETSc to determine as many of the arguments as possible but I wasn't
> sure which ones to supply explicitly or allow PETSc to infer implicitly.

No, I think you are going to have some problems here. This function is intended
for people who already have a parallel CSR matrix constructed. From your
discussion above, this does not appear to be the case. I see several pitfalls:

  a) size may not evenly divide rows

  b) the arrays must contain only the rows local to each process, but use
      global column indices

I would encourage you to follow the steps that Randy outlined, namely create
the matrix (letting PETSc partition the rows), preallocate the storage by
determining the local row lengths, and then set the values with MatSetValues
for each row.

  Thanks,

     Matt

> 3. When constructing the operator matrix, is it o.k. to build it only on
> the node with rank 0, or does it have to be duplicated across all nodes
> before the call to the MatCreateMPIxxxxx() routine? The matrix is very
> large and hence wouldn't be suitable for replication.
>
> I hope this is the last of my worries. Thanks again for your support and
> patience.
>
> Regards,
>
> Tim.
>
>
>
> Satish Balay wrote:
> > On Mon, 23 Apr 2007, Satish Balay wrote:
> >
> >
> >> On Mon, 23 Apr 2007, Dr. Timothy Stitt wrote:
> >>
> >>
> >>> I am using Fortran. Do I have to perform some operation to switch
> >>> from column-major to row-major representation before calling the
> >>> MatCreateSeqAIJWithArrays routine?
> >>>
> >> MatCreateSeqAIJWithArrays expects a 1D array for the values [and
> >> row and column indices as well] - so this issue of row-major vs
> >> column-major does not exist.
> >>
> >
> > However, the CSR format used for the input is equivalent to a
> > row-major ordering, i.e. for the following matrix:
> >
> > 1 0 0
> > 2 0 3
> > 0 4 5
> >
> > The call expects the following
> >
> > i =  {0,1,3,5}  [size = nrow+1  = 3+1]
> > j =  {0,0,2,1,2}  [size = nz = 5]
> > v =  {1,2,3,4,5}  [size = nz = 5]
> >
> > Satish
> >
> >
>
>
> --
> Dr. Timothy Stitt <timothy_dot_stitt_at_ichec.ie>
> HPC Application Consultant - ICHEC (www.ichec.ie)
>
> Dublin Institute for Advanced Studies
> 5 Merrion Square - Dublin 2 - Ireland
>
> +353-1-6621333 (tel) / +353-1-6621477 (fax) / +353-874195427 (mobile)
>
>


-- 
The government saving money is like me spilling beer. It happens, but
never on purpose.



