leave my rows alone

Matt Funk mafunk at nmsu.edu
Wed Aug 16 11:53:42 CDT 2006


Hi Thomas,

I am not sure if the following is what you are looking for, but I don't have 
PETSc 'redistribute' anything. That is, I tell PETSc exactly how the matrix 
should be distributed across the procs via the following:

 m_ierr = MatCreateMPIAIJ(PETSC_COMM_WORLD, 
                           a_totallocal_numPoints[a_thisproc],    /* m: rows stored on this proc */
                           a_totallocal_numPoints[a_thisproc],    /* n: columns "owned" by this proc */
                           a_totalglobal_numPoints,               /* M: global number of rows */
                           a_totalglobal_numPoints,               /* N: global number of columns */
                           PETSC_NULL,                            /* d_nz: ignored since d_nnz is given */
                           a_NumberOfNZPointsInDiagonalMatrix,    /* d_nnz: per-row nonzeros, diagonal block */
                           PETSC_NULL,                            /* o_nz: ignored since o_nnz is given */
                           a_NumberOfNZPointsInOffDiagonalMatrix, /* o_nnz: per-row nonzeros, off-diagonal block */
                           &m_globalMatrix);

The argument descriptions are found at 
'http://www-unix.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJ.html'

So anyway, PETSc does not touch this matrix in the sense of redistributing 
anything. It is distributed exactly as I want it to be. Hope this helps ...

mat


On Wednesday 16 August 2006 10:37, Thomas Geenen wrote:
> On Wednesday 16 August 2006 18:21, Matthew Knepley wrote:
> > On 8/16/06, Thomas Geenen <geenen at gmail.com> wrote:
> > > dear petsc users,
> > >
> > > is there a way to prevent Petsc during the assembly phase from
> > > redistributing matrix rows over cpu's ?? i like the way the rows are
> > > assigned to the cpu's during the setvalues phase.
> >
> > Actually, the layout of a matrix is fully determined after
> > MatSetSizes(), or equivalently MatCreate***(). We do not redistribute at
> > assembly.
> >
> > setValues() will take values for any row, and send it to the correct
> > process. The
>
> 'send it to the correct process' sounds a lot like redistributing, but that's
> probably a matter of semantics
>
> > matrix layouts we support all have contiguous rows on each proc. You can
> > set the sizes on creation.
>
> pity
>
> >   Does this answer your question?
>
> yep
> thanks
>
> >   Thanks,
> >
> >      Matt
> >
> > > apparently PETSc assigns the first n rows to cpu0, the second n rows to
> > > cpu1, etc. I could of course renumber my matrix, but I would rather
> > > convince PETSc that it should keep the distribution of the matrix rows.
> > >
> > > tia
> > > Thomas




More information about the petsc-users mailing list