[petsc-dev] [petsc-users] Ordering of preallocation and OwnershipRange

Matthew Knepley knepley at gmail.com
Sat Aug 13 23:44:15 CDT 2011


On Sat, Aug 13, 2011 at 10:23 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   The PetscLayout object is used to manage the default layout among
> processes, so you can create a PetscLayout object and use it to determine
> how the Mat and Vec objects will be laid out. From the manual page:
>

Okay, are we recommending that users use PetscLayout now?

   Matt


>    PetscLayoutCreate - Allocates PetscLayout space and sets the map
> contents to the default.
>
>    Collective on MPI_Comm
>
>   Input Parameters:
> +    comm - the MPI communicator
> -    map - pointer to the map
>
>   Level: developer
>
>    Notes: Typical calling sequence
>       PetscLayoutCreate(MPI_Comm,PetscLayout *);
>       PetscLayoutSetBlockSize(PetscLayout,1);
>       PetscLayoutSetSize(PetscLayout,N); or
>       PetscLayoutSetLocalSize(PetscLayout,n);
>       PetscLayoutSetUp(PetscLayout);
>       PetscLayoutGetSize(PetscLayout,PetscInt *); or
>       PetscLayoutGetLocalSize(PetscLayout,PetscInt *);
>       PetscLayoutDestroy(PetscLayout);
>
>      The PetscLayout object and methods are intended to be used in the
>      PETSc Vec and Mat implementations; it is recommended they not be
>      used in user code unless you really gain something from them.
>
>    Fortran Notes:
>      Not available from Fortran
>
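
PetscLayout itself is not exposed through petsc4py, but the default split it
computes is easy to reproduce by hand. The following is a rough sketch only:
it assumes PETSc's PetscSplitOwnership rule (each rank gets N/size rows, and
the first N % size ranks take one extra), which is what PETSc applies when a
size is left as PETSC_DECIDE:

    # Sketch only: mimic PETSc's default ownership split from Python,
    # assuming the PetscSplitOwnership rule used for PETSC_DECIDE sizes:
    #   local_rows = M // size + (1 if M % size > rank else 0)
    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    rank, size = comm.getRank(), comm.getSize()

    M = 4  # global number of rows
    local_rows = M // size + (1 if M % size > rank else 0)
    # the first (M % size) ranks each hold one extra row
    rstart = rank * (M // size) + min(rank, M % size)
    rend = rstart + local_rows  # this rank owns rows rstart..rend-1

With rstart and rend known up front, the d_nnz and o_nnz slices can be built
before the Mat is ever created.
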
>
> On Aug 13, 2011, at 10:28 AM, Matthew Knepley wrote:
>
> > On Sat, Aug 13, 2011 at 2:57 PM, Josh Hykes <jmhykes at ncsu.edu> wrote:
> > Hello,
> >
> > I'm just starting to experiment with PETSc (v3.1), and I like the Python
> bindings provided by petsc4py (v1.1.2). So far things seem fairly
> straightforward, but I'm stumped on a small issue.
> >
> > While creating a parallel AIJ matrix, I'd like to preallocate it using
> arrays d_nnz and o_nnz. As I understand it, these arrays correspond to the
> processor's local rows.
> >
> > Currently I specify the global matrix size, and let PETSc decide on the
> decomposition of the rows. I'd like to ask PETSc what rows each processor
> has with the getOwnershipRange() function, and then do the preallocation.
> However, according to the error message
> >
> > > [1] MatAnyAIJSetPreallocation() line 393 in
> petsc4py-1.1.2/src/include/custom.h
> > > [1] Operation done in wrong order
> > > [1] matrix is already preallocated
> >
> > I'm not allowed to do it in this order.
> >
> > Thus, my question is: is it possible to let PETSc figure out the row
> decomposition while still using d_nnz and o_nnz for the preallocation? I
> figure that I could resolve the problem by doing my own decomposition, but
> it'd be nice if I could leave those details up to PETSc.
> >
> > You are correct. We require that preallocation be done at the same time
> as decomposition. There
> > are tricky dependencies in matrix creation. However, an easy workaround
> is to create a Vec at
> > the same time with the same global size, since it is guaranteed to have
> the same layout. I will look
> > into simplifying this if it is possible.
> >
> >   Thanks,
> >
> >      Matt
> >
> > I'm including an example using petsc4py of what I'd like to do, run with
> 2 MPI processes.
> >
> > I apologize if this is a dumb question. Thank you for your help.
> >
> > -Josh
> >
> > # -----------------------------------------------
> > from petsc4py import PETSc as petsc
> >
> > M, N = 4, 6
> >
> > global_d_nnz = [2, 1, 1, 2]
> > global_o_nnz = [1, 3, 2, 1]
> >
> > A = petsc.Mat()
> > A.create(petsc.COMM_WORLD)
> > A.setSizes([M, N])
> > A.setType('aij')
> >
> > i_start, i_end = A.getOwnershipRange()
> >
> > A.setPreallocationNNZ([global_d_nnz[i_start:i_end],
> >                        global_o_nnz[i_start:i_end]]) # error occurs here
> >
> >
> >
> >
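
For concreteness, a minimal petsc4py sketch of the Vec workaround described
above might look like the following. It is untested against 1.1.2, and the
(local, global) tuple form of setSizes() is an assumption about the petsc4py
API rather than something stated in the thread:

    # Sketch of the Vec workaround: a Vec with global size M gets the same
    # default row layout PETSc would give an M-row Mat, so its ownership
    # range can drive the preallocation before the Mat is set up.
    from petsc4py import PETSc as petsc

    M, N = 4, 6
    global_d_nnz = [2, 1, 1, 2]
    global_o_nnz = [1, 3, 2, 1]

    v = petsc.Vec().createMPI(M, comm=petsc.COMM_WORLD)
    i_start, i_end = v.getOwnershipRange()

    A = petsc.Mat()
    A.create(petsc.COMM_WORLD)
    # Fix the local row count explicitly so the Mat adopts the Vec's split
    # (the (local, global) tuple form of setSizes is an assumption here).
    A.setSizes([(i_end - i_start, M), (petsc.DECIDE, N)])
    A.setType('aij')
    A.setPreallocationNNZ([global_d_nnz[i_start:i_end],
                           global_o_nnz[i_start:i_end]])
    v.destroy()

Because the local row count is fixed before the preallocation call, the
d_nnz and o_nnz slices line up with the rows the Mat will actually own.
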
>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener