[petsc-users] local dimensions

Massoud Rezavand msdrezavand at gmail.com
Sun Dec 11 18:21:32 CST 2016


Thanks,

as we already discussed, the matrix comes from an SPH (smoothed particle
hydrodynamics) discretization, which is not fixed on a grid and changes
over time.

On Mon, Dec 12, 2016 at 1:10 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> > On Dec 11, 2016, at 6:04 PM, Massoud Rezavand <msdrezavand at gmail.com> wrote:
> >
> > Thank you very much,
> >
> > So, if I use PetscSplitOwnership() and then MatGetOwnershipRange() to
> > prepare for preallocation, should MatSetSizes(A, local_size, local_size,
> > N, N) be called with the local_size calculated by PetscSplitOwnership()?
>
>    There is some confusion between the two responses: you cannot use
> MatGetOwnershipRange() for preallocation, because the ownership range is
> only available after the matrix layout has been set.
>
> Without preallocation:
>
> > > PetscInt local_size = PETSC_DECIDE;
> > >
> > > MatSetSizes(A, local_size, local_size, N, N);
>
>      MatGetOwnershipRanges(...)
>
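> A minimal sketch of this path (an editorial illustration, not verbatim
> from the thread: it assumes a square matrix of global size N living on
> PETSC_COMM_WORLD):
>
>   Mat            A;
>   PetscErrorCode ierr;
>   PetscInt       begin_row, end_row;
>   ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>   ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
>   ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>   ierr = MatSetUp(A);CHKERRQ(ierr);  /* fixes the layout (default preallocation) */
>   ierr = MatGetOwnershipRange(A, &begin_row, &end_row);CHKERRQ(ierr);
>   /* this process owns rows [begin_row, end_row) */
>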
> With preallocation:
> > >
> > >
> > > 2)
> > >
> > > PetscInt local_size = PETSC_DECIDE;
> > >
> > > PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
> > >
> > > MPI_Scan(&local_size, &end_row, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
> > > begin_row = end_row - local_size;
>
>     MatMPIAIJSetPreallocation(.....).
>
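> A minimal sketch of this path (again an illustration: d_nz and o_nz are
> hypothetical per-row nonzero counts for the diagonal and off-diagonal
> blocks that you would derive from your own discretization):
>
>   Mat            A;
>   PetscErrorCode ierr;
>   PetscInt       local_size = PETSC_DECIDE, begin_row, end_row;
>   ierr = PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);CHKERRQ(ierr);
>   ierr = MPI_Scan(&local_size, &end_row, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);CHKERRQ(ierr);
>   begin_row = end_row - local_size;
>   /* rows [begin_row, end_row) are local: count nonzeros per row here */
>   ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>   ierr = MatSetSizes(A, local_size, local_size, N, N);CHKERRQ(ierr);
>   ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
>   ierr = MatMPIAIJSetPreallocation(A, d_nz, NULL, o_nz, NULL);CHKERRQ(ierr);
>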
>
> But note that normally, if the matrix comes from a discretization on a
> grid, you would not use either approach above: the parallel layout of the
> grid determines the local sizes, and you would not obtain them with
> PetscSplitOwnership() or local_size = PETSC_DECIDE.
>
> Where is your matrix coming from?
>
>    Barry
>
>
>
> >
> > Thanks,
> > Massoud
> >
> >
> > On Mon, Dec 12, 2016 at 12:35 AM, Jed Brown <jed at jedbrown.org> wrote:
> > Massoud Rezavand <msdrezavand at gmail.com> writes:
> >
> > > Dear PETSc team,
> > >
> > > What is the difference between the following two methods for getting
> > > the local dimensions of a square matrix A? If they do the same thing,
> > > which one is recommended? Should I use MPI_Scan after both?
> >
> > I would typically use option 1 because it is fewer calls and
> > automatically uses the correct communicator.  You can use
> > MatGetOwnershipRange() instead of manually using MPI_Scan.
> >
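> > For example (a sketch, assuming the layout of A has already been fixed
> > by MatSetUp() or a preallocation call), the MPI_Scan in option 2 below
> > collapses to a single call:
> >
> >   PetscInt begin_row, end_row;
> >   ierr = MatGetOwnershipRange(A, &begin_row, &end_row);CHKERRQ(ierr);
> >   /* this process owns rows [begin_row, end_row) */
> >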
> > > 1)
> > >
> > > PetscInt local_size = PETSC_DECIDE;
> > >
> > > MatSetSizes(A, local_size, local_size, N, N);
> > >
> > >
> > > 2)
> > >
> > > PetscInt local_size = PETSC_DECIDE;
> > >
> > > PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
> > >
> > > MPI_Scan(&local_size, &end_row, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
> > > begin_row = end_row - local_size;
> > >
> > >
> > > Thanks in advance,
> > > Massoud