[petsc-dev] Using PETSc MatIS, how to matmult a global IS matrix and a global vector ?

Stefano Zampini stefano.zampini at gmail.com
Tue May 23 14:09:16 CDT 2017


On 23 May 2017 at 6:28 PM, "Franck Houssen" <franck.houssen at inria.fr>
wrote:

OK, thanks. This is helpful... But I really think the doc should be more
verbose about this: it is really confusing, and I didn't find any simple
example to begin with, which makes all this even more confusing (personal
opinion).


The man page of MatCreateIS is clear to me:

http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Mat/MatCreateIS.html#MatCreateIS
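
For this thread's two-rank, 3-row case, here is a minimal sketch of the
creation call (assuming PetscInitialize has already been called, the
petsc-3.7 calling sequence where MatCreateIS takes a single local-to-global
mapping, and with error checking omitted):

  /* Each of the 2 ranks owns a 2-dof subdomain; the global problem has 3 rows. */
  ISLocalToGlobalMapping map;
  Mat                    A;
  PetscMPIInt            rank;
  PetscInt               localIdx[2];

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  if (rank == 0) { localIdx[0] = 0; localIdx[1] = 1; } /* local dofs -> global rows 0,1 */
  else           { localIdx[0] = 1; localIdx[1] = 2; } /* local dofs -> global rows 1,2 */
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &map);

  /* m and n are the local sizes of the vectors used in MatMult (they must sum
     to the global size 3); they are NOT the 2x2 subdomain size. PETSC_DECIDE
     lets PETSc split the 3 rows (2 + 1) over the 2 ranks. */
  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, map, &A);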



Franck


------------------------------

*De: *"Matthew Knepley" <knepley at gmail.com>
*À: *"Franck Houssen" <franck.houssen at inria.fr>
*Cc: *"Stefano Zampini" <stefano.zampini at gmail.com>, "PETSc" <
petsc-users at mcs.anl.gov>, "PETSc" <petsc-dev at mcs.anl.gov>
*Envoyé: *Mardi 23 Mai 2017 13:21:21

*Objet: *Re: [petsc-dev] Using PETSc MatIS, how to matmult a global IS
matrix and a global vector ?

On Tue, May 23, 2017 at 4:53 AM, Franck Houssen <franck.houssen at inria.fr>
wrote:

> The first thing I did was to put 3, not 4: I got an error thrown in
> MatCreateIS (see the git diff + stack below). As the error says, I used
> globalSize = numberOfMPIProcesses * localSize: my understanding is that,
> when using MatIS, the global size needs to be the sum of all local sizes.
> Correct?
>

No. MatIS means that the matrix is not assembled. The easiest way (for me)
to think of this is that processes do not have to hold full rows. One
process can hold part of row i, and another process can hold another part.
However, there is still the same number of global rows.
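
In code terms (a sketch; A here is assumed to be the MatIS matrix from the
example below): the global sizes report the assembled dimensions, while
each process holds an unassembled subdomain matrix of its own size:

  Mat      Aloc;
  PetscInt M, N, ml, nl;
  MatGetSize(A, &M, &N);       /* 3 x 3 on every rank: the global dimensions */
  MatISGetLocalMat(A, &Aloc);  /* the unassembled piece held by this process */
  MatGetSize(Aloc, &ml, &nl);  /* 2 x 2 on both ranks: the subdomain size    */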

> I have a 3x3 global matrix made of two overlapping 2x2 local matrices (=
> diagonal with 1.). Each local matrix corresponds to one domain (each domain
> is delegated to one MPI proc, so I have 2 MPI procs because I have 2
> domains).
>

So the global size is 3. The local size here is not the size of the local
IS block, since that is a property only of MatIS. It is the size of the
local piece of the vector you multiply. This allows PETSc to understand the
parallel layout of the Vec, and how it matches the Mat.

This is somewhat confusing because FEM people mean something different by
"local" than we do here, and in fact we use this
other definition of local when assembling operators.
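
A small sketch of that distinction (again assuming A is the MatIS matrix
from the example): MatCreateVecs returns vectors with the parallel layout
MatMult expects, which is not the overlapping subdomain ("FEM local") size:

  Vec      x, y;
  PetscInt nloc;
  MatCreateVecs(A, &x, &y);   /* layouts compatible with MatMult(A, x, y)     */
  VecGetLocalSize(x, &nloc);  /* e.g. 2 on rank 0 and 1 on rank 1 (summing to */
                              /* 3), not the overlapping subdomain size       */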

   Matt


> This is the simplest possible example: I have two 2x2 (local) diagonal
> matrices that overlap, so that the global matrix built from them is 1, 2, 1
> on the diagonal (local contributions add up in the middle).
> I need to MatMult this global matrix with a global vector filled with 1.
>
> Franck
>
> Git diff:
>
> --- a/matISLocalMat.cpp
> +++ b/matISLocalMat.cpp
> @@ -16,7 +16,7 @@ int main(int argc,char **argv) {
>    int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) return 1;
>    int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>
> -  PetscInt localSize = 2, globalSize = localSize*2 /*2 MPI*/;
> +  PetscInt localSize = 2, globalSize = 3;
>    PetscInt localIdx[2] = {0, 0};
>    if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
>    else           {localIdx[0] = 1; localIdx[1] = 2;}
>
>
>
> Stack error:
>
> [0]PETSC ERROR: Nonconforming object sizes
> [0]PETSC ERROR: Sum of local lengths 4 does not equal global length 3, my local length 2
> [0]PETSC ERROR: [0] ISG2LMapApply line 17 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/vec/is/utils/isltog.c
> [0]PETSC ERROR: [0] MatSetValues_IS line 692 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: [0] MatSetValues line 1157 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [0]PETSC ERROR: [0] MatISSetPreallocation_IS line 95 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: [0] MatISSetPreallocation line 80 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: [0] PetscSplitOwnership line 80 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/sys/utils/psplit.c
> [0]PETSC ERROR: [0] PetscLayoutSetUp line 129 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/vec/is/utils/pmap.c
> [0]PETSC ERROR: [0] MatSetLocalToGlobalMapping_IS line 628 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: [0] MatSetLocalToGlobalMapping line 1899 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [0]PETSC ERROR: [0] MatCreateIS line 986 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
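
Putting the advice in this thread together, here is a self-contained sketch
of a corrected program (not the attached matISLocalMat.cpp itself; it
assumes the petsc-3.7 MatCreateIS signature and omits error checking). The
global size is 3, the local row/column sizes are left to PETSC_DECIDE, and
a 2-entry local-to-global mapping describes each overlapping subdomain;
MatMult of the assembled diag(1, 2, 1) with a vector of ones then gives the
global vector [1, 2, 1], split 2 + 1 over the two ranks:

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    PetscInitialize(&argc, &argv, NULL, NULL);

    PetscMPIInt rank, size;
    MPI_Comm_size(PETSC_COMM_WORLD, &size);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    if (size != 2) { PetscFinalize(); return 1; }

    /* Each rank's 2 local dofs map to global rows {0,1} (rank 0) or {1,2}
       (rank 1): the two subdomains overlap at global row 1. */
    PetscInt localIdx[2];
    if (rank == 0) { localIdx[0] = 0; localIdx[1] = 1; }
    else           { localIdx[0] = 1; localIdx[1] = 2; }

    ISLocalToGlobalMapping map;
    ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &map);

    /* Global size 3; PETSC_DECIDE lets PETSc pick the non-overlapping local
       vector sizes (2 and 1), which are not the 2x2 subdomain size. */
    Mat A;
    MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, map, &A);
    ISLocalToGlobalMappingDestroy(&map);
    MatISSetPreallocation(A, 2, NULL, 0, NULL);

    /* Each subdomain contributes a 2x2 identity; the contributions on the
       shared row add up, so the assembled operator is diag(1, 2, 1). */
    PetscInt    rows[2] = {0, 1};
    PetscScalar vals[4] = {1.0, 0.0, 0.0, 1.0};
    MatSetValuesLocal(A, 2, rows, 2, rows, vals, ADD_VALUES);
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    /* Global vectors with the parallel layout MatMult expects. */
    Vec x, y;
    MatCreateVecs(A, &x, &y);
    VecSet(x, 1.0);
    MatMult(A, x, y);
    VecView(y, PETSC_VIEWER_STDOUT_WORLD); /* expected: 1, 2, 1 */

    VecDestroy(&x);
    VecDestroy(&y);
    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }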
>
>
>
> ------------------------------
>
> *De: *"Stefano Zampini" <stefano.zampini at gmail.com>
> *À: *"Matthew Knepley" <knepley at gmail.com>
> *Cc: *"Franck Houssen" <franck.houssen at inria.fr>, "PETSc" <
> petsc-users at mcs.anl.gov>, "PETSc" <petsc-dev at mcs.anl.gov>
> *Envoyé: *Dimanche 21 Mai 2017 23:02:37
> *Objet: *Re: [petsc-dev] Using PETSc MatIS, how to matmult a global IS
> matrix and a global vector ?
>
> Franck,
>
> PETSc takes care of doing the matrix-vector multiplication properly when
> using MatIS. As Matt said, the layout of the vectors is the usual parallel
> layout.
> The local sizes of the MatIS matrix (i.e., the local sizes of the left and
> right vectors used in MatMult) are not the sizes of the local subdomain
> matrices in MatIS.
>
>
> On May 21, 2017, at 6:47 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Sun, May 21, 2017 at 11:26 AM, Franck Houssen <franck.houssen at inria.fr>
> wrote:
>
>> Using PETSc MatIS, how do I MatMult a global IS matrix and a global
>> vector? An example is attached: I don't get what I expect, which is a
>> vector such that proc0 = [1, 2] and proc1 = [2, 1].
>>
>
> 1) I think the global size of your matrix is wrong. You seem to want 3,
> not 4.
>
> 2) Global vectors have a non-overlapping row partition. You might be
> thinking of local vectors.
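
A tiny sketch of point 2 (illustrative only, with N = 3 as in this thread):
a global vector uses a non-overlapping split of the 3 rows, while each
overlapping size-2 subdomain piece would live in a local, sequential vector:

  Vec xglobal, xlocal;
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 3, &xglobal); /* rows split without overlap      */
  VecCreateSeq(PETSC_COMM_SELF, 2, &xlocal);                 /* one subdomain's overlapping piece */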
>
>   Thanks,
>
>     Matt
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> http://www.caam.rice.edu/~mk51/
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

http://www.caam.rice.edu/~mk51/