[petsc-users] MatSetSizes with blocked matrix

Matthew Knepley knepley at gmail.com
Wed Mar 23 19:19:19 CDT 2016


On Wed, Mar 23, 2016 at 7:11 PM, Steena Monteiro <steena.hpc at gmail.com>
wrote:

> Thanks, Barry. The block size compatibility was a probe to investigate an
> error arising when trying to assign unequal numbers of rows across two MPI
> ranks.
>
> To revisit the initial question, what is going wrong when I try to divide
> rows unequally across MPI ranks using MatSetSizes()?
>

We definitely require that the blocksize divide the local size. This looks
like a problem with our checks.
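
A minimal sketch of a layout that satisfies this requirement, assuming the
padded global size of 1139906 that MatLoad reports in the error below (the
400000/739906 split is only illustrative; any local sizes that are multiples
of the block size and sum to the global size would do):

    ierr = MatSetBlockSize(A,2);CHKERRQ(ierr);
    if (!rank) {
      /* 400000 is divisible by bs = 2 */
      ierr = MatSetSizes(A, 400000, PETSC_DECIDE, 1139906, 1139906);CHKERRQ(ierr);
    } else {
      /* 739906 is divisible by bs = 2, and 400000 + 739906 = 1139906 */
      ierr = MatSetSizes(A, 739906, PETSC_DECIDE, 1139906, 1139906);CHKERRQ(ierr);
    }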

   Matt


> For matrix A with rows=cols=1,139,905 and block size = 2,
>
> rank 0 gets 400000 rows and rank 1, 739905 rows
>
> if (!rank) {
>   ierr = MatSetSizes(A, 400000, PETSC_DECIDE, 1139905, 1139905);CHKERRQ(ierr);
> } else {
>   ierr = MatSetSizes(A, 739905, PETSC_DECIDE, 1139905, 1139905);CHKERRQ(ierr);
> }
>
> MatMult(A,x,y);
>
>
> /************************************/
>
> Error message:
>
> [1]PETSC ERROR: [0]PETSC ERROR: No support for this operation for this
> object type
>
> Cannot change/reset row sizes to 400000 local 1139906 global after
> previously setting them to 400000 local 1139905 global
>
> [1]PETSC ERROR: [0]PETSC ERROR: Cannot change/reset row sizes to 739905
> local 1139906 global after previously setting them to 739905 local 1139905
> global
>
>
>
>
>
> On 23 March 2016 at 15:46, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>   The baij and sbaij MatLoad() do automatically pad the matrix with
>> rows/columns of the identity to make it divisible by the block size. This
>> is why you are seeing what you are seeing.
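>>
>>   As a quick check, the padded dimension is just the original one rounded up
>>   to the next multiple of the block size, which is where the 1139906 in the
>>   error message comes from (a small sketch using the sizes from this thread):
>>
>>   PetscInt n = 1139905, bs = 2;
>>   PetscInt padded = ((n + bs - 1) / bs) * bs;   /* = 2 * 569953 = 1139906 */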
>>
>>    Barry
>>
>> Is this a good idea? Maybe not
>>
>>
>> > On Mar 22, 2016, at 8:59 PM, Steena Monteiro <steena.hpc at gmail.com>
>> wrote:
>> >
>> >
>> >
>> >
>> > So, are you saying that
>> >
>> >   1) You have a matrix with odd total dimension
>> >
>> >   2) You set the block size of the initial matrix to 2
>> >
>> >   3) You load the matrix
>> >
>> > and there is no error? Can you make a simple example with a matrix of
>> size 5?
>> > I can put in the relevant error checking.
>> >
>> >
>> >
>> > Hi Matt,
>> >
>> > Thank you for the suggestion.
>> >
>> > I used cage3, a 5x5 matrix from the UFL collection, converted it to
>> > binary, and tested the code for block sizes 1 through 7. I added printfs
>> > inside all the MatMult_SeqBAIJ_*s in baij2.c and also logged some counts
>> > (blocked rows and blocks). The counts make sense if the matrix is being
>> > padded somewhere to accommodate block sizes that do not evenly divide the
>> > matrix dimensions.
>> >
>> > surface86 at monteiro:./rvector-petsctrain-seqbaij -fin cage3.dat -matload_block_size 2
>> > Inside SeqBAIJ_2
>> >
>> > surface86 at monteiro:./rvector-petsctrain-seqbaij -fin cage3.dat -matload_block_size 3
>> > Inside MatMult_SeqBAIJ_3
>> > ...
>> > ...
>> >
>> > surface86 at monteiro:./rvector-petsctrain-seqbaij -fin cage3.dat -matload_block_size 7
>> > Inside MatMult_SeqBAIJ_7
>> >
>> > Here is a table listing, for each block size, the number of blocked rows
>> > and the number of nonzero blocks in each blocked row for cage3.
>> >
>> >
>> > Block size   No. of blocked rows   No. of nnz blocks in each blocked row
>> > 1            5                     5,3,3,4,4
>> > 2            3                     3,3,3
>> > 3            2                     2,2
>> > 4            2                     2,2
>> > 5            1                     1
>> > 6            1                     1
>> > 7            1                     1
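>> >
>> > These counts are consistent with padding: the number of blocked rows above
>> > is ceil(5/bs), i.e. ceil(5/2) = 3, ceil(5/3) = 2, ceil(5/4) = 2, and 1 for
>> > any bs >= 5, which matches the second column.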
>> >
>> > I am attaching cage3.dat and cage3.mtx.
>> >
>> > Thanks,
>> > Steena
>> >
>> >
>> >
>> >
>> >
>> >
>> >   ierr = MatCreateVecs(A, &x, &y);CHKERRQ(ierr);
>> >
>> >
>> >   ierr =  VecSetRandom(x,NULL); CHKERRQ(ierr);
>> >   ierr = VecSet(y,zero); CHKERRQ(ierr);
>> >   ierr = MatMult(A,x,y); CHKERRQ(ierr);
>> >
>> >
>> >   ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
>> >   ierr = MatDestroy(&A);CHKERRQ(ierr);
>> >   ierr = VecDestroy(&x);CHKERRQ(ierr);
>> >   ierr = VecDestroy(&y);CHKERRQ(ierr);
>> >
>> > Thanks,
>> > Steena
>> >
>> >
>> > On 15 March 2016 at 09:15, Matthew Knepley <knepley at gmail.com> wrote:
>> > On Tue, Mar 15, 2016 at 11:04 AM, Steena Monteiro <steena.hpc at gmail.com>
>> wrote:
>> > I pass a binary matrix data file at the command line and load it into
>> > the matrix:
>> >
>> > PetscInitialize(&argc,&args,(char*)0,help);
>> > ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
>> >
>> > /* converted mtx to dat file*/
>> > ierr = PetscOptionsGetString(NULL,"-f",file,PETSC_MAX_PATH_LEN,&flg);CHKERRQ(ierr);
>> >
>> > if (!flg) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_USER,"specify matrix dat file with -f");
>> >
>> > /* Load matrices */
>> > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
>> > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
>> > ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>> > ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>> >
>> > Nothing above loads a matrix. Do you also call MatLoad()?
>> >
>> >   Matt
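>> >
>> > For reference, a minimal sketch of the usual load sequence (the type and
>> > block size here are only placeholders for whatever the real code uses):
>> >
>> > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
>> > ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>> > ierr = MatSetType(A,MATBAIJ);CHKERRQ(ierr);
>> > ierr = MatSetBlockSize(A,2);CHKERRQ(ierr);
>> > ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>> > ierr = MatLoad(A,fd);CHKERRQ(ierr);   /* this call actually reads the matrix */
>> > ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);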
>> >
>> > Thanks,
>> > Steena
>> >
>> > On 15 March 2016 at 08:58, Matthew Knepley <knepley at gmail.com> wrote:
>> > On Tue, Mar 15, 2016 at 10:54 AM, Steena Monteiro <steena.hpc at gmail.com>
>> wrote:
>> > Thank you, Dave.
>> >
>> > Matt: I understand the inconsistency, but MatMult with a non-divisible
>> > block size (here, 2) does not throw any errors or fail when MatSetSizes
>> > is commented out. This implies that a global size of 1139905 does work
>> > with block size 2.
>> >
>> > If you comment out MatSetSizes(), how does it know what size the Mat is?
>> >
>> >    Matt
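>> >
>> > (For context: the PETSc binary file stores the global row and column counts
>> > in its header, so MatLoad() can size the Mat itself when MatSetSizes() has
>> > not been called.)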
>> >
>> > On 15 March 2016 at 00:12, Dave May <dave.mayhem23 at gmail.com> wrote:
>> >
>> > On 15 March 2016 at 04:46, Matthew Knepley <knepley at gmail.com> wrote:
>> > On Mon, Mar 14, 2016 at 10:05 PM, Steena Monteiro <steena.hpc at gmail.com>
>> wrote:
>> > Hello,
>> >
>> > I am having difficulty getting MatSetSizes to work prior to using
>> > MatMult.
>> >
>> > For matrix A with rows=cols=1,139,905 and block size = 2,
>> >
>> > It is inconsistent to have a row/col size that is not divisible by the
>> block size.
>> >
>> >
>> > To be honest, I don't think the error message being thrown clearly
>> indicates what the actual problem is (hence the email from Steena). What
>> about
>> >
>> > "Cannot change/reset row sizes to 400000 local 1139906 global after
>> previously setting them to 400000 local 1139905 global. Local and global
>> sizes must be divisible by the block size"
>> >
>> >
>> >   Matt
>> >
>> > rank 0 gets 400000 rows and rank 1, 739905 rows, like so:
>> >
>> > /*Matrix setup*/
>> >
>> > ierr=PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);
>> > ierr = MatCreate(PETSC_COMM_WORLD,&A);
>> > ierr = MatSetFromOptions(A);
>> > ierr = MatSetType(A,MATBAIJ);
>> > ierr = MatSetBlockSize(A,2);
>> >
>> > /*Unequal row assignment*/
>> >
>> > if (!rank) {
>> >   ierr = MatSetSizes(A, 400000, PETSC_DECIDE, 1139905, 1139905);CHKERRQ(ierr);
>> > } else {
>> >   ierr = MatSetSizes(A, 739905, PETSC_DECIDE, 1139905, 1139905);CHKERRQ(ierr);
>> > }
>> >
>> > MatMult(A,x,y);
>> >
>> > /************************************/
>> >
>> > Error message:
>> >
>> > [1]PETSC ERROR: [0]PETSC ERROR: No support for this operation for this
>> > object type
>> > Cannot change/reset row sizes to 400000 local 1139906 global after
>> previously setting them to 400000 local 1139905 global
>> >
>> > [1]PETSC ERROR: [0]PETSC ERROR: Cannot change/reset row sizes to 739905
>> local 1139906 global after previously setting them to 739905 local 1139905
>> global
>> >
>> > -Without messing with row assignment,  MatMult works fine on this
>> matrix for block size = 2, presumably because an extra padded row is
>> automatically added to facilitate blocking.
>> >
>> > -The above code snippet works well for block size = 1.
>> >
>> > Is it possible to do unequal row distribution while using blocking?
>> >
>> > Thank you for any advice.
>> >
>> > -Steena
>> >
>> > <cage3.dat><cage3.mtx>
>>
>>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener