[petsc-users] MatSetSizes for MATMPIBAIJ

NovA av.nova at gmail.com
Fri Feb 5 18:17:28 CST 2010


2010/2/6 Barry Smith <bsmith at mcs.anl.gov>:
>
> On Feb 5, 2010, at 5:28 PM, NovA wrote:
>
>> Thanks for the quick response!
>>
>> 2010/2/6 Matthew Knepley <knepley at gmail.com>:
>>>
>>> 1) It will distribute block rows, so you will get 8 and 4 as you want
>>
>> How can it distribute rows that way if it doesn't know the block size yet?
>> Let's continue the example:
>>  MatCreate(comm, &A);
>>  MatSetSizes(A, PETSC_DECIDE,12, 12,12);
>
>                                                                ^^^^^^
>                            This is wrong! Look at the manual page for
> MatCreateMPIBAIJ() for information on the meaning of this.

Sorry, but I can't find any additional info on this at
http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatCreateMPIBAIJ.html
or nearby. I would guess that those sizes mean the number of blocks
rather than individual values, but I can't find any confirmation...
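On the other hand, if the sizes are counted in individual rows/columns
(the MatCreateMPIBAIJ page does say m should match the local size of
the y vector in y = Ax, which suggests values, not blocks), then maybe
the problem with my call above is that I passed the global 12 as the
local column size n. A corrected guess would be:

  Mat A;
  MatCreate(comm, &A);
  /* m and n are the LOCAL sizes in individual rows/columns;
     let PETSc pick both, only the global 12x12 is fixed */
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 12, 12);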


>>  MatSetType(A, MATMPIBAIJ);
>>  MatGetOwnershipRange(A, &rowS, &rowE);
>>  Then loop over block rows from rowS/4 to rowE/4 to preallocate
>> storage using MatMPIBAIJSetPreallocation(A,4,...)
>
>   You cannot do it this way. You cannot call MatGetOwnershipRange() before
> setting the preallocation.

Oh, ok. This applies to BAIJ, right? For AIJ this works, so I thought
it would work for BAIJ too.
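So, if I understand correctly, the call order for BAIJ has to be
something like the following (the d_nz/o_nz block counts are just
placeholders for my actual preallocation):

  Mat      A;
  PetscInt rowS, rowE;
  MatCreate(comm, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 12, 12);
  MatSetType(A, MATMPIBAIJ);
  /* preallocation fixes the block size (4) and the layout;
     d_nz/o_nz are per-block-row counts of 4x4 blocks */
  MatMPIBAIJSetPreallocation(A, 4, 3, PETSC_NULL, 1, PETSC_NULL);
  /* only now does the ownership range exist, and it should come
     out as a multiple of the block size */
  MatGetOwnershipRange(A, &rowS, &rowE);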

>
>   You can either,
>
> 1) figure out yourself the local sizes you want to use, then just call
> MatCreateMPIBAIJ() with all the information (or call MatCreate(),
> MatSetType(), MatSetSizes(), MatMPIBAIJSetPreallocation()) or
> 2) Use PetscMap (see manual page for PetscMapInitialize) to see how PETSc
> would decompose the rows and then use that information to do 1).

Thanks for the tip. I'll try to sort it out.
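For the record, here is my attempt at option 1 -- computing
block-aligned local sizes by hand, mimicking what PETSC_DECIDE does
but on blocks instead of rows (nb, locb and loc_rows are my own
variables, and the d_nz/o_nz counts are placeholders again):

  PetscInt    bs = 4, M = 12;
  PetscInt    nb, locb, loc_rows;
  PetscMPIInt rank, size;
  Mat         A;
  MPI_Comm_rank(comm, &rank);
  MPI_Comm_size(comm, &size);
  nb       = M/bs;                                 /* 3 block rows total    */
  locb     = nb/size + ((rank < nb%size) ? 1 : 0); /* blocks on this rank   */
  loc_rows = locb*bs;                              /* 8 on rank 0, 4 on rank 1 */
  MatCreateMPIBAIJ(comm, bs, loc_rows, loc_rows, M, M,
                   3, PETSC_NULL, 1, PETSC_NULL, &A);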

Best wishes,
    Andrey


>
>   Barry
>
> In petsc-dev PetscMap has been changed to PetscLayout, see
> PetscLayoutCreate().
>>
>> What will rowS and rowE be here? I think they might not be divisible
>> by 4 (the block size)...
>>
>>
>>>
>>> 2) Warning: This is an incredibly small problem. Using 2 processors
>>>    might not show any speedup at all.
>>
>> Sure! It's just an example. I expect it will actually be much slower
>> on two processors.
>>
>>
>> Regards!
>>  Andrey
>>
>>>
>>>  Matt
>>>
>>> On Fri, Feb 5, 2010 at 5:03 PM, NovA <av.nova at gmail.com> wrote:
>>>>
>>>> Hi everybody!
>>>>
>>>> I'm looking for the best way to distribute an MPIBAIJ matrix among
>>>> several processors.
>>>>
>>>> For example, I have a square 3x3 matrix of blocks, each block 4x4
>>>> (i.e. 12x12 values), and need to distribute it between 2 processors.
>>>> The generic way of creating such a matrix would be:
>>>>  Mat A;
>>>>  MatCreate(comm, &A);
>>>>  MatSetSizes(A, loc_rows,12, 12,12);
>>>>  MatSetType(A, MATMPIBAIJ);
>>>>
>>>> What is the optimal value for loc_rows here?
>>>>
>>>> Can I use PETSC_DECIDE for it? I suppose this would lead to a split
>>>> of 6 rows per processor, which is not consistent with the block size.
>>>> How will MatMPIBAIJSetPreallocation(A,4,...) deal with that?
>>>>
>>>> Most likely I have to set loc_rows manually to 8 and 4 according to
>>>> the processor rank. But perhaps I'm missing some cleaner way, where
>>>> PETSc really decides. :)
>>>>
>>>> Thanks in advance for any comments.
>>>>
>>>> Regards!
>>>>  Andrey
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which
>>> their experiments lead.
>>> -- Norbert Wiener
>>>
>
>

