[petsc-dev] [petsc-maint #101028] Re: BDDC code

Stefano Zampini stefano.zampini at gmail.com
Tue Jan 10 15:17:36 CST 2012


Today I fixed the bug affecting MULTILEVEL BDDC with more than three
levels. If I have time, tomorrow I will adapt the code to handle local
matrices that have a NearNullSpace object attached. The next step after that
will be to remove the explicit calls to METIS and use the MatPartitioning
routines instead. Jed: can you give me some quick hints on how they behave?
In particular, I would like to know whether I can assemble the adjacency
graph on N procs and then partition it into M parts (with M < N).
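Roughly, the flow I have in mind is the following (an untested sketch; the
helper name and the arguments are placeholders, not existing code):

```c
/* Sketch (untested): partition an adjacency graph assembled on all N ranks
 * of comm into M < N parts via MatPartitioning, instead of calling METIS
 * directly. ia/ja are the usual CSR-style adjacency arrays for the rows
 * owned locally; Nglobal is the global number of vertices. */
#include <petscmat.h>

PetscErrorCode PartitionCoarseGraph(MPI_Comm comm, PetscInt nlocal,
                                    PetscInt Nglobal, PetscInt *ia,
                                    PetscInt *ja, PetscInt M,
                                    IS *partitioning)
{
  Mat             adj;
  MatPartitioning part;
  PetscErrorCode  ierr;

  PetscFunctionBeginUser;
  /* the adjacency lives on the full communicator (N ranks)... */
  ierr = MatCreateMPIAdj(comm, nlocal, Nglobal, ia, ja, NULL, &adj);CHKERRQ(ierr);
  ierr = MatPartitioningCreate(comm, &part);CHKERRQ(ierr);
  ierr = MatPartitioningSetAdjacency(part, adj);CHKERRQ(ierr);
  /* ...but we ask for only M parts, with M < N */
  ierr = MatPartitioningSetNParts(part, M);CHKERRQ(ierr);
  /* e.g. -mat_partitioning_type parmetis from the command line */
  ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
  /* the resulting IS holds one target part index per locally owned row */
  ierr = MatPartitioningApply(part, partitioning);CHKERRQ(ierr);
  ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
  ierr = MatDestroy(&adj);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```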

2012/1/9 Stefano Zampini <stefano.zampini at gmail.com>

>
>
> 2012/1/9 Jed Brown <jedbrown at mcs.anl.gov>
>
>> Cc'ing petsc-dev to get other opinions.
>>
>>
> Great.
>
>
>> On Mon, Jan 9, 2012 at 06:56, Stefano Zampini <stefano.zampini at gmail.com>wrote:
>>
>>> Because of the way I built the code, different coarse problem types set
>>> communicators and operators in a different way. Specifically,
>>>
>>> - MULTILEVEL_BDDC: mat_type=MATIS, coarse_comm=comm of parent,
>>> coarse_pc=PCBDDC.
>>> - PARALLEL_BDDC: mat_type=MATMPIAIJ, coarse_comm=comm of parent,
>>> coarse_pc=PCREDUNDANT.
>>>
>>
>> The only distinction here is the matrix type.
>>
>>
>>> - SEQUENTIAL_BDDC (only rank 0 creates it): mat_type=MATSEQAIJ,
>>> coarse_comm=PETSC_COMM_SELF, coarse_pc=PCLU.
>>>
>>
>> This should be handled through a mechanism like PCREDUNDANT, so that the
>> caller does not need to know. This "reduce to subcommunicator, broadcast
>> the result" behavior would be simple to add to PCREDUNDANT.
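If I understand correctly, the pattern would be something like this (a
plain-MPI sketch of the idea, not the actual PCREDUNDANT internals; the
function and its arguments are hypothetical):

```c
/* Sketch (hypothetical): the "reduce to subcommunicator, broadcast the
 * result" pattern. The coarse problem is solved redundantly on the first
 * nsub ranks only; everyone then receives the solution via MPI_Bcast.
 * solve_coarse stands in for the actual factorization/solve. */
#include <mpi.h>

void reduce_solve_broadcast(MPI_Comm comm, int nsub,
                            double *rhs, double *sol, int n,
                            void (*solve_coarse)(double *, double *, int))
{
  int      rank;
  MPI_Comm subcomm;

  MPI_Comm_rank(comm, &rank);
  /* color 0: participating ranks; MPI_UNDEFINED: everyone else */
  MPI_Comm_split(comm, rank < nsub ? 0 : MPI_UNDEFINED, rank, &subcomm);
  if (subcomm != MPI_COMM_NULL) {
    solve_coarse(rhs, sol, n); /* redundant solve on the subcomm */
    MPI_Comm_free(&subcomm);
  }
  /* broadcast the coarse solution from rank 0 of the parent comm */
  MPI_Bcast(sol, n, MPI_DOUBLE, 0, comm);
}
```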
>>
>>
>>> - REPLICATED_BDDC (all ranks creates the same sequential ksp):
>>> mat_type=MATSEQAIJ, coarse_comm=PETSC_COMM_SELF, coarse_pc=PCLU.
>>>
>>
>> What does this have over using PCREDUNDANT as in case 2?
>>
>
>>
>>>
>>>
>>>>
>>>>
>>>>>  Indeed, a possible user code
>>>>>
>>>>> PCBDDCGetCoarseKSP(pcbddc,&coarse_ksp);
>>>>> KSPGetPC(coarse_ksp,&coarse_pc);
>>>>> PCSetType(coarse_pc,PCBDDC); /* or something else */
>>>>> PCSetUp(pcbddc);
>>>>>
>>>>> breaks at the second line, since pcbddc->coarse_ksp doesn't exist
>>>>> before PCSetUp() is called.
>>>>>
>>>>
>>>> You can make PCBDDCGetCoarseKSP() return a KSP that has not had its
>>>> type set yet.
>>>>
>>>>
>>>
>>> By creating the KSP object internally to the function call? In that case
>>> the problem remains, since the user would still need to know a priori the
>>> type of communicator associated with the KSP. Can we change the
>>> communicator after the KSP object has been created?
>>>
>>
>> No, but I think that the coarse communicator should probably always be
>> the same as for the whole problem, and then use PCREDUNDANT or similar if
>> we want to reduce to a subcomm.
>>
>>
>
> Except for MULTILEVEL_BDDC, all the other approaches solve the coarse
> problem exactly, so there is no difference in the theoretical results; in
> practice they exist as performance optimizations. REPLICATED_BDDC is
> performed directly with allgatherv operations and is a holdover from my
> previous codes; I can drop it. If you can handle the SEQUENTIAL_BDDC case
> directly in PCREDUNDANT, we will be half-way there. The enums for the
> coarse communication type can then be eliminated too, since all the
> communication can be done with VecScatters.
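That is, something along these lines (a sketch only; setup and error
handling omitted, and the vector names are placeholders):

```c
/* Sketch: gather a distributed coarse right-hand side onto every rank with
 * a VecScatter, replacing an explicit allgatherv. VecScatterCreateToZero
 * would be the analogous call for gathering onto rank 0 only. */
#include <petscvec.h>

PetscErrorCode GatherCoarseRhs(Vec crhs_dist, Vec *crhs_seq)
{
  VecScatter     ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* creates both the scatter context and the sequential destination Vec */
  ierr = VecScatterCreateToAll(crhs_dist, &ctx, crhs_seq);CHKERRQ(ierr);
  ierr = VecScatterBegin(ctx, crhs_dist, *crhs_seq, INSERT_VALUES,
                         SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx, crhs_dist, *crhs_seq, INSERT_VALUES,
                       SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```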
>
>
>
>> But this brings up an issue that also appears in GAMG, how to expose user
>> access to coarse KSPs even though they are created on adaptively sized
>> subcomms. If we configure strictly through the options database, then we
>> are okay, but what if we have to pass any information directly to the KSP?
>> We can set operators and call PCSetUp(), then pull out levels, but what
>> should be returned on ranks that are not participating?
>>
>
> The coarse KSP in my multilevel BDDC implementation is created on the same
> comm as its parent; ranks not participating in the coarse comm still create
> all the objects needed by the levels, just with a local dimension of zero.
>
>
>
>
> --
> Stefano
>



-- 
Stefano

