[petsc-users] question about partitioning usage

Matthew Knepley knepley at gmail.com
Wed Jan 25 13:15:47 CST 2012


On Wed, Jan 25, 2012 at 1:08 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> PCASM with multiple subdomains per process.
>
> PCGAMG coarse levels.
>
GAMG makes sense, but ASM has to be activated with an option. I don't think
it makes sense to do this by default.

  Matt
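
[Editor's note for later readers: a minimal sketch of the options being discussed, assuming a recent PETSc. The option names below are assumptions inferred from PCASMSetTotalSubdomains, PCASMSetOverlap, and PCGAMGSetRepartition; check them against your PETSc version's manual pages.]

```
# ASM with several subdomains per process is not the default; it must be
# requested explicitly (option names assumed, see note above):
-pc_type asm -pc_asm_blocks 8 -pc_asm_overlap 1

# GAMG may repartition coarse-level operators; the (assumed) switch:
-pc_type gamg -pc_gamg_repartition true
```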


> On Jan 25, 2012 12:37 PM, "Matthew Knepley" <knepley at gmail.com> wrote:
>
>> On Wed, Jan 25, 2012 at 12:29 PM, Dominik Szczerba <dominik at itis.ethz.ch> wrote:
>>
>>> >> I recently realized that, independently of my explicitly partitioning
>>> >> the input mesh, PETSc also employs partitioning somewhere internally.
>>> >
>>> >
>>> > We don't unless you tell us to.
>>>
>>> Can you please expand? I am partitioning my input unstructured mesh to
>>> distribute dofs across the processes, and then I am setting up my
>>> MPI matrices "as usual", without intending any calls to ParMetis or
>>> other partitioning functions... So where can I tell or not tell PETSc
>>> to use partitioning here?
>>>
>>
>> Unless you create a MatPartitioning object, we do not call ParMetis.
>>
>>    Matt
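
[Editor's note for later readers: the explicit MatPartitioning path Matt refers to looks roughly like the sketch below. This is a hedged sketch, not code from the thread; the function name `partition_example` and the argument `adj` are placeholders, and it assumes a recent PETSc (PetscCall-era error handling). Setting up MPI matrices "as usual" never triggers this.]

```c
#include <petscmat.h>

/* Sketch: PETSc only calls ParMetis (or another partitioner) when you
   create and apply a MatPartitioning object, as below. */
PetscErrorCode partition_example(Mat adj /* adjacency matrix of the mesh */)
{
  MatPartitioning part;
  IS              is; /* resulting rank assignment for each local vertex */

  PetscCall(MatPartitioningCreate(PETSC_COMM_WORLD, &part));
  PetscCall(MatPartitioningSetAdjacency(part, adj));
  PetscCall(MatPartitioningSetFromOptions(part)); /* e.g. -mat_partitioning_type parmetis */
  PetscCall(MatPartitioningApply(part, &is));
  /* ... use 'is' to redistribute dofs ... */
  PetscCall(ISDestroy(&is));
  PetscCall(MatPartitioningDestroy(&part));
  return PETSC_SUCCESS;
}
```

Without such an object, no repartitioning of the user's distribution happens behind the scenes, which is the point being made above.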
>>
>>
>>> Many thanks
>>> Dominik
>>>
>>
>>
>>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener