[petsc4py] What should be the 'default' communicator?

Lisandro Dalcin dalcinl at gmail.com
Mon Aug 25 17:40:35 CDT 2008


On Mon, Aug 25, 2008 at 6:48 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>   This is fine for me, except I vote against the setter/getter. Just let the
> power user access the variable PETSC_COMM_DEFAULT directly.
>

Barry, sorry, I do not completely understand your comments. My concern
here is relevant only to petsc4py, not to core PETSc. With that in
mind, I prefer to "hide" PETSC_COMM_DEFAULT from users and ask them to
call the getter/setter routines. You know: in Python, changing
module-level globals is a bit unsafe; a user could do:

from petsc4py import PETSc
PETSc.COMM_DEFAULT = None # or whatever

and then get a failure later. Furthermore, the setter can be in charge
of all the relevant error checking: the comm has to actually be an
instance of the 'PETSc.Comm' type, and its communicator cannot be
MPI_COMM_NULL.
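
A minimal sketch of what that getter/setter pair could look like (the
function names, and the assumption that 'PETSc.Comm' supports '=='
comparison against PETSc.COMM_NULL, are mine, not a settled API):

from petsc4py import PETSc

_COMM_DEFAULT = PETSc.COMM_WORLD  # module-private, hidden from users

def setCommDefault(comm):
    global _COMM_DEFAULT
    # the comm has to actually be an instance of the 'PETSc.Comm' type
    if not isinstance(comm, PETSc.Comm):
        raise TypeError("expected a 'PETSc.Comm' instance")
    # ... and its communicator cannot be MPI_COMM_NULL
    if comm == PETSc.COMM_NULL:  # assumes Comm supports '==' comparison
        raise ValueError("default communicator cannot be COMM_NULL")
    _COMM_DEFAULT = comm

def getCommDefault():
    return _COMM_DEFAULT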


>
> On Aug 25, 2008, at 4:43 PM, Lisandro Dalcin wrote:
>
>> On Mon, Aug 25, 2008 at 6:22 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>>>
>>> I agree that people will do this, I just don't agree that it should be
>>> the default.
>>
>> Would you agree with the following:
>>
>> At petsc4py initialization (and after calling PetscInitialize()), I
>> define PETSC_COMM_DEFAULT = PETSC_COMM_WORLD. All parallel PETSc
>> objects created through petsc4py use PETSC_COMM_DEFAULT if the
>> communicator is not explicitly passed as an argument. Additionally,
>> I expose in petsc4py a getter/setter enabling users to change the
>> default communicator at ANY TIME. With this approach, the world
>> communicator will be the default, unless changed by the (power) user.
>>
>>
>>
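
For instance, under that scheme a user session could look like this
(exposing the setter as 'PETSc.setCommDefault' is hypothetical,
matching the sketch above, not a settled petsc4py API):

from petsc4py import PETSc

v = PETSc.Vec().create()               # uses the default (world) comm
PETSc.setCommDefault(PETSc.COMM_SELF)  # power user switches the default
w = PETSc.Vec().create()               # now a sequential object
x = PETSc.Vec().create(comm=PETSc.COMM_WORLD)  # explicit comm wins
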
>>>> To be honest, I've never looked too much at paradigms like Condor.
>>>> But using them implies learning yet another framework. Another
>>>> anecdote: a guy sent me a mail with questions about mpi4py for
>>>> solving an embarrassingly parallel problem. I asked why he was
>>>> trying to use such a "heavy weight" approach. He answered that he
>>>> was tired of the complications and poor performance of a Grid-based
>>>> approach, and that running a Python script with some coordinating
>>>> MPI calls under 'mpiexec' was far easier to set up, extend, and
>>>> maintain, and had better overall running times than submitting
>>>> jobs to "The Grid".
>>>
>>> That is my experience with grid software as well. However, in the
>>> particular case of Condor, I disagree. It is fairly easy to set up
>>> and has great features, like fault tolerance and automatic migration
>>> and balancing, that make it much more useful than just MPI.
>>>
>>>  Matt
>>>
>>>>> On Mon, Aug 25, 2008 at 10:54 AM, Lisandro Dalcin <dalcinl at gmail.com>
>>>>> wrote:
>>>>>>
>>>>>> After working hard on mpi4py, this week I'll spend my time
>>>>>> cleaning up and adding features to the new Cython-based petsc4py.
>>>>>> Then I'll be asking questions on this list, requesting advice.
>>>>>>
>>>>>> In all calls that create new PETSc objects, I've decided to make the
>>>>>> 'comm' argument optional. If the communicator is not passed,
>>>>>> PETSC_COMM_WORLD is currently used. This is the approach PETSc uses in
>>>>>> some C++ calls implemented through PetscPolymorphicFunction().
>>>>>>
>>>>>> But now I believe that is wrong, and that PETSC_COMM_SELF should be
>>>>>> the default. Or perhaps even better, I should let users set the
>>>>>> default communicator used by petsc4py to create new (parallel)
>>>>>> objects.
>>>>>>
>>>>>> An anecdote: some time ago, a petsc4py user wrote a sequential
>>>>>> code and created objects without passing communicator arguments;
>>>>>> next, he wanted to solve many of those problems in different
>>>>>> worker processes in an "embarrassingly parallel" fashion and
>>>>>> collect the results at the master process. Of course, he ran into
>>>>>> trouble. I then asked him to initialize PETSc in such a way that
>>>>>> PETSC_COMM_WORLD was actually PETSC_COMM_SELF (by setting the
>>>>>> world comm before PetscInitialize()). This mostly works, but has
>>>>>> a problem: we have lost the actual PETSC_COMM_WORLD, so we are
>>>>>> not able to create a parallel object after PetscInitialize().
>>>>>>
>>>>>> Any thoughts?
>>>>>>
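
For reference, that workaround would look roughly like this from
Python (assuming petsc4py's init() accepts a 'comm' argument; in C one
would assign PETSC_COMM_WORLD directly before PetscInitialize()):

import sys
import petsc4py
from mpi4py import MPI

# make PETSC_COMM_WORLD be a self communicator before initialization;
# the 'comm' keyword here is an assumption about the init() signature
petsc4py.init(sys.argv, comm=MPI.COMM_SELF)
from petsc4py import PETSc

v = PETSc.Vec().createSeq(10)  # each process now works on its own data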



-- 
Lisandro Dalcín
---------------
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594



