[petsc-users] petsc4py help with parallel execution

Ivan Voznyuk ivan.voznyuk.work at gmail.com
Thu Nov 15 07:07:05 CST 2018


Hi Matthew,
Thanks for your reply!

Let me clarify what I mean by asking a few questions:

1. In order to obtain parallel execution of simple_code.py, do I need to
run it as mpiexec python3 simple_code.py, or can I just launch python3
simple_code.py?
2. simple_code.py consists of two parts: a) preparation of the matrix and b)
solving the system of linear equations with PETSc. If I launch mpirun (or
mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically
obtain 8 matrices and 8 systems to solve. However, I need to prepare only
one matrix, yet run the code in parallel on 8 processors (roughly as in the
sketch below).
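
To make this concrete, here is roughly what I have in mind. It is only a
sketch, not the attached simple_code.py: the 1D Laplacian-like matrix, the
sizes and the solver defaults are just placeholders.

    # Sketch only: one global matrix distributed over the MPI ranks,
    # each rank filling just the rows it owns.
    # Run with:  mpiexec -n 8 python3 <this script>
    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    n = 1000
    A = PETSc.Mat().createAIJ([n, n], nnz=(3, 2), comm=PETSc.COMM_WORLD)

    rstart, rend = A.getOwnershipRange()   # this rank's rows only
    for i in range(rstart, rend):
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemblyBegin()
    A.assemblyEnd()

    b = A.createVecLeft()
    b.set(1.0)
    x = A.createVecRight()

    ksp = PETSc.KSP().create(comm=PETSc.COMM_WORLD)
    ksp.setOperators(A)
    ksp.setFromOptions()   # solver/preconditioner chosen on the command line
    ksp.solve(b, x)

With this pattern there would be only one distributed matrix rather than 8
copies, which is what I am after.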
In fact, attached you will find a similar code (scipy_code.py) with
only one difference: the system of linear equations is solved with scipy.
When I run it, I can clearly see that the solution is obtained in a
parallel way, even though I do not use mpirun (or mpiexec); I just
run python3 scipy_code.py.
In this case the first part (creation of the sparse matrix) is not
parallel, whereas the solution of the system is found in a parallel way.
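
The scipy version is along these lines (again only a sketch; the actual
scipy_code.py attachment may differ in the matrix and the sizes):

    # Sketch of the scipy variant: sequential assembly on one process,
    # then a solve during which several cores are busy.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    n = 2000
    # sequential part: build one sparse matrix on this single process
    A = sp.random(n, n, density=0.01, format='csr') + sp.identity(n, format='csr')
    b = np.ones(n)

    # the solve is where I observe several cores being used
    x = spsolve(A.tocsc(), b)
    print("residual:", np.linalg.norm(A @ x - b))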
So my question is: do you think it is possible to have the same
behavior with PETSc? And what do I need for this?

I am asking this because it worked for my colleague: he launches
simple_code.py on his computer with the command python3 simple_code.py
(without any mpirun/mpiexec in front), and he obtains parallel execution
of the same code.
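
To check how many MPI ranks a run really uses, I assume something like
this is enough (a sketch, assuming petsc4py is installed):

    # Print the rank/size of the communicator the code is running on.
    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    print("rank", comm.getRank(), "of", comm.getSize())

Each process prints its rank and the communicator size, so it is easy to
see how many MPI processes are actually involved.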

Thanks for your help!
Ivan


On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Dear PETSC community,
>>
>> I have a question regarding the parallel execution of petsc4py.
>>
>> I have a simple code (attached as simple_code.py) which solves a system
>> of linear equations Ax=b using petsc4py. To execute it, I use the command
>> python3 simple_code.py, which yields sequential performance. With a
>> colleague of mine, we launched this code on his computer, and this time the
>> execution was in parallel, although he used the same command python3
>> simple_code.py (without mpirun or mpiexec).
>>
> I am not sure what you mean. To run MPI programs in parallel, you need a
> launcher like mpiexec or mpirun. There are Python programs (like nemesis)
> that use the launcher API directly (called PMI), but that is not part of
> petsc4py.
>
>   Thanks,
>
>      Matt
>
>> My configuration: x86_64 Ubuntu 16.04, Intel Core i7, PETSc
>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv
>>
>> In order to parallelize it, I have already tried:
>> - using 2 different PCs
>> - Ubuntu 16.04 and 18.04
>> - different architectures (arch-linux2-c-debug, linux-gnu-c-debug,
>> etc.)
>> - of course, different configurations (my present configuration can be
>> found in the attached make.log)
>> - MPI from MPICH and from Open MPI
>>
>> Nothing worked.
>>
>> Do you have any ideas?
>>
>> Thanks and have a good day,
>> Ivan
>>
>> --
>> Ivan VOZNYUK
>> PhD in Computational Electromagnetics
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
Ivan VOZNYUK
PhD in Computational Electromagnetics
+33 (0)6.95.87.04.55
My webpage <https://ivanvoznyukwork.wixsite.com/webpage>
My LinkedIn <http://linkedin.com/in/ivan-voznyuk-b869b8106>