[petsc-users] petsc4py help with parallel execution
Matthew Knepley
knepley at gmail.com
Thu Nov 15 04:54:30 CST 2018
On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Dear PETSC community,
>
> I have a question regarding the parallel execution of petsc4py.
>
> I have a simple code (attached as simple_code.py) that solves a system
> of linear equations Ax=b using petsc4py. I execute it with the command
> python3 simple_code.py, which runs sequentially. A colleague of mine
> launched the same code on his computer, and there it ran in parallel,
> even though he used the same command, python3 simple_code.py (without
> mpirun or mpiexec).
>
I am not sure what you mean. To run MPI programs in parallel, you need a
launcher like mpiexec or mpirun. There are Python programs (like nemesis)
that use the launcher API (called PMI) directly, but that is not part of
petsc4py.
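[One way to check whether a script was actually started by a launcher is to look at the environment variables the launchers export. The sketch below assumes Open MPI's mpirun (which sets OMPI_COMM_WORLD_SIZE) or MPICH's Hydra (which sets PMI_SIZE); other process managers may use different variables:]

```python
import os

def mpi_world_size_from_env():
    """Best-effort guess of the MPI world size from launcher env vars.

    Open MPI's mpirun sets OMPI_COMM_WORLD_SIZE; MPICH's Hydra process
    manager sets PMI_SIZE. If neither is present, the script was most
    likely started without a launcher, so we report a world size of 1.
    """
    for var in ("OMPI_COMM_WORLD_SIZE", "PMI_SIZE"):
        if var in os.environ:
            return int(os.environ[var])
    return 1

print("apparent MPI world size:", mpi_world_size_from_env())
```

[Printing this from inside simple_code.py on both machines would show whether the colleague's run was really multi-rank MPI, or whether the observed parallelism came from somewhere else, such as a threaded BLAS.]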
Thanks,
Matt
> My configuration: x86_64 Ubuntu 16.04, Intel Core i7, PETSc 3.10.2,
> PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv
>
> In order to parallelize it, I have already tried:
> - use 2 different PCs
> - use Ubuntu 16.04, 18.04
> - use different architectures (arch-linux2-c-debug, linux-gnu-c-debug, etc)
> - of course, use different configurations (my present config can be found in
> the make.log attached here)
> - MPI from both MPICH and Open MPI
>
> Nothing worked.
>
> Do you have any ideas?
>
> Thanks and have a good day,
> Ivan
>
> --
> Ivan VOZNYUK
> PhD in Computational Electromagnetics
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/