[petsc-users] petsc4py and MPI.COMM_SELF.Spawn

Rodrigo Felicio Rodrigo.Felicio at iongeo.com
Wed Mar 1 11:40:36 CST 2017


Thanks, Barry and Matt, for your prompt responses. I really appreciated them.

Sorry, I forgot to mention that I am using PETSc 3.6.1, petsc4py 3.6.0, and mpi4py 2.0.0 (also tested with mpi4py 1.2.2).

From running env I get:

PYTHONPATH=/home/XXXXX/Enthought/Canopy_64bit/User/lib/python2.7/site-packages:/home/XXXXX/dev/myPy:/home/XXXXX/ipnotebooks

and also

PETSC_DIR=/home/XXXXX/mylocal/petsc-3.6.1
PETSC_ARCH=arch-linux2-c
I_MPI_ROOT=/apps/tools/centos54-x86_64-intel14//impi/4.1.3.04


Funny thing is that the directory where both petsc4py and mpi4py are installed only shows up in the Python path from inside a Python session, i.e., only after

import sys
sys.path

do I see '/home/XXXX/.local/lib/python2.7/site-packages'.
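
To double-check which installs the interpreter actually resolves, I can also run the following from inside Python (the path in the comment is just what I expect to see):

# quick check of which petsc4py/mpi4py builds the interpreter picks up
import petsc4py, mpi4py
print(petsc4py.__file__)    # I expect this under ~/.local/lib/python2.7/site-packages
print(mpi4py.__file__)
print(mpi4py.get_config())  # reports which MPI (mpicc, etc.) mpi4py was built against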


Anyway, after running
time mpirun -n 1 python ./dyn_mem_ex.py

I get the error message below (once I kill the process running dyn_mem_ex.py; mind you, I edited out the IP of the machine as XXX.XX.XX.XX):
=====================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 15
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
Fatal error in PMPI_Init_thread: Invalid port, error stack:
MPIR_Init_thread(674).....:
MPID_Init(320)............: spawned process group was unable to connect back to the parent on port <tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$>
MPID_Comm_connect(206)....:
MPIDI_Comm_connect(579)...: Named port tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$ does not exist
MPIDI_Comm_connect(415)...:
dequeue_and_set_error(628): Communication error with rank 0
Fatal error in PMPI_Init_thread: Invalid port, error stack:
MPIR_Init_thread(674)..:
MPID_Init(320).........: spawned process group was unable to connect back to the parent on port <tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$>
MPID_Comm_connect(206).:
MPIDI_Comm_connect(432): Named port tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$ does not exist
Fatal error in PMPI_Init_thread: Invalid port, error stack:
MPIR_Init_thread(674)..:
MPID_Init(320).........: spawned process group was unable to connect back to the parent on port <tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$>
MPID_Comm_connect(206).:
MPIDI_Comm_connect(432): Named port tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$ does not exist
Fatal error in PMPI_Init_thread: Invalid port, error stack:
MPIR_Init_thread(674)..:
MPID_Init(320).........: spawned process group was unable to connect back to the parent on port <tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$>
MPID_Comm_connect(206).:
MPIDI_Comm_connect(432): Named port tag#0$description#v1n29$port#51319$ifname#XXX.XX.XX.XX$ does not exist
APPLICATION TERMINATED WITH THE EXIT STRING: Terminated (signal 15)

real    0m9.987s
user    0m36.714s
sys     0m11.541s

If I comment out the lines that import PETSc and initialize petsc4py in the code for cpi.py (codes attached below),

import petsc4py
#petsc4py.init(sys.argv)
#from petsc4py import PETSc
from mpi4py import MPI

then it runs without problems, and the output is:

time mpirun -n 1 python ./dyn_mem_ex.py
proc 2 of 4
proc 3 of 4 proc 1 of 4

proc 0 of 4
proc 0 of 4, Adim=[10]
proc 1 of 4, Adim=[10]
proc 2 of 4, Adim=[10]
proc 3 of 4, Adim=[10]
Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]

Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]

3.14160098692
2.65258238441e-06

real    0m0.535s
user    0m0.431s
sys     0m0.633s
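
Incidentally, one variant I have not tried yet is to import mpi4py before bringing petsc4py up, so that MPI_Init_thread is issued by mpi4py and PetscInitialize only attaches to the already-initialized MPI. Just a sketch (untested, and I do not know whether it changes anything for the spawned processes):

# sketch (untested): let mpi4py initialize MPI before PETSc comes up
import sys
from mpi4py import MPI       # MPI_Init_thread happens here
import petsc4py
petsc4py.init(sys.argv)      # PetscInitialize should find MPI already initialized
from petsc4py import PETSc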



The codes that I used in this example have only minor modifications compared to old petsc4py examples. For reference, I am also attaching them below:

#---------------------------------
# dyn_mem_ex.py
# ------------------------------------
import numpy
import sys
import petsc4py
#petsc4py.init(sys.argv)     # left commented out here, as discussed above
#from petsc4py import PETSc
from mpi4py import MPI

# spawn 4 workers running cpi.py; comm is the parent side of the intercommunicator
mypath = '/home/XXX/study/mpi4py/'
comm = MPI.COMM_SELF.Spawn(sys.executable,
                           args=[mypath + 'cpi.py'],
                           maxprocs=4)

N = numpy.array(100, 'i')                         # number of quadrature intervals
Adata = numpy.array(numpy.arange(10), dtype='f')  # array whose size the workers only learn at run time
Adim = numpy.array(Adata.shape[0], dtype='i')

# broadcast the inputs to the spawned workers (root=MPI.ROOT on the parent side)
comm.Bcast([N, MPI.INT], root=MPI.ROOT)
comm.Bcast([Adim, MPI.INT], root=MPI.ROOT)
comm.Bcast([Adata, MPI.FLOAT], root=MPI.ROOT)

# collect the partial sums of pi from the workers
PI = numpy.array(0.0, 'd')
comm.Reduce(None, [PI, MPI.DOUBLE],
            op=MPI.SUM, root=MPI.ROOT)
print(PI)
print(PI/numpy.pi - 1.0)

comm.Disconnect()


#---------------------------------
# cpi.py
# ------------------------------------

import numpy
import sys, petsc4py
#petsc4py.init(sys.argv)      # with these two lines uncommented, the spawned
#from petsc4py import PETSc   # processes die in PMPI_Init_thread (error above)
from mpi4py import MPI

# intercommunicator back to the parent that spawned us
parent = MPI.Comm.Get_parent()
size = parent.Get_size()
rank = parent.Get_rank()

print("proc {} of {} ".format(rank, size))

# receive the inputs broadcast by the parent (root=0 on the parent side)
N = numpy.array(0, dtype='i')
Adim = numpy.zeros(1, dtype='i')
parent.Bcast([N, MPI.INT], root=0)
parent.Bcast([Adim, MPI.INT], root=0)

Adata = numpy.zeros(Adim[0], dtype='f')
parent.Bcast([Adata, MPI.FLOAT], root=0)
print("proc {} of {}, Adim={}".format(rank, size, Adim))
print("Adata = {}".format(Adata))

# midpoint rule for pi = integral of 4/(1+x^2) over [0,1], split across workers
h = 1.0 / N
s = 0.0
for i in range(rank, N, size):
    x = h * (i + 0.5)
    #print(rank, size, Adim.shape)
    s += 4.0 / (1.0 + x**2)
PI = numpy.array(s * h, dtype='d')
parent.Reduce([PI, MPI.DOUBLE], None,
              op=MPI.SUM, root=0)

parent.Disconnect()
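
Just to illustrate where I want to get to once petsc4py initializes cleanly in the spawned processes (my own sketch, not part of the failing example; the Vec usage below is only illustrative), the end of cpi.py would do something like:

# illustrative sketch only: wrap each worker's local copy of Adata
# (received in cpi.py above) in a sequential PETSc Vec
from petsc4py import PETSc
x = PETSc.Vec().createWithArray(Adata, comm=PETSc.COMM_SELF)
print("proc {} of {}, ||Adata|| = {}".format(rank, size, x.norm()))
x.destroy()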


kind regards,
Rodrigo




