[petsc-users] petsc4py and MPI.COMM_SELF.Spawn
Rodrigo Felicio
Rodrigo.Felicio at iongeo.com
Thu Mar 2 09:09:51 CST 2017
Thanks, Matt,
I see your point. My problem is that I need to have many “masters”, and each of these needs its own distinct group of “children” processes to run some linear algebra in parallel. Being new to MPI, I thought that the Spawn approach would help with that, because I already had a PETSc program that could solve the associated least-squares problem, and I assumed I only needed to adapt that code so that, instead of calling it at the prompt, I could integrate it into an MPI code using the Spawn function. Since Spawn is for some reason not working correctly for me, I am now trying to solve my problem by splitting COMM_WORLD instead, which I believe is more in line with your suggestion.
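For reference, the splitting approach Rodrigo describes can be sketched with mpi4py's comm.Split. The group size of 4 and the helper name group_color below are illustrative assumptions, not taken from his code, and the try/except only lets the sketch degrade gracefully where mpi4py is absent:

```python
def group_color(world_rank, group_size):
    """Map a COMM_WORLD rank to a sub-communicator index (hypothetical helper)."""
    return world_rank // group_size

# The MPI part only does something useful when mpi4py is installed and the
# script is launched under mpirun; otherwise fall back to a no-op.
try:
    from mpi4py import MPI
    world = MPI.COMM_WORLD
    # Each block of `group_size` consecutive ranks gets its own communicator;
    # rank 0 of each sub-communicator plays the "master" role for that group.
    sub = world.Split(color=group_color(world.Get_rank(), 4),
                      key=world.Get_rank())
    is_master = (sub.Get_rank() == 0)
except ImportError:
    sub, is_master = None, False
```

Each sub-communicator can then be handed to petsc4py (e.g. via PETSc's comm argument) so every group solves its own least-squares problem independently.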
Kind regards
Rodrigo
From: Matthew Knepley [mailto:knepley at gmail.com]
Sent: Thursday, March 02, 2017 8:11 AM
To: Rodrigo Felicio
Cc: Barry Smith; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] petsc4py and MPI.COMM_SELF.Spawn
On Wed, Mar 1, 2017 at 2:51 PM, Rodrigo Felicio <Rodrigo.Felicio at iongeo.com> wrote:
Sorry, I spoke too soon...
Reversing the order of the mpi4py and petsc4py imports works *only* on the master-code side, not on the child-process side. In that case, the program hangs after the child processes are fired up and fails the same way as reported before...
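The import-order issue mentioned here is usually handled by importing mpi4py before initializing petsc4py, so that MPI_Init is called exactly once (by mpi4py) and petsc4py attaches to the already-running MPI. A minimal sketch, with the try/except only so the snippet degrades gracefully where the packages are not installed:

```python
import sys

# Import mpi4py first: it calls MPI_Init on import.
# petsc4py.init() then detects that MPI is already initialized.
try:
    from mpi4py import MPI       # must come before petsc4py is initialized
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc
except ImportError:              # packages absent; keep the sketch importable
    MPI = PETSc = None
```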
Again, I have no idea what you mean here. I do not think you can separately run the two codes. How will the PMI manager know that these two separate processes should be in the same communicator (WORLD)? It makes no sense to me. In MPI, you need to write the master and child in the same code, with a switch for the master rank.
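Matt's suggestion amounts to a single program with a switch on the rank, rather than two separately launched codes. A hedged sketch of that pattern (the role names and the bcast payload are illustrative, not from the thread):

```python
def role_for_rank(rank):
    """Rank 0 acts as master, all other ranks as workers (illustrative split)."""
    return "master" if rank == 0 else "worker"

# One executable, launched once under mpirun; each rank picks its role.
try:
    from mpi4py import MPI
    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    if role_for_rank(rank) == "master":
        # Master sets up the problem and shares it with the workers.
        data = comm.bcast({"n": 10}, root=0)
    else:
        # Workers receive the problem description and take part in the solve.
        data = comm.bcast(None, root=0)
except ImportError:
    rank = 0
```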
Matt
cheers
Rodrigo
________________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Rodrigo Felicio [Rodrigo.Felicio at iongeo.com]
Sent: Wednesday, March 01, 2017 2:31 PM
To: Barry Smith
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] petsc4py and MPI.COMM_SELF.Spawn
I thought I had tried that before with no success, but this time it worked, despite some persistent error messages related to PMI_Finalize:
time mpirun -n 1 python dyn_mem_ex.py
proc 2 of 4 proc 3 of 4
proc 1 of 4
proc 0 of 4
proc 1 of 4, Adim=[10]
proc 2 of 4, Adim=[10]
proc 0 of 4, Adim=[10]
proc 3 of 4, Adim=[10]
Adata = [ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]
Adata = [ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]Adata = [ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]
Adata = [ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]
3.14160098692
2.65258238441e-06
[cli_0]: write_line error; fd=12 buf=:cmd=finalize
:
system msg for write_line failure : Bad file descriptor
Fatal error in MPI_Finalize: Other MPI error, error stack:
MPI_Finalize(281).....: MPI_Finalize failed
MPI_Finalize(209).....:
MPID_Finalize(133)....:
MPIDI_PG_Finalize(106): PMI_Finalize failed, error -1
real 0m0.586s
user 0m0.536s
sys 0m0.613s
Best,
Rodrigo
________________________________
This email and any files transmitted with it are confidential and are intended solely for the use of the individual or entity to whom they are addressed. If you are not the original recipient or the person responsible for delivering the email to the intended recipient, be advised that you have received this email in error, and that any use, dissemination, forwarding, printing, or copying of this email is strictly prohibited. If you received this email in error, please immediately notify the sender and delete the original.
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener