[petsc-users] petsc4py and MPI.COMM_SELF.Spawn

Matthew Knepley knepley at gmail.com
Thu Mar 2 08:10:48 CST 2017


On Wed, Mar 1, 2017 at 2:51 PM, Rodrigo Felicio <Rodrigo.Felicio at iongeo.com>
wrote:

> Sorry, I spoke too soon...
> Reversing the order of the mpi4py and petsc4py imports works *only*
> in the master code, not in the child-process code. In that case, the
> program hangs after the child processes are spawned and fails in the
> same way as reported before...
>

Again, I have no idea what you mean here. I do not think you can run the
two codes separately. How would the PMI manager know that these two
separate processes should be in the same communicator (WORLD)? It makes
no sense to me. In MPI, you need to write the master and the child in the
same code, with a switch for the master rank.

   Matt


> cheers
> Rodrigo
> ________________________________________
> From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov]
> on behalf of Rodrigo Felicio [Rodrigo.Felicio at iongeo.com]
> Sent: Wednesday, March 01, 2017 2:31 PM
> To: Barry Smith
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] petsc4py and MPI.COMM_SELF.Spawn
>
> I thought I had tried that as well with no success before, but this time
> it worked, despite some persistent error msgs related to PMI_finalize:
>
> time mpirun -n 1 python dyn_mem_ex.py
> proc 0 of 4
> proc 1 of 4
> proc 2 of 4
> proc 3 of 4
>
> proc 1 of 4, Adim=[10]
> proc 2 of 4, Adim=[10]
> proc 0 of 4, Adim=[10]
> proc 3 of 4, Adim=[10]
> Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]
> Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]
> Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]
> Adata = [ 0.  1.  2.  3.  4.  5.  6.  7.  8.  9.]
> 3.14160098692
> 2.65258238441e-06
> [cli_0]: write_line error; fd=12 buf=:cmd=finalize
> :
> system msg for write_line failure : Bad file descriptor
> Fatal error in MPI_Finalize: Other MPI error, error stack:
> MPI_Finalize(281).....: MPI_Finalize failed
> MPI_Finalize(209).....:
> MPID_Finalize(133)....:
> MPIDI_PG_Finalize(106): PMI_Finalize failed, error -1
>
> real    0m0.586s
> user    0m0.536s
> sys     0m0.613s
>
> Best,
> Rodrigo
>
> ________________________________
>
>
> This email and any files transmitted with it are confidential and are
> intended solely for the use of the individual or entity to whom they are
> addressed. If you are not the original recipient or the person responsible
> for delivering the email to the intended recipient, be advised that you
> have received this email in error, and that any use, dissemination,
> forwarding, printing, or copying of this email is strictly prohibited. If
> you received this email in error, please immediately notify the sender and
> delete the original.
>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
