[mpich-discuss] How to handle complex variables
Antonin@LPS
bourgeois at lps.u-psud.fr
Tue Nov 9 03:48:58 CST 2010
Hi Jayesh,
Thanks for your reply.
I'm using MPICH2-1.2.1p1 (not the latest...) on a 64-bit Unix system with
ifort 11.1. I get no error message when I compile with a plain "mpif90
MPItest.f90", with no further options.
Here I paste the output of the code, using complex(8) and a ring of 2
processes:
BCAST with count=1*MPI_COMPLEX: (1.50000000000000,0.500000000000000)
Proc. 1 received (1.50000000000000,0.000000000000000E+000)
BCAST with count=2*MPI_COMPLEX: (1.50000000000000,0.500000000000000)
Proc. 1 received (1.50000000000000,0.500000000000000)
BCAST with count=1*MPI_DOUBLE_COMPLEX: (1.50000000000000,0.500000000000000)
Proc. 1 received (1.50000000000000,0.500000000000000)
REDUCE with count=1*MPI_DOUBLE_COMPLEX
Proc. 0 has (1.50000000000000,0.500000000000000)
Proc. 1 has (2.50000000000000,1.50000000000000)
... and here the program hangs...
You can see that, when using count=1 and datatype=MPI_COMPLEX on a
complex(8) variable, only the real part is broadcast (that is, only the
first chunk of memory with the length of a double). The whole
complex(8) variable, which is twice the size of a double, is only
broadcast in full when using count=2 with datatype=MPI_COMPLEX, or
count=1 with datatype=MPI_DOUBLE_COMPLEX.
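For reference, here is a minimal sketch of the broadcast I'm describing
(names and values are mine, not from MPItest.f90): count=1 with
MPI_DOUBLE_COMPLEX, which matches complex(8), transfers both parts,
whereas MPI_COMPLEX matches the default complex(4) and is half the size:

```fortran
program bcast_demo
  implicit none
  include 'mpif.h'
  integer :: ierr, rank
  complex(8) :: z
  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
  if (rank == 0) z = (1.5d0, 0.5d0)
  ! count=1 with the datatype matching complex(8) transfers
  ! the full 16 bytes (real and imaginary parts):
  call MPI_BCAST(z, 1, MPI_DOUBLE_COMPLEX, 0, MPI_COMM_WORLD, ierr)
  print *, 'Proc.', rank, 'has', z
  call MPI_FINALIZE(ierr)
end program bcast_demo
```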
Declaring the variable as "double complex" instead of "complex(8)" is
exactly equivalent.
If I declare a "complex(16)", which is twice the size of a complex(8),
nothing is broadcast when using count=1*MPI_COMPLEX, and only the real
part is broadcast when using count=2*MPI_COMPLEX or
count=1*MPI_DOUBLE_COMPLEX.
It looks like something is wrong with the "size" of the complex
variables, doesn't it?
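For the reduce that hangs, this is the shape of the call I mean (again a
sketch with made-up values, not the code from MPItest.f90); MPI_SUM on
MPI_DOUBLE_COMPLEX is a valid combination in Fortran, and every rank
must call the collective with the same count and datatype or it can
hang as shown above:

```fortran
program reduce_demo
  implicit none
  include 'mpif.f'
  integer :: ierr, rank
  complex(8) :: z, total
  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
  z = cmplx(rank + 1.5d0, rank + 0.5d0, kind=8)
  ! Sum the complex(8) values from all ranks onto rank 0.
  call MPI_REDUCE(z, total, 1, MPI_DOUBLE_COMPLEX, MPI_SUM, 0, &
                  MPI_COMM_WORLD, ierr)
  if (rank == 0) print *, 'sum =', total
  call MPI_FINALIZE(ierr)
end program reduce_demo
```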
I hope these indications are helpful...
Regards,
Antonin