collective write with 1 dimension being global

Mark Cheeseman mark.cheeseman at kaust.edu.sa
Sun Mar 6 04:47:27 CST 2011


Hello,

I have a 4D variable inside a NetCDF file that I wish to distribute over a
number of MPI tasks.  The variable will be decomposed over the first three
dimensions but not the fourth (i.e. the fourth dimension is kept global on
all MPI tasks). In other words:

    GLOBAL_FIELD[nx,ny,nz,nv]  ==>  LOCAL_FIELD[nx_local,ny_local,nz_local,nv]

I am trying to achieve this via an nfmpi_get_vara_double_all call, but the
data keeps getting corrupted.  I am sure that my offsets and local domain
sizes are correct.  If I modify my code to read only a single 3D slice (i.e.
one point along the fourth dimension), the code and input data are correct.
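
For reference, here is a minimal sketch of the kind of call I have in mind.
The file name 'data.nc', variable name 'field', the local sizes, and the
simple 1-D decomposition in x are all illustrative, not my actual code:

program read_4d_hyperslab
  implicit none
  include 'mpif.h'
  include 'pnetcdf.inc'

  integer :: ierr, rank, ncid, varid
  integer(kind=MPI_OFFSET_KIND) :: starts(4), counts(4)
  integer :: nx_local, ny_local, nz_local, nv
  double precision, allocatable :: local_field(:,:,:,:)

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

  ! Illustrative local sizes; in the real code these come from the domain
  ! decomposition.  nv is the full (global) length of the fourth dimension.
  nx_local = 10; ny_local = 10; nz_local = 10; nv = 5
  allocate(local_field(nx_local, ny_local, nz_local, nv))

  ierr = nfmpi_open(MPI_COMM_WORLD, 'data.nc', NF_NOWRITE, &
                    MPI_INFO_NULL, ncid)
  ierr = nfmpi_inq_varid(ncid, 'field', varid)

  ! starts/counts are 1-based and follow the Fortran (column-major)
  ! ordering of the variable's dimensions.  The first three entries are
  ! this task's offsets/extents; the fourth spans the whole global
  ! dimension, so every task reads all nv points.
  starts(1) = rank*nx_local + 1     ! simple 1-D decomposition in x only
  starts(2) = 1
  starts(3) = 1
  starts(4) = 1
  counts(1) = nx_local
  counts(2) = ny_local
  counts(3) = nz_local
  counts(4) = nv

  ierr = nfmpi_get_vara_double_all(ncid, varid, starts, counts, local_field)

  ierr = nfmpi_close(ncid)
  deallocate(local_field)
  call MPI_Finalize(ierr)
end program read_4d_hyperslab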

Can parallel-netcdf handle a local dimension being equal to a global
dimension?  Or should I be using another call?

Thanks,
Mark

-- 
Mark Patrick Cheeseman

Computational Scientist
KSL (KAUST Supercomputing Laboratory)
Building 1, Office #126
King Abdullah University of Science & Technology
Thuwal 23955-6900
Kingdom of Saudi Arabia

EMAIL   : mark.cheeseman at kaust.edu.sa
PHONE : +966   (2) 808 0221 (office)
               +966 (54) 470 1082 (mobile)
SKYPE : mark.patrick.cheeseman

