[Nek5000-users] MPI IO out of usrchk routine
nek5000-users at lists.mcs.anl.gov
Sat Oct 17 07:57:13 CDT 2015
Dear Neks,
I am facing a problem with MPI-IO from my user-supplied routine.
In this routine I interpolate the mesh on a uniform Cartesian grid. As an
output of this routine I have an array
fieldout(1:npts)
for each MPI task. Below is my MPI-IO routine, which calculates the offset for each rank and then packs the bytes of the lp MPI processes together into one file ux.***, where *** is ipic.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
      subroutine f_output(fieldout,ipic)
      include 'SIZE'
      include 'TOTAL'

      integer npts
      parameter (npts=1024)
      real fieldout(1:npts)
      integer*8 offset, disp
!     integer iostatus(MPI_STATUS_SIZE)
      integer ipic, f_off, out_fh, itask
      integer*8 f_offset(0:np-1)
      character(80) outname

!-----Calculation of the offset-----------------------------------------
      do itask=0,np-1
         f_offset(itask) = itask*npts*8
      enddo
      f_off = f_offset(nid)

!-----Conversion of integer*4 to integer*8
      offset = int(f_off,8)
      disp   = 0

      write(outname,'("ux.",I3.3)') ipic

!-----Open the file using an MPI file handle
      call MPI_FILE_OPEN(nek_comm,trim(outname),
     &                   MPI_MODE_CREATE+MPI_MODE_WRONLY,
     &                   MPI_INFO_NULL,out_fh,ierr)

!-----Set the view on the MPI file handle (displacement = 0)
      call MPI_FILE_SET_VIEW(out_fh,disp,MPI_BYTE,MPI_BYTE,
     &                       'native',MPI_INFO_NULL,ierr)

!-----Write data at the per-rank offset
      call MPI_FILE_WRITE_AT(out_fh,offset,fieldout,npts,MPI_BYTE,
     &                       MPI_STATUS_IGNORE,ierr)

!-----Close the file
      call MPI_FILE_CLOSE(out_fh,ierr)

      return
      end
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
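For reference, here is a minimal Python sketch of the byte layout the routine above is aiming for: rank r writes its npts real*8 values starting at byte offset r*npts*8, so the per-rank blocks tile the shared file contiguously. The file name, the number of ranks, and the rank-colored data are all illustrative, not part of the actual Nek5000 setup:

```python
import os
import struct
import tempfile

NPTS = 1024    # points per rank (matches parameter npts in the routine)
NBYTES = 8     # size of one Fortran real*8
NRANKS = 4     # illustrative number of MPI tasks (lp)

def rank_offset(rank):
    """Byte offset at which a given rank writes its block."""
    return rank * NPTS * NBYTES

# Simulate each rank writing its own block at its offset into one
# shared file (here sequentially; MPI-IO would do this concurrently).
path = os.path.join(tempfile.mkdtemp(), "ux.000")
with open(path, "wb") as f:
    for rank in range(NRANKS):
        f.seek(rank_offset(rank))
        data = [float(rank)] * NPTS            # stand-in for fieldout
        f.write(struct.pack("<%dd" % NPTS, *data))

# Blocks tile the file with no gaps or overlaps.
assert os.path.getsize(path) == NRANKS * NPTS * NBYTES
```

Note that this layout assumes each rank contributes npts*8 bytes, consistent with the offset formula in the Fortran routine.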
I got the following error message:
Abort(1) on node 15669 (rank 15669 in comm 1140850688): Fatal error in MPI_Info_dup: Invalid MPI_Info, error stack:
MPI_Info_dup(125): MPI_Info_dup(info=0x0, newinfo=0x1e7cfdb440) failed
MPI_Info_dup(66).: Invalid MPI_Info
Abort(1) on node 14577 (rank 14577 in comm 1140850688): Fatal error in MPI_Info_dup: Invalid MPI_Info, error stack:
MPI_Info_dup(125): MPI_Info_dup(info=0x0, newinfo=0x1e7cfdb440) failed
MPI_Info_dup(66).: Invalid MPI_Info
... and so on.
I suspect that the communicator is not set up correctly. I used nek_comm and also tried MPI_COMM_WORLD; the error message is the same in both cases (see above).
Does anyone have an idea how to solve this?
Thanks in advance and best regards, Jörg.