<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Hi There,<br>
<br>
I have some parallel MPI output code that works fine without PETSc
but crashes when compiled with PETSc. To make the problem easier to
reproduce, I tested the following example, which shows the same
problem. The example is modified from
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/research/projects/mpi/usingmpi2/examples/starting/io3f_f90.htm">http://www.mcs.anl.gov/research/projects/mpi/usingmpi2/examples/starting/io3f_f90.htm</a>.
It works without PETSc, but if I comment out "use mpi" and add the
PETSc includes, it crashes at MPI_FILE_OPEN with an access violation.<br>
<br>
Shall I rewrite all the parallel MPI output with <a
href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscBinaryOpen.html">PetscBinaryOpen</a> or <a
href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerBinaryOpen.html">PetscViewerBinaryOpen</a>
and related functions? Considering parallel I/O efficiency, which is
preferable?<br>
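For reference, here is a rough sketch of what a PetscViewer-based rewrite might look like, assuming the data has already been placed in a PETSc Vec. The Vec name, the filename, and the omitted error checking are illustrative only, not part of my actual code:

```fortran
! Sketch only: parallel binary output through a PETSc viewer.
! Assumes Vec x already holds the distributed data.
#include &lt;finclude/petscsys.h&gt;
#include &lt;finclude/petscvec.h&gt;
#include &lt;finclude/petscviewer.h&gt;

      Vec            x
      PetscViewer    viewer
      PetscErrorCode ierr

      call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'testfile.bin', &amp;
     &amp;                           FILE_MODE_WRITE, viewer, ierr)
      call VecView(x, viewer, ierr)          ! collective parallel write
      call PetscViewerDestroy(viewer, ierr)
```

If something like this is the recommended route, VecView on a binary viewer would replace both the MPI_FILE_SET_VIEW/MPI_FILE_WRITE pair and the MPI_FILE_WRITE_AT call in the example below.<br>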
<br>
Thanks and regards,<br>
<br>
Danyang<br>
<br>
PROGRAM main <br>
! Fortran 90 users can (and should) use <br>
!use mpi <br>
! instead of include 'mpif.h' if their MPI implementation provides an <br>
! mpi module. <br>
!include 'mpif.h' <br>
<br>
!For PETSc, use the following "include"s<br>
#include &lt;finclude/petscsys.h&gt;<br>
#include &lt;finclude/petscviewer.h&gt;<br>
#include &lt;finclude/petscviewer.h90&gt;<br>
integer ierr, i, myrank, BUFSIZE, thefile <br>
parameter (BUFSIZE=10) <br>
integer buf(BUFSIZE) <br>
integer(kind=MPI_OFFSET_KIND) disp <br>
<br>
call MPI_INIT(ierr) <br>
call MPI_COMM_RANK(MPI_COMM_WORLD, myrank, ierr) <br>
<br>
do i = 1, BUFSIZE <br>
buf(i) = myrank * BUFSIZE + i <br>
enddo <br>
<br>
write(*,'(a,1x,i6,1x,a,1x,10(i6,1x))') "myrank", myrank, "buf", buf<br>
<br>
call MPI_FILE_OPEN(MPI_COMM_WORLD, 'testfile.txt', & <br>
MPI_MODE_CREATE + MPI_MODE_WRONLY, & <br>
MPI_INFO_NULL, thefile, ierr) <br>
! assume 4-byte integers <br>
disp = myrank * BUFSIZE * 4 <br>
<br>
!Use the following two functions<br>
!call MPI_FILE_SET_VIEW(thefile, disp, MPI_INTEGER, & <br>
! MPI_INTEGER, 'native', & <br>
! MPI_INFO_NULL, ierr) <br>
!call MPI_FILE_WRITE(thefile, buf, BUFSIZE, MPI_INTEGER, & <br>
! MPI_STATUS_IGNORE, ierr) <br>
<br>
!Or use the following one function<br>
call MPI_FILE_WRITE_AT(thefile, disp, buf, BUFSIZE, MPI_INTEGER, & <br>
MPI_STATUS_IGNORE, ierr) <br>
<br>
call MPI_FILE_CLOSE(thefile, ierr) <br>
call MPI_FINALIZE(ierr) <br>
<br>
END PROGRAM main <br>
</body>
</html>