bombing out writing large scratch files

Randall Mackie randy at geosystem.us
Sat May 27 17:00:18 CDT 2006


In my PETSc-based modeling code, I write out intermediate results to a scratch
file and then read them back later. This has worked fine up until today,
when, for a large model, it started crashing my program with
errors like:

------------------------------------------------------------------------
[9]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range


I've tracked down the offending code to:

           IF (rank == 0) THEN
             irec = (iper-1)*2 + ipol
             write(7,rec=irec) (xvec(i),i=1,np)
           END IF

It writes out xvec fine for the first record, but my program
crashes on the second record.
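
In case it is useful, this is the guarded version of that write I am
trying next. It is just a sketch, with ios as a new local INTEGER and
everything else as in the snippet above, so that a runtime I/O error
(for example an output list longer than the record) gets reported
instead of going undetected:

           IF (rank == 0) THEN
             irec = (iper-1)*2 + ipol
             ! IOSTAT turns a runtime I/O failure into a nonzero code
             write(7,rec=irec,iostat=ios) (xvec(i),i=1,np)
             IF (ios /= 0) print *, 'write failed: irec =', irec, &
                                    ' iostat =', ios
           END IF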

The record length (from an INQUIRE statement) is recl = 22626552.

The size of the scratch file when my program crashes is 98 MB.
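
For what it's worth, if ifort is interpreting that RECL in its default
4-byte units (Intel Fortran measures RECL in longwords for unformatted
files unless you compile with -assume byterecl), then each record is
really 22626552 x 4 = 90506208 bytes, about 90 MB, which would be
roughly consistent with the 98 MB file left behind after one full
record plus the start of a second.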

PETSc is compiled using the Intel compilers (v9.0 for Fortran),
and the Intel Fortran user's manual says that record lengths of
up to 2 billion bytes are allowed.
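
To take the RECL units question out of the picture entirely, I may
switch to letting the compiler compute the record length itself with
INQUIRE(IOLENGTH=...). A minimal sketch, assuming the file is opened
something like this (unit 7 and the xvec/np names as above; the exact
OPEN options in my code may differ):

           INTEGER :: reclen

           ! IOLENGTH returns the RECL value needed for this output
           ! list, in whatever units this compiler uses for RECL=
           ! (ifort: 4-byte words unless built with -assume byterecl)
           inquire(iolength=reclen) (xvec(i),i=1,np)
           open(7, access='direct', form='unformatted', &
                recl=reclen, status='scratch')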

I'm stuck as to what the cause might be; any ideas would be
greatly appreciated.

Randy Mackie

P.S. I've tried both the optimized and debugging versions of the PETSc
libraries, with the same result.


-- 
Randall Mackie
GSY-USA, Inc.
PMB# 643
2261 Market St.,
San Francisco, CA 94114-1600
Tel (415) 469-8649
Fax (415) 469-5044

California Registered Geophysicist
License No. GP 1034



