bombing out writing large scratch files

Barry Smith bsmith at mcs.anl.gov
Sat May 27 17:50:26 CDT 2006


   You can also add -debugger_nodes 9 to indicate you only want
node 9 in the debugger; otherwise you get lots of xterms popping up.
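
For example, a combined invocation might look like the following sketch
(the launcher, process count, executable name, and hostname here are
placeholders, not taken from this thread):

   mpirun -np 16 ./mycode -start_in_debugger -debugger_nodes 9 \
          -display mymachine:0.0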

    Barry


On Sat, 27 May 2006, Matthew Knepley wrote:

> 1) Make sure ssh is forwarding X (-Y I think)
>
> 2) -start_in_debugger
>
> 3) -display <your machine>:0.0
>
> should do it.
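>
> For instance, a minimal sketch (the user name, hostnames, and executable
> name are placeholders):
>
>   ssh -Y user@cluster.example.com
>   mpirun -np 4 ./mycode -start_in_debugger -display mymachine:0.0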
>
>  Matt
>
> On 5/27/06, Randall Mackie <randy at geosystem.us> wrote:
>> 
>> This is a stupid question, but how do I start in the debugger if I'm
>> running on a cluster half-way around the world and I'm working on that
>> cluster via ssh?
>> 
>> Randy
>> 
>> 
>> Matthew Knepley wrote:
>> > The best thing to do here is get a stack trace from the debugger. From
>> > the description, it is hard to tell which statement is trying to access
>> > illegal memory.
>> >
>> >    Matt
>> >
>> > On 5/27/06, Randall Mackie <randy at geosystem.us> wrote:
>> >
>> >     In my PETSc-based modeling code, I write out intermediate results
>> >     to a scratch file, and then read them back later. This has worked
>> >     fine up until today, when for a large model it seems to be causing
>> >     my program to crash with errors like:
>> >
>> >
>> >     ------------------------------------------------------------------------
>> >     [9]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>> >     Violation, probably memory access out of range
>> >
>> >
>> >     I've tracked down the offending code to:
>> >
>> >                IF (rank == 0) THEN
>> >                  irec=(iper-1)*2+ipol
>> >                  write(7,rec=irec) (xvec(i),i=1,np)
>> >                END IF
>> >
>> >     It writes out xvec for the first record, but then on the second
>> >     record my program is crashing.
>> >
>> >     The record length (from an INQUIRE statement) is recl = 22626552.
>> >
>> >     The size of the scratch file when my program crashes is 98M.
>> >
>> >     PETSc is compiled using the Intel compilers (v9.0 for Fortran),
>> >     and the user's manual says that you can have record lengths of
>> >     up to 2 billion bytes.
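>> >
>> >     For reference, a minimal sketch of how such a direct-access file
>> >     might be opened (the STATUS and exact keyword list are assumptions;
>> >     only unit 7 and the recl value come from the code and INQUIRE above):
>> >
>> >                OPEN(7, access='DIRECT', form='UNFORMATTED', &
>> >                     status='SCRATCH', recl=22626552)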
>> >
>> >     I'm kind of stuck as to what might be the cause. Any ideas from
>> >     anyone would be greatly appreciated.
>> >
>> >     Randy Mackie
>> >
>> >     P.S. I've tried both the optimized and debugging versions of the
>> >     PETSc libraries, with the same result.
>> >
>> >
>> >     --
>> >     Randall Mackie
>> >     GSY-USA, Inc.
>> >     PMB# 643
>> >     2261 Market St.,
>> >     San Francisco, CA 94114-1600
>> >     Tel (415) 469-8649
>> >     Fax (415) 469-5044
>> >
>> >     California Registered Geophysicist
>> >     License No. GP 1034
>> >
>> >
>> >
>> >
>> > --
>> > "Failure has a thousand explanations. Success doesn't need one" -- Sir
>> > Alec Guiness
>> 
>> --
>> Randall Mackie
>> GSY-USA, Inc.
>> PMB# 643
>> 2261 Market St.,
>> San Francisco, CA 94114-1600
>> Tel (415) 469-8649
>> Fax (415) 469-5044
>> 
>> California Registered Geophysicist
>> License No. GP 1034
>> 
>> 
>
>
>



