[petsc-users] Fatal error in MPI_Allreduce: Error message texts are not available[cli_9]: aborting job:

Dominik Szczerba dominik at itis.ethz.ch
Sat Aug 6 01:19:17 CDT 2011


>> Also, run with -start_in_debugger and get a stack trace when it fails.
>>    Matt
>
> I get:
>
> [0]PETSC ERROR: PETSC: Attaching gdb to
> /home/dsz/build/framework-debug/trunk/bin/sm3t4mpi of pid 12238 on
> display localhost:11.0 on machine nexo

OK, after a long wait all 12 gdb windows showed up. Starting the app
with its arguments leads to:

Reading symbols from /home/dsz/build/framework-debug/trunk/bin/sm3t4mpi...done.
Attaching to program:
/home/dsz/build/framework-debug/trunk/bin/sm3t4mpi, process 12241
ptrace: No such process.
/home/dsz/data/test-solve/SM/box/run3/12241: No such file or directory.
(gdb)  r run.xml
Starting program: /home/dsz/build/framework-debug/trunk/bin/sm3t4mpi run.xml
[Thread debugging using libthread_db enabled]
[cli_3]: PMIU_parse_keyvals: unexpected key delimiter at character 54 in cmd
[cli_3]: parse_kevals failed -1

Any thoughts?

Thanks
Dominik
