[petsc-users] PETSC ERROR: Logging has not been enabled

Bishesh Khanal bisheshkh at gmail.com
Thu Aug 22 01:39:08 CDT 2013


On Thu, Aug 22, 2013 at 12:12 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    Most likely the tool you are using to launch the parallel program is
> the wrong one for the MPI you have linked PETSc with. Are you starting the
> program with mpiexec? Is that mpiexec the one that goes with the MPI
> (mpicc or mpif90) that you built PETSc with?
>

Thanks Barry. I'm using the bin/petscmpiexec from the PETSc install.
But this is very strange: after hitting these errors yesterday I gave up
and went to bed. Today I turned the computer back on, rebuilt the project,
and ran the same thing again, and now it works without any errors.
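
For reference, one way to confirm which MPI library an executable is
actually linked against is to print the library version string at runtime.
A minimal sketch, assuming an MPI-3 implementation (MPI_Get_library_version
is an MPI-3 call):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char version[MPI_MAX_LIBRARY_VERSION_STRING];
        int  len;

        MPI_Init(&argc, &argv);
        /* Print the version string of the MPI library this binary uses */
        MPI_Get_library_version(version, &len);
        printf("%s\n", version);
        MPI_Finalize();
        return 0;
    }

If the string printed here does not name the same MPI that the mpiexec
belongs to, the launcher/library mismatch Barry describes is the likely
culprit.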

>
>    What happens if you compile a trivial MPI-only code with the mpicc and
> then try to run it in parallel with the mpiexec?
>
>
>    Barry
>
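
A minimal MPI-only test along those lines might look like this (a sketch;
compile it with the mpicc used to build PETSc, e.g. "mpicc hello.c -o hello",
and launch with the matching mpiexec):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;

        MPI_Init(&argc, &argv);
        /* Each process reports its rank and the total process count */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("Hello from rank %d of %d\n", rank, size);
        MPI_Finalize();
        return 0;
    }

Run with, say, "mpiexec -n 4 ./hello": a matched launcher reports ranks 0
through 3 of 4, while a mismatched one typically makes every process claim
to be rank 0 of 1. (The file and binary names here are just for
illustration.)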
> On Aug 21, 2013, at 5:05 PM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
>
> > Dear all,
> > My program runs fine when using just one processor, and valgrind shows no errors either, but when using more than one processor I get the following errors:
> >
> > [0]PETSC ERROR: PetscOptionsInsertFile() line 461 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> > [0]PETSC ERROR: PetscOptionsInsert() line 623 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> > [0]PETSC ERROR: PetscInitialize() line 769 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c
> > PETSC ERROR: Logging has not been enabled.
> > You might have forgotten to call PetscInitialize().
> > application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> >
> > ===================================================================================
> > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > =   EXIT CODE: 56
> > =   CLEANING UP REMAINING PROCESSES
> > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > ===================================================================================
> >
> > I have not forgotten to call PetscInitialize, if that helps!
> > Thanks,
> > Bishesh
>
>
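
Note that the trace above suggests PetscInitialize() was in fact called but
failed inside PetscOptionsInsertFile(), which is consistent with an MPI
launcher mismatch rather than a missing call. For reference, the usual
PetscInitialize()/PetscFinalize() skeleton the error message alludes to,
sketched against the petsc-3.4 API:

    static char help[] = "Minimal PETSc initialization test.\n";

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
        PetscErrorCode ierr;
        PetscMPIInt    rank;

        /* Must precede all other PETSc calls; initializes MPI if needed */
        ierr = PetscInitialize(&argc, &argv, NULL, help);
        if (ierr) return ierr;
        ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
        ierr = PetscPrintf(PETSC_COMM_SELF, "Rank %d: PETSc initialized\n", rank);CHKERRQ(ierr);
        ierr = PetscFinalize();
        return ierr;
    }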