[petsc-users] PETSc without debugging enabled

Barry Smith bsmith at mcs.anl.gov
Wed Jun 3 13:50:50 CDT 2015


   Though you turned on various compiler optimizations, you did not turn off the "extra" PETSc error checking that is enabled by default. For optimized runs you should also configure with --with-debugging=0.
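
   If you want to verify what a given build actually recorded, the debug setting ends up as the PETSC_USE_DEBUG macro in the arch-specific petscconf.h (here gnu-opt-32idx/include/petscconf.h; you can also just grep for PETSC_USE_DEBUG in that file), and -log_summary prints that warning whenever the library was compiled with the macro defined. A minimal sketch to check it from C, assuming a standard PETSc install:

       #include <petscsys.h>
       /* Report whether the PETSc headers/library this program is built
          against were configured with debugging enabled. */
       int main(int argc, char **argv)
       {
         PetscInitialize(&argc, &argv, NULL, NULL);
       #if defined(PETSC_USE_DEBUG)
         PetscPrintf(PETSC_COMM_WORLD, "PETSc configured WITH debugging\n");
       #else
         PetscPrintf(PETSC_COMM_WORLD, "PETSc configured WITHOUT debugging\n");
       #endif
         PetscFinalize();
         return 0;
       }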

  Barry


> On Jun 3, 2015, at 1:30 PM, Michele Rosso <mrosso at uci.edu> wrote:
> 
> Hi,
> 
> I am performing some timing runs with PETSc. I think I compiled it correctly with debugging disabled, yet -log_summary gives me a warning:
> 
>       ##########################################################
>       #                                                        #
>       #                          WARNING!!!                    #
>       #                                                        #
>       #   This code was compiled with a debugging option,      #
>       #   To get timing results run ./configure                #
>       #   using --with-debugging=no, the performance will      #
>       #   be generally two or three times faster.              #
>       #                                                        #
>       ##########################################################
> 
> Here are my configure options (from -log_summary):
> 
> Compiled without FORTRAN kernels
> Compiled with full precision matrices (default)
> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
> Configure options: --known-level1-dcache-size=16384 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=4 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --with-batch="1 " --known-mpi-shared="0 " --known-mpi-shared-libraries=0 --known-memcmp-ok  --with-blas-lapack-lib="-L/opt/acml/5.3.1/gfortran64/lib  -lacml" --COPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --FOPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --CXXOPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --with-x="0 " --with-debugging=0 --with-clib-autodetect="0 " --with-cxxlib-autodetect="0 " --with-fortranlib-autodetect="0 " --with-shared-libraries="0 " --with-mpi-compilers="1 " --with-cc="cc " --with-cxx="CC " --with-fc="ftn " --download-hypre=1 --download-blacs="1 " --download-scalapack="1 " --download-superlu_dist="1 " --download-metis="1 " --download-parmetis="1 " PETSC_ARCH=gnu-opt-32idx
> 
> Libraries compiled on Wed Jun  3 12:14:19 2015 on h2ologin2
> Machine characteristics: Linux-3.0.101-0.46-default-x86_64-with-SuSE-11-x86_64
> Using PETSc directory: /mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4
> Using PETSc arch: gnu-opt-32idx
> -----------------------------------------
> 
> Using C compiler: cc  -march=bdver1 -fopenmp -O3 -ffast-math -fPIC  ${COPTFLAGS} ${CFLAGS}
> Using Fortran compiler: ftn  -march=bdver1 -fopenmp -O3 -ffast-math -fPIC   ${FOPTFLAGS} ${FFLAGS}
> -----------------------------------------
> 
> Using include paths: -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/include
> -----------------------------------------
> 
> Using C linker: cc
> Using Fortran linker: ftn
> Using libraries: -Wl,-rpath,/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -L/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -lpetsc -Wl,-rpath,/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -L/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -lsuperlu_dist_3.3 -lHYPRE -L/opt/acml/5.3.1/gfortran64/lib -lacml -lparmetis -lmetis -lpthread -lssl -lcrypto -ldl
> 
> What am I doing wrong?
> 
> Thanks,
> Michele
> 


