[petsc-users] ERROR: Scalar value must be same on all processes

Vijay Gopal Chilkuri vijay.gopal.c at gmail.com
Tue May 26 00:41:04 CDT 2015


Thanks Barry,

You were right! When I removed the COPTFLAGS and recompiled, the same code
worked. It now seems to be working perfectly.

So somehow the optimization flags affect the code execution!?
Is this a known issue?
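
Flags like -no-prec-div and -no-prec-sqrt trade IEEE-accurate division and
square roots for speed, so an edge case can become a NaN or Inf under the
optimized build that never appears under the strict one. As a sketch of the
kind of guard that catches this early (CheckFiniteNorm is a hypothetical
helper, not part of the library), one can verify a norm is finite before it
is used as a scalar argument:

#include <petscvec.h>

/* Hypothetical guard: compute a 2-norm and fail immediately if it is
   NaN or Inf, instead of letting it propagate into BVScaleColumn. */
PetscErrorCode CheckFiniteNorm(Vec v, PetscReal *nrm)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = VecNorm(v, NORM_2, nrm);CHKERRQ(ierr);
  if (PetscIsInfOrNanReal(*nrm)) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FP, "Computed norm is NaN or Inf");
  PetscFunctionReturn(0);
}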


Thanks anyway,
 Vijay

On Sat, May 23, 2015 at 9:04 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    This error message is almost always because a NaN or Inf has crept into
> the numbers and then gotten into a norm or inner product. You can run with
> -fp_trap, or run a debugger and have it catch floating-point exceptions, to
> see the first point where the NaN or Inf appeared and help track down the
> cause. But do all this tracking down using a debug version of the library,
> without the extra optimization flags you have put into COPTFLAGS etc.
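>
>    As a sketch, the same trap can be enabled programmatically with
> PetscSetFPTrap (equivalent to the -fp_trap option; the surrounding main is
> schematic, not your ex1.c):
>
> #include <slepceps.h>
>
> int main(int argc, char **argv)
> {
>   PetscErrorCode ierr;
>
>   ierr = SlepcInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>   /* Abort at the first floating-point exception so the origin of the
>      NaN or Inf is caught before it reaches a norm or inner product. */
>   ierr = PetscSetFPTrap(PETSC_FP_TRAP_ON);CHKERRQ(ierr);
>   /* ... assemble operators and call EPSSolve() as usual ... */
>   ierr = SlepcFinalize();
>   return ierr;
> }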
>
>   Barry
>
> > On May 23, 2015, at 12:27 PM, Vijay Gopal Chilkuri <
> vijay.gopal.c at gmail.com> wrote:
> >
> > Hi,
> >
> > I'm having problems calculating a large number of eigenvalues (nev > 400)
> with SLEPc; I get perfect results with a small number of eigenvalues.
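> >
> > A minimal sketch of how nev is requested, assuming the standard EPS
> > workflow (A is the assembled operator; the names are schematic):
> >
> > EPS eps;
> > ierr = EPSCreate(PETSC_COMM_WORLD, &eps);CHKERRQ(ierr);
> > ierr = EPSSetOperators(eps, A, NULL);CHKERRQ(ierr);
> > ierr = EPSSetProblemType(eps, EPS_HEP);CHKERRQ(ierr);
> > /* nev > 400 eigenpairs requested; ncv and mpd left to SLEPc defaults */
> > ierr = EPSSetDimensions(eps, 400, PETSC_DEFAULT, PETSC_DEFAULT);CHKERRQ(ierr);
> > ierr = EPSSolve(eps);CHKERRQ(ierr);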
> >
> >
> > PETSc: Branch: origin/maint ; commit: 2b04bc0
> > SLEPc: Branch: origin/maint ; commit: e1f03d9
> >
> > This is the error message I get:
> >
> > Intel Parallel Studio XE 2013 loaded
> > Intel(R) MPI Library 4.1 (4.1.3.049) loaded
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Invalid argument
> > [0]PETSC ERROR: Scalar value must be same on all processes, argument # 3
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.5.3, unknown
> > [0]PETSC ERROR: ./ex1 on a arch-linux2-c-opt named eoscomp7 by vijayc
> Sat May 23 19:04:55 2015
> > [0]PETSC ERROR: Configure options --with-64-bit-indices --with-cc=mpiicc
> --with-fc=mpiifort --with-cxx=mpiicpc --with-debugging=1 --FOPTFLAGS="-O3
> -xAVX -fno-alias -no-prec-div -no-prec-sqrt -ip " --COPTFLAGS="-O3 -xAVX
> -fno-alias -no-prec-div -no-prec-sqrt -ip " --CXXOPTFLAGS="-O3 -xAVX
> -fno-alias -no-prec-div -no-prec-sqrt -ip " --download-fblaslapack
> --with-x=false
> > [0]PETSC ERROR: #1 BVScaleColumn() line 380 in
> /eos1/p1517/vijayc/slepc_basic/src/sys/classes/bv/interface/bvops.c
> > [0]PETSC ERROR: #2 EPSFullLanczos() line 200 in
> /eos1/p1517/vijayc/slepc_basic/src/eps/impls/krylov/krylov.c
> > [0]PETSC ERROR: #3 EPSSolve_KrylovSchur_Symm() line 56 in
> /eos1/p1517/vijayc/slepc_basic/src/eps/impls/krylov/krylovschur/ks-symm.c
> > [0]PETSC ERROR: #4 EPSSolve() line 99 in
> /eos1/p1517/vijayc/slepc_basic/src/eps/interface/epssolve.c
> > [0]PETSC ERROR: #5 main() line 143 in
> /users/p1517/vijayc/modelization/ntrou3/sites_18/obc_10/isz_0/ex1.c
> > [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> > application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
> >
> > thanks a lot,
> >  Vijay
>
>