[petsc-users] ERROR: Scalar value must be same on all processes

Vijay Gopal Chilkuri vijay.gopal.c at gmail.com
Sat May 23 12:27:56 CDT 2015


Hi,

I'm having problems calculating a large number of eigenvalues (nev > 400) with
SLEPc; I get correct results with a small number of eigenvalues.
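
For context, here is a stripped-down sketch of a standard Krylov-Schur setup
asking for a large nev. This is not my actual ex1.c from the trace below (the
operator here is only a placeholder 1-D Laplacian and nev=400 is illustrative),
but it shows the kind of call sequence involved:

#include <slepceps.h>

int main(int argc,char **argv)
{
  Mat            A;
  EPS            eps;
  PetscInt       n = 10000,i,Istart,Iend,nconv;
  PetscErrorCode ierr;

  ierr = SlepcInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* Placeholder operator: symmetric tridiagonal 1-D Laplacian, assembled in parallel */
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A,&Istart,&Iend);CHKERRQ(ierr);
  for (i=Istart;i<Iend;i++) {
    if (i>0)   { ierr = MatSetValue(A,i,i-1,-1.0,INSERT_VALUES);CHKERRQ(ierr); }
    if (i<n-1) { ierr = MatSetValue(A,i,i+1,-1.0,INSERT_VALUES);CHKERRQ(ierr); }
    ierr = MatSetValue(A,i,i,2.0,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Eigensolver: symmetric problem, large nev; ncv and mpd left to defaults */
  ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);
  ierr = EPSSetDimensions(eps,400,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSGetConverged(eps,&nconv);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"Converged eigenpairs: %D\n",nconv);CHKERRQ(ierr);

  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = SlepcFinalize();
  return ierr;
}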


PETSc: Branch: origin/maint ; commit: 2b04bc0
SLEPc: Branch: origin/maint ; commit: e1f03d9


This is the error message I get:

Intel Parallel Studio XE 2013 loaded
Intel(R) MPI Library 4.1 (4.1.3.049) loaded
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Scalar value must be same on all processes, argument # 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.3, unknown
[0]PETSC ERROR: ./ex1 on a arch-linux2-c-opt named eoscomp7 by vijayc Sat May 23 19:04:55 2015
[0]PETSC ERROR: Configure options --with-64-bit-indices --with-cc=mpiicc --with-fc=mpiifort --with-cxx=mpiicpc --with-debugging=1 --FOPTFLAGS="-O3 -xAVX -fno-alias -no-prec-div -no-prec-sqrt -ip " --COPTFLAGS="-O3 -xAVX -fno-alias -no-prec-div -no-prec-sqrt -ip " --CXXOPTFLAGS="-O3 -xAVX -fno-alias -no-prec-div -no-prec-sqrt -ip " --download-fblaslapack --with-x=false
[0]PETSC ERROR: #1 BVScaleColumn() line 380 in /eos1/p1517/vijayc/slepc_basic/src/sys/classes/bv/interface/bvops.c
[0]PETSC ERROR: #2 EPSFullLanczos() line 200 in /eos1/p1517/vijayc/slepc_basic/src/eps/impls/krylov/krylov.c
[0]PETSC ERROR: #3 EPSSolve_KrylovSchur_Symm() line 56 in /eos1/p1517/vijayc/slepc_basic/src/eps/impls/krylov/krylovschur/ks-symm.c
[0]PETSC ERROR: #4 EPSSolve() line 99 in /eos1/p1517/vijayc/slepc_basic/src/eps/interface/epssolve.c
[0]PETSC ERROR: #5 main() line 143 in /users/p1517/vijayc/modelization/ntrou3/sites_18/obc_10/isz_0/ex1.c
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
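
As far as I can tell, the failing check is the collective-argument validation
on the scale factor passed to BVScaleColumn (argument 3), i.e. that value ends
up different on different ranks. For illustration only, here is a small
hypothetical helper (not a PETSc routine) that performs the same kind of
cross-rank consistency check; calling it on a suspect value, e.g.
CheckScalarCollective(PETSC_COMM_WORLD, alpha, "alpha"), would show whether
the value diverges between processes before the abort:

#include <petscsys.h>

/* Hypothetical helper (not part of PETSc): reports whether a scalar
   holds the same value on every process of a communicator. */
static PetscErrorCode CheckScalarCollective(MPI_Comm comm,PetscScalar value,const char *label)
{
  PetscReal      in[2],out[2];
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* out[0] = max over ranks of x, -out[1] = min over ranks of x;
     they coincide exactly when every rank holds the same value */
  in[0] = PetscRealPart(value);
  in[1] = -PetscRealPart(value);
  ierr  = MPI_Allreduce(in,out,2,MPIU_REAL,MPI_MAX,comm);CHKERRQ(ierr);
  if (out[0] != -out[1]) {
    ierr = PetscPrintf(comm,"Scalar '%s' differs across processes\n",label);CHKERRQ(ierr);
  } else {
    ierr = PetscPrintf(comm,"Scalar '%s' is consistent across processes\n",label);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

int main(int argc,char **argv)
{
  PetscMPIInt    rank;
  PetscScalar    alpha;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  /* deliberately make the value rank-dependent to show the check firing */
  alpha = (rank == 0) ? 1.0 : 1.0 + 1e-12;
  ierr  = CheckScalarCollective(PETSC_COMM_WORLD,alpha,"alpha");CHKERRQ(ierr);
  ierr  = PetscFinalize();
  return ierr;
}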

thanks a lot,
 Vijay
