[petsc-users] strange out memory issues or bus errors when increasing pb size
Aurelien Ponte
aurelien.ponte at ifremer.fr
Fri Dec 16 11:59:59 CST 2016
Thanks Barry for the prompt reply.
I guess I'll have to recompile the code then.
I am actually running on a cluster; you'll find a list of the installed
modules below.
If anyone has tips on the best way to compile petsc4py and petsc (and
numpy and netcdf4-python, actually) with this configuration, I'd be most
grateful.
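For reference, here is roughly what I understand such a build to look like.
This is a sketch only: the module choices, paths and configure options below
are assumptions I have not tested on this machine.

    # load a toolchain from the module list below (names are examples)
    module load gcc/4.9.2 python/2.7.10_gnu-4.9.2

    # configure and build PETSc with 64-bit indices; --download-mpich
    # avoids depending on a system MPI built with a different compiler
    cd /path/to/petsc
    ./configure PETSC_ARCH=arch-64idx --with-64-bit-indices \
        --download-mpich --download-fblaslapack
    make PETSC_DIR=$PWD PETSC_ARCH=arch-64idx all

    # then build petsc4py from source against this PETSc build
    export PETSC_DIR=/path/to/petsc
    export PETSC_ARCH=arch-64idx
    cd /path/to/petsc4py
    python setup.py build
    python setup.py install --user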
thanks
aurelien
service7>414% module avail
-------------------------- /appli/modulefiles ---------------------------
Automake/1.14_gcc-4.8.0
BANDELA/1011
Bash/4.3_gcc-5.3.0
EcmwfTools/emos__000392-intel-12.1.5
EcmwfTools/grib_api__1.11.0-intel-12.1.5
Go/1.1.1
Go/1.2.1
Go/1.3
Go/1.4.0
Go/1.4.1
Go/1.5.1
GrADS/grads-2.0.2
JAGS/3.3.0-intel-11.1.073
JAGS/4.0.0-intel-15.0.090-seq
JAGS/4.0.0-intel-15.0.090-thread
Latex/20150521__gcc-4.9.2
MPlayer/1.2__gcc-4.9.2
Migrate-N/3.3.2__gcc-4.7.1_mpt-2.06
Octave/4.0.2_gcc-5.3.0
OpenBUGS/3.2.1-intel-11.1.073
Perl/5.22.1_gcc-5.3.0
R/2.11.1-gnu
R/2.11.1-intel
R/2.14.2-gnu-4.3
R/2.14.2-intel-11.1.073
R/2.15.0-gnu-4.3
R/2.15.0-intel-11.1.073
R/2.15.3-gnu-4.8.0
R/2.15.3-intel-12.1.5
R/3.0.1-intel-12.1.5
R/3.0.2-intel-14.0.0
R/3.0.3-intel-12.1.5
R/3.0.3-intel-14.0.0
R/3.1.2-intel-15.0.090
R/3.2.2-intel-12.1.5
R/3.2.3-intel-12.1.5
R/3.2.4-intel-12.1.5
R/patched-2.14.1-25.1-gnu
Saturne/4.0.3_2015.3.187_5.0.3.048
SpecFEM2D/20141231-intel-15.0.90
TVD/8.2.0-1
TVD/8.4.1-5
TVD/8.6.0-2
TVD/recent
anaconda/3
anaconda/uv2
cdo/1.5.6.1_gcc-4.7.0
cmake/2.8.8
code_aster/10.6.0-3
cuda/2.10
cuda/2.3
cuda/4.2.9
cuda/5.5.25
ddt/2.6
ddt/4.1.1
ddt/4.2.2
exiv2/0.24-gcc-4.8.2
ffmpeg/ffmpeg-2.2.1
gcc/4.2.2
gcc/4.7.0
gcc/4.7.1
gcc/4.8.0
gcc/4.8.2
gcc/4.8.4
gcc/4.9.2
gcc/5.3.0
gerris/1.3.2
gperf/3.0.4
gsl/1.14-intel-11.1.073
gsl/1.15-gcc-4.6.3
gsl/1.15-intel-12.1.5
hdf5/1.8.8-intel-11.1.073
hdf5/hdf5-1.8.12_intel-14.0.0
hdf5/hdf5-1.8.12_intel-14.0.0_mpi-4.0.0.028
hdf5/intel-10.1.008
hmpp/2.0.0
hmpp/2.1.0sp1
hmpp/2.2.0
idv/3.1
intel-comp/11.1.073
intel-comp/12.1.5
intel-comp/14.0.0
intel-comp/2015.0.090
intel-comp/2015.3.187
intel-mpi/4.0.0.028
intel-mpi/5.0.3.048
java/1.5.0
java/1.6.0
java/1.7.0
java/1.8.0
matlab/2006b
matlab/2007b
matlab/2009b
matlab/2011b
matlab/2013b
matlab/2013c
mkl/10.3
mpinside/3.5.3
mpt/0test
mpt/1.21
mpt/1.23
mpt/1.24
mpt/1.25
mpt/2.01
mpt/2.04
mpt/2.06
mpt/2.08
ncarg-4.2.2/gnu
ncarg-4.2.2/intel-10.0.025
ncarg-4.2.2/intel-10.0.026
ncarg-4.2.2/intel-10.1.008
ncarg-4.2.2/pgi-7.1-1
ncltest/5.2.1
ncltest/6.0.0
nco/4.2.1_gcc-4.7.0
nco/4.3.4-intel12.1.5
nco/4.4.2_gcc-4.8.0
ncview/2.1.2_gcc-4.7.0
ncview/2.1.2_intel
netCDF/4.0-intel-11.1.073
netCDF/4.1.3-intel-11.1.073
netCDF/4.2.1-gcc-4.7.0
netCDF/4.2.1-intel-11.1.073
netCDF/4.2.1-intel-11.1.073_mpi-4.0.0.028
netCDF/4.2.1-intel-11.1.073_mpt-2.06
netCDF/4.2.1-intel-12.1.5
netCDF/4.2.1-intel-12.1.5_mpt-2.06
netCDF/4.2.1.1-intel-12.1.5_mpt-2.06
netCDF/4.2.1.1-intel-12.1.5_mpt-2.06_p
netCDF/4.2.1.1-intel-12.1.5_mpt-2.06_pp
netCDF/4.2.1.1-intel-12.1.5_mpt-2.06_ppp
netCDF/impi5
netCDF/impi5-debug
netCDF/impi5_4.2
netcdf-gcc/3.6.2
netcdf-gcc/3.6.3
netcdf-intel/3.6.3-11.1.073
netcdf-intel/3.6.3-11.1.073-fpic
netcdf-pgi/3.6.2-7.0-7
netcdf-pgi/3.6.2-7.1-1
old/R/2.1.1
old/R/2.8.1
old/cmkl/10.0.011
old/cmkl/10.0.3.020
old/cmkl/10.0.5.025
old/cmkl/10.1.3.027
old/cmkl/10.2.2.025
old/cmkl/9.1.021
old/cmkl/phase2
old/cmkl/recent
old/ddt/2.1
old/ddt/2.2
old/ddt/2.3
old/ddt/2.4.1
old/ddt/2.5.1
old/ddt/recent
old/gcc/4.2.2
old/intel/10.0.025
old/intel/10.0.026
old/intel/10.1.008
old/intel/10.1.015
old/intel/10.1.018
old/intel/newest
old/intel/recent
old/intel-cc/10.0.025
old/intel-cc/10.0.026
old/intel-cc/10.1.008
old/intel-cc/10.1.015
old/intel-cc/10.1.018
old/intel-cc/11.0.081
old/intel-cc/11.0.083
old/intel-cc/11.1.038
old/intel-cc/11.1.073
old/intel-cc/9.1.045
old/intel-cc-10/10.0.025
old/intel-comp/11.0.081
old/intel-comp/11.0.083
old/intel-comp/11.1.038
old/intel-comp/11.1.046
old/intel-comp/11.1.059
old/intel-fc/10.0.025
old/intel-fc/10.0.026
old/intel-fc/10.1.008
old/intel-fc/10.1.015
old/intel-fc/10.1.018
old/intel-fc/11.0.081
old/intel-fc/11.0.083
old/intel-fc/11.1.038
old/intel-fc/11.1.073
old/intel-fc/9.1.045
old/intel-fc-10/10.0.025
old/intel-mpi/3.0.043
old/intel-mpi/3.1
old/intel-mpi/3.2.0.011
old/intel-mpi/3.2.1.009
old/intel-mpi/3.2.2.006
old/mvapich2/intel
old/netcdf-intel/3.6.2-10.0.025
old/netcdf-intel/3.6.2-10.0.026
old/netcdf-intel/3.6.2-10.1.008
old/netcdf-intel/3.6.3-11.1.038
old/netcdf-intel-10/3.6.2
old/openmpi/intel
pfmt/1.3
pgi/16.3
pgi/7.0-7
pgi/7.1-1
pgi/7.1-2
pgi/7.2
pgi/8.0-4
pgi/pgi/16.3
pgi/pgi/8.0-6
pgi/pgi32/8.0-6
pgi/pgi64/8.0-6
pnetCDF/pnetcdf-1.3.1__intel-12.1.5_mpi-4.0.0.028
proj/4.8.0-intel-12.1.5
python/2.7.10_gnu-4.9.2
python/2.7.3_gnu-4.7.0
python/2.7.3_gnu-4.7.1
python/2.7.5_gnu-4.8.0
scilab/scilab-5.4.1
szip/2.1-intel-11.1.073
udunits/1.12.11-intel-11.1.073
udunits/2.1.19-intel-11.1.073
udunits/2.1.24-intel-12.1.5
unigifsicle/1.39-719.16
uv-cdat/1.0.1next
valgrind/valgrind-3.11.0__gcc-4.9.2
valgrind/valgrind-3.11.0__gcc-4.9.2__intel-mpi.5.0.3.048
valgrind/valgrind-3.8.1__gcc.4.8.0
vtune/2013
wgrib2/netcdf3/intel-11.1.073
xios/1.0
zlib/1.2.6-intel-11.1.073
On 16/12/2016 at 15:56, Barry Smith wrote:
>> On Dec 16, 2016, at 8:45 AM, Aurelien Ponte <aurelien.ponte at ifremer.fr> wrote:
>>
>> Hi,
>>
>> I am inverting a 3D elliptic operator with petsc4py (3.4; petsc is 3.4.5)
>> installed via conda: https://anaconda.org/sed-pro-inria/petsc4py
>>
>> I get systematic crashes (out of memory or bus error) when I reach a certain
>> grid size (512 x 256 x 100), even though I maintain the same number of grid
>> points per processor.
>> I used up to 256 procs and got similar crashes.
>>
>> Browsing the internet indicates that using 64-bit indices may be a cure for
>> such problems.
> Yes, this is the problem.
>
>> It will take however a significant amount of effort for me to install
>> petsc4py and petsc with this option.
> Hopefully another user knows an easy way to install petsc4py with 64-bit indices.
>
>> I do not even know how to check whether
>> my current versions of petsc4py and petsc were built with it.
> It was not.
>> Would you have any tips or recommendations about how I could address the issue?
>>
>> Thanks,
>>
>> Aurelien
>>
>>
>> --
>> Aurélien Ponte
>> Tel: (+33) 2 98 22 40 73
>> Fax: (+33) 2 98 22 44 96
>> UMR 6523, IFREMER
>> ZI de la Pointe du Diable
>> CS 10070
>> 29280 Plouzané
>>
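For the question above about checking an existing install: if the PETSc
headers ship with the conda package, one way (untested, and the paths are
guesses) is to look for the 64-bit flag in petscconf.h:

    # PETSC_USE_64BIT_INDICES is defined in petscconf.h only when PETSc
    # was configured with --with-64-bit-indices
    find $CONDA_PREFIX -name petscconf.h \
        | xargs grep PETSC_USE_64BIT_INDICES

If petscconf.h is found but the macro is absent, the build uses 32-bit
indices. As a sanity check on the grid size: 512 x 256 x 100 is about
1.3e7 unknowns, which by itself fits in a 32-bit integer, but derived
counts (e.g. total nonzeros after solver fill, or internal offsets) grow
much faster and can exceed 2^31 - 1 ~ 2.1e9, which would be consistent
with the crashes.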
--
Aurélien Ponte
Tel: (+33) 2 98 22 40 73
Fax: (+33) 2 98 22 44 96
UMR 6523, IFREMER
ZI de la Pointe du Diable
CS 10070
29280 Plouzané