[petsc-users] slepc trap for large matrix
Kannan, Ramakrishnan
kannanr at ornl.gov
Wed Jun 7 09:41:56 CDT 2017
Jose,
I am running in a supercomputer environment. I just do a "module load cray-petsc-64/3.7.4.0"; I don't compile PETSc myself.
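For reference, here is a minimal sketch of that module-based workflow (the module name is the one from my command above; which environment variables the module exports varies by system):

    module load cray-petsc-64/3.7.4.0   # the "-64" variant is the 64-bit-integer PETSc build
    module list                         # confirm which PETSc module is active
    echo $PETSC_DIR                     # modules typically export the PETSc install path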
--
Regards,
Ramki
On 6/7/17, 10:41 AM, "Jose E. Roman" <jroman at dsic.upv.es> wrote:
This option belongs to PETSc's configure, not SLEPc's configure.
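A minimal sketch of the intended two-step build (paths and the PETSC_ARCH name are illustrative): the index width is fixed when PETSc itself is configured, and SLEPc then inherits it from the PETSc it is built against.

    # configure and build PETSc with 64-bit indices first
    cd $PETSC_DIR
    ./configure PETSC_ARCH=arch-64idx --with-64-bit-indices=1
    make PETSC_ARCH=arch-64idx all
    # then configure SLEPc against that PETSc build, without the flag
    cd $SLEPC_DIR
    ./configure --prefix=/lustre/atlas/proj-shared/csc209/ramki/slepc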
Jose
> On 7 Jun 2017, at 16:37, Kannan, Ramakrishnan <kannanr at ornl.gov> wrote:
>
> Barry,
>
> Thanks for the kind response. I am building SLEPc 3.7.3, and when I configure with --with-64-bit-indices=1 I get the following error.
>
> ./configure --with-64-bit-indices=1 --prefix=/lustre/atlas/proj-shared/csc209/ramki/slepc
> ERROR: Invalid arguments --with-64-bit-indices=1
> Use -h for help
>
> When I run ./configure -h, I get the options listed below. Let me know if I am missing something.
>
> SLEPc Configure Help
> --------------------------------------------------------------------------------
> SLEPc:
> --with-clean=<bool> : Delete prior build files including externalpackages
> --with-cmake=<bool> : Enable builds with CMake (disabled by default)
> --prefix=<dir> : Specify location to install SLEPc (e.g., /usr/local)
> --DATAFILESPATH=<dir> : Specify location of datafiles (for SLEPc developers)
> ARPACK:
> --download-arpack[=<fname>] : Download and install ARPACK in SLEPc directory
> --with-arpack=<bool> : Indicate if you wish to test for ARPACK
> --with-arpack-dir=<dir> : Indicate the directory for ARPACK libraries
> --with-arpack-flags=<flags> : Indicate comma-separated flags for linking ARPACK
> BLOPEX:
> --download-blopex[=<fname>] : Download and install BLOPEX in SLEPc directory
> BLZPACK:
> --with-blzpack=<bool> : Indicate if you wish to test for BLZPACK
> --with-blzpack-dir=<dir> : Indicate the directory for BLZPACK libraries
> --with-blzpack-flags=<flags> : Indicate comma-separated flags for linking BLZPACK
> FEAST:
> --with-feast=<bool> : Indicate if you wish to test for FEAST
> --with-feast-dir=<dir> : Indicate the directory for FEAST libraries
> --with-feast-flags=<flags> : Indicate comma-separated flags for linking FEAST
> PRIMME:
> --download-primme[=<fname>] : Download and install PRIMME in SLEPc directory
> --with-primme=<bool> : Indicate if you wish to test for PRIMME
> --with-primme-dir=<dir> : Indicate the directory for PRIMME libraries
> --with-primme-flags=<flags> : Indicate comma-separated flags for linking PRIMME
> TRLAN:
> --download-trlan[=<fname>] : Download and install TRLAN in SLEPc directory
> --with-trlan=<bool> : Indicate if you wish to test for TRLAN
> --with-trlan-dir=<dir> : Indicate the directory for TRLAN libraries
> --with-trlan-flags=<flags> : Indicate comma-separated flags for linking TRLAN
> SOWING:
> --download-sowing[=<fname>] : Download and install SOWING in SLEPc directory
>
> --
> Regards,
> Ramki
>
>
> On 6/6/17, 9:06 PM, "Barry Smith" <bsmith at mcs.anl.gov> wrote:
>
>
> The resulting matrix has something like
>
>>>> 119999808*119999808*1.e-6
> 14399953920.036863
>
> nonzero entries, roughly 14.4 billion. It is possible that some integer operations are overflowing, since a signed 32-bit C int can only count up to about 2.1 billion.
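> A quick back-of-the-envelope check (shell arithmetic is 64-bit, so the check itself cannot overflow):
>
>     echo $(( 119999808 * 119999808 / 1000000 ))   # ~1.44e10 expected nonzeros
>     echo $(( 2**31 - 1 ))                         # 2147483647, the signed 32-bit limit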
>
> You can build with a different PETSC_ARCH value, passing PETSc's ./configure the additional option --with-64-bit-indices, and see if the problem is resolved.
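> One way to confirm that a given build really uses 64-bit indices (PETSC_DIR and PETSC_ARCH here are illustrative) is to look for the corresponding macro in the generated configuration header:
>
>     grep PETSC_USE_64BIT_INDICES $PETSC_DIR/$PETSC_ARCH/include/petscconf.h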
>
> Barry
>
>
>> On Jun 5, 2017, at 12:37 PM, Kannan, Ramakrishnan <kannanr at ornl.gov> wrote:
>>
>> I am running EPS for an NHEP problem on a matrix of size 119999808 x 119999808 and I am hitting the attached trap. This is a 1D row-distributed sparse uniform random matrix with 1e-6 nonzero density over 36 processes. It works fine for smaller matrices of size 1.2 million x 1.2 million. Let me know if you need more information.
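>> A hypothetical way to reproduce this with more diagnostics would be to rerun with the matrix information printed, so the global nonzero count appears in the output; the launcher, executable name, and size option below are placeholders for my driver:
>>
>>     aprun -n 36 ./eps_driver -n 119999808 -mat_view ::ascii_info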
>>
>> --
>> Regards,
>> Ramki
>>
>> <slepc.e609742.zip>
>