[petsc-users] Crusher configure problem
Mark Adams
mfadams at lbl.gov
Fri Jan 28 12:18:22 CST 2022
Also, building in my home directory works fine, so this looks like a problem with
the scratch directory.
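For reference, one quick way to see how widespread the broken links are is to
walk the tree and try readlink() on every symlink. This is only a hypothetical
diagnostic sketch (not part of PETSc), assuming the failures are confined to
symlinks like the ones in the listing below:

#!/usr/bin/python3
# Hypothetical diagnostic: report every symlink under a directory whose
# target cannot be read (e.g. the "Cannot allocate memory" errors below).
import os
import sys

root = sys.argv[1] if len(sys.argv) > 1 else '.'
for dirpath, dirnames, filenames in os.walk(root):
    for name in dirnames + filenames:
        path = os.path.join(dirpath, name)
        if os.path.islink(path):
            try:
                os.readlink(path)
            except OSError as err:
                print('%s: %s' % (path, err))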
On Fri, Jan 28, 2022 at 1:15 PM Mark Adams <mfadams at lbl.gov> wrote:
> Something is very messed up on Crusher. I've never seen this "Cannot
> allocate memory" error before, but now I see it for everything:
>
> 13:11 1 main= crusher:/gpfs/alpine/csc314/scratch/adams/petsc2$ ll
> ls: cannot read symbolic link 'configure.log.bkp': Cannot allocate memory
> ls: cannot read symbolic link 'make.log': Cannot allocate memory
> ls: cannot read symbolic link 'configure.log': Cannot allocate memory
> total 25637
> drwxr-xr-x 8 adams adams 4096 Jan 20 08:51 arch-olcf-crusher
> drwxr-xr-x 8 adams adams 4096 Jan 18 20:23 arch-spock-amd
> drwxrwxr-x 5 adams adams 4096 Jan 26 19:10 arch-summit-dbg-gnu-cuda
> ....
>
> and
>
> 13:11 main= crusher:/gpfs/alpine/csc314/scratch/adams/petsc2$ git fetch
> remote: Enumerating objects: 648, done.
> remote: Counting objects: 100% (547/547), done.
> remote: Compressing objects: 100% (238/238), done.
> remote: Total 648 (delta 379), reused 442 (delta 308), pack-reused 101
> Receiving objects: 100% (648/648), 591.28 KiB | 1.38 MiB/s, done.
> Resolving deltas: 100% (402/402), completed with 114 local objects.
> error: cannot update the ref 'refs/remotes/origin/main': unable to append to '.git/logs/refs/remotes/origin/main': Cannot allocate memory
> From https://gitlab.com/petsc/petsc
>
> On Fri, Jan 28, 2022 at 1:07 PM Mark Adams <mfadams at lbl.gov> wrote:
>
>> Crusher has been giving me fits, and now I get this error (the log is empty):
>>
>> 13:01 main *= crusher:/gpfs/alpine/csc314/scratch/adams/petsc$
>> ../arch-olcf-crusher.py
>> Traceback (most recent call last):
>>   File "../arch-olcf-crusher.py", line 55, in <module>
>>     configure.petsc_configure(configure_options)
>> AttributeError: module 'configure' has no attribute 'petsc_configure'
>>
>> This is a modified version of the configure script from the repo, and it was
>> working this morning.
>> Any idea what could cause this?
>>
>> Thanks,
>> Mark
>>
>> #!/usr/bin/python3
>>
>> # Modules loaded by default (on login to Crusher):
>> #
>> #  1) craype-x86-trento                       9) cce/13.0.0
>> #  2) libfabric/1.13.0.0                     10) craype/2.7.13
>> #  3) craype-network-ofi                     11) cray-dsmml/0.2.2
>> #  4) perftools-base/21.12.0                 12) cray-libsci/21.08.1.2
>> #  5) xpmem/2.3.2-2.2_1.16__g9ea452c.shasta  13) PrgEnv-cray/8.2.0
>> #  6) cray-pmi/6.0.16                        14) DefApps/default
>> #  7) cray-pmi-lib/6.0.16                    15) rocm/4.5.0
>> #  8) tmux/3.2a                              16) cray-mpich/8.1.12
>> #
>> # We use the Cray Programming Environment, Cray compilers, and cray-mpich.
>> # To enable GPU-aware MPI, one also has to set this environment variable:
>> #
>> #   export MPICH_GPU_SUPPORT_ENABLED=1
>> #
>> # Additional note: if the "craype-accel-amd-gfx90a" module is loaded (it is
>> # needed for OpenMP offload), it causes link errors when using 'cc' or
>> # 'hipcc' with Fortran objects, hence it is not used.
>> #
>>
>> if __name__ == '__main__':
>>   import sys
>>   import os
>>   sys.path.insert(0, os.path.abspath('config'))
>>   import configure
>>   configure_options = [
>>     '--with-cc=cc',
>>     '--with-cxx=CC',
>>     '--with-fc=ftn',
>>     '--COPTFLAGS=-g -ggdb',
>>     '--CXXOPTFLAGS=-g -ggdb',
>>     '--FOPTFLAGS=-g',
>>     '--with-fortran-bindings=0',
>>     'LIBS=-L{x}/gtl/lib -lmpi_gtl_hsa'.format(x=os.environ['CRAY_MPICH_ROOTDIR']),
>>     '--with-debugging=1',
>>     #'--with-64-bit-indices=1',
>>     '--with-mpiexec=srun -p batch -N 1 -A csc314_crusher -t 00:10:00',
>>     '--with-hip',
>>     '--with-hipc=hipcc',
>>     '--download-hypre',
>>     #'--download-hypre-commit=HEAD',
>>     #'--download-hypre-configure-arguments=--enable-unified-memory',
>>     #'--with-hypre-gpuarch=gfx90a',
>>     '--with-hip-arch=gfx90a',
>>     '--download-kokkos',
>>     '--download-kokkos-kernels',
>>     #'--with-kokkos-kernels-tpl=1',
>>     #'--prefix=/gpfs/alpine/world-shared/geo127/petsc/arch-crusher-opt-cray', # /gpfs/alpine/phy122/proj-shared/petsc/current/arch-opt-amd-hypre',
>>   ]
>>   configure.petsc_configure(configure_options)
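A note on the AttributeError above: "import configure" resolving to a module
without petsc_configure usually means Python picked up something other than
PETSc's config/configure.py, or that the file itself was truncated or
corrupted (plausible here, given the broken files on scratch). A minimal
hedged check, assuming the script is run from a PETSc source tree:

#!/usr/bin/python3
# Hypothetical check: confirm which 'configure' module Python actually imports.
import os
import sys

sys.path.insert(0, os.path.abspath('config'))
import configure

# Expect something like .../petsc/config/configure.py; anything else, or a
# missing petsc_configure attribute, means the wrong (or a damaged) module
# was imported.
print(configure.__file__)
print(hasattr(configure, 'petsc_configure'))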