[petsc-users] MatPtAPNumeric_MPIAIJ_MPIAIJ_scalable
Fande Kong
fdkong.jd at gmail.com
Mon Oct 1 10:01:15 CDT 2018
Thanks, Jed. I figured it out. I simply made P sparser, and then the
product no longer takes too much memory.
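As a concrete (non-PETSc) illustration of the effect, here is a small SciPy sketch; the 1-D Laplacian, the aggregate size, and the interpolation stencil widths are all made-up assumptions for the demo, not my actual P:

import scipy.sparse as sp

n, nc = 1000, 100                    # fine / coarse sizes, made up for the demo
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

def interpolation(width):
    # Each fine row i contributes to `width` consecutive coarse columns,
    # starting at its own aggregate i // (n // nc).
    rows, cols, vals = [], [], []
    for i in range(n):
        base = i // (n // nc)
        for c in range(base, min(base + width, nc)):
            rows.append(i); cols.append(c); vals.append(1.0)
    return sp.csr_matrix((vals, (rows, cols)), shape=(n, nc))

for width in (1, 4):                 # width 1: disjoint aggregates; width 4: fat overlap
    P = interpolation(width)
    C = P.T @ A @ P                  # Galerkin product P^T A P
    print(f"width {width}: nnz(P) = {P.nnz}, nnz(PtAP) = {C.nnz}")

The wider the column supports of P, the more coarse columns overlap through A, and the fill of P^T A P grows accordingly.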
Fande,
On Fri, Sep 28, 2018 at 9:59 PM Jed Brown <jed at jedbrown.org> wrote:
> It depends entirely on your matrices. For example, if A is an arrowhead
> matrix (graph of a star -- one hub and many leaves) then A^2 is dense.
> If you have particular stencils for A and P, then we could tell you the
> fill ratio.
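To see the arrowhead example above concretely, here is a small SciPy sketch; the size n = 500 and the unit entries are arbitrary choices for the illustration:

import scipy.sparse as sp

n = 500                              # arbitrary size, just for illustration
hub = sp.lil_matrix((n, n))
hub[0, :] = 1.0                      # dense first row: the hub touches every leaf
hub[:, 0] = 1.0                      # dense first column
hub.setdiag(1.0)
A = hub.tocsr()

A2 = A @ A                           # every (i, j) picks up A[i, 0] * A[0, j] != 0
print(f"nnz(A)   = {A.nnz}")         # 3n - 2 nonzeros, very sparse
print(f"nnz(A^2) = {A2.nnz}")        # n*n nonzeros, completely dense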
>
> Fande Kong <fdkong.jd at gmail.com> writes:
>
> > Hi All,
> >
> > I was wondering how much memory is required to get PtAP done. Do you have
> > any simple formula for this, so that I can get an estimate?
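[For a rough back-of-envelope estimate (an assumption on my part, not a formula PETSc promises): an AIJ matrix stores one PetscScalar value and one PetscInt column index per nonzero, so with double precision and --with-64-bit-indices=1 (as in the configure line below) the final product costs about 16 bytes per nonzero of P^T A P per process, on top of the intermediate A*P and the MatStash buffers used to send off-process rows during assembly.]

# Rough estimate only: 8-byte PetscScalar + 8-byte PetscInt per stored nonzero,
# ignoring row pointers, the intermediate A*P, and the MatStash send buffers
# that actually trigger the failure in the trace below.
bytes_per_nnz = 8 + 8
requested = 89148704                 # "Memory requested" reported in the log
print(requested / bytes_per_nnz)     # roughly 5.6e6 nonzeros' worth of storage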
> >
> >
> > Fande,
> >
> >
> > [132]PETSC ERROR: --------------------- Error Message
> > --------------------------------------------------------------
> > [132]PETSC ERROR: Out of memory. This could be due to allocating
> > [132]PETSC ERROR: too large an object or bleeding by not properly
> > [132]PETSC ERROR: destroying unneeded objects.
> > [132]PETSC ERROR: Memory allocated 0 Memory used by process 3249920
> > [132]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> > [132]PETSC ERROR: Memory requested 89148704
> > [132]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > for trouble shooting.
> > [132]PETSC ERROR: Petsc Release Version 3.9.4, unknown
> > [132]PETSC ERROR: ../../rattlesnake-opt on a arch-theta-avx512-64-opt named
> > nid03830 by fdkong Fri Sep 28 22:43:45 2018
> > [132]PETSC ERROR: Configure options --LIBS=-lstdc++
> > --known-64-bit-blas-indices=0 --known-bits-per-byte=8
> > --known-has-attribute-aligned=1 --known-level1-dcache-assoc=8
> > --known-level1-dcache-linesize=64 --known-level1-dcache-size=32768
> > --known-memcmp-ok=1 --known-mklspblas-supports-zero-based=0
> > --known-mpi-c-double-complex=1 --known-mpi-int64_t=1
> > --known-mpi-long-double=1 --known-mpi-shared-libraries=0
> > --known-sdot-returns-double=0 --known-sizeof-MPI_Comm=4
> > --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8
> > --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8
> > --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8
> > --known-sizeof-void-p=8 --known-snrm2-returns-double=0 --with-batch=1
> > --with-blaslapack-lib="-mkl
> > -L/opt/intel/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64"
> > --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC
> > --with-cxxlib-autodetect=0 --with-debugging=0 --with-fc=ftn
> > --with-fortranlib-autodetect=0 --with-hdf5=0 --with-memalign=64
> > --with-mpiexec=aprun --with-shared-libraries=0 --download-metis=1
> > --download-parmetis=1 --download-superlu_dist=1 --download-hypre=1
> > --download-ptscotch=1 COPTFLAGS="-O3 -xMIC-AVX512" CXXOPTFLAGS="-O3
> > -xMIC-AVX512" FOPTFLAGS="-O3 -xMIC-AVX512"
> > PETSC_ARCH=arch-theta-avx512-64-opt --with-64-bit-indices=1
> > [132]PETSC ERROR: #1 PetscSegBufferCreate() line 64 in
> > /gpfs/mira-home/fdkong/petsc/src/sys/utils/segbuffer.c
> > [132]PETSC ERROR: #2 PetscSegBufferCreate() line 64 in
> > /gpfs/mira-home/fdkong/petsc/src/sys/utils/segbuffer.c
> > [132]PETSC ERROR: #3 PetscSegBufferExtractInPlace() line 227 in
> > /gpfs/mira-home/fdkong/petsc/src/sys/utils/segbuffer.c
> > [132]PETSC ERROR: #4 MatStashScatterBegin_BTS() line 854 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/utils/matstash.c
> > [132]PETSC ERROR: #5 MatStashScatterBegin_Private() line 461 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/utils/matstash.c
> > [132]PETSC ERROR: #6 MatAssemblyBegin_MPIAIJ() line 683 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/impls/aij/mpi/mpiaij.c
> > [132]PETSC ERROR: #7 MatAssemblyBegin() line 5158 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/interface/matrix.c
> > [132]PETSC ERROR: #8 MatPtAPNumeric_MPIAIJ_MPIAIJ_scalable() line 262 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/impls/aij/mpi/mpiptap.c
> > [132]PETSC ERROR: #9 MatPtAP_MPIAIJ_MPIAIJ() line 172 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/impls/aij/mpi/mpiptap.c
> > [132]PETSC ERROR: #10 MatPtAP() line 9182 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/interface/matrix.c
> > [132]PETSC ERROR: #11 MatGalerkin() line 10615 in
> > /gpfs/mira-home/fdkong/petsc/src/mat/interface/matrix.c
> > [132]PETSC ERROR: #12 PCSetUp_MG() line 730 in
> > /gpfs/mira-home/fdkong/petsc/src/ksp/pc/impls/mg/mg.c
> > [132]PETSC ERROR: #13 PCSetUp_HMG() line 336 in
> > /gpfs/mira-home/fdkong/petsc/src/ksp/pc/impls/hmg/hmg.c
> > [132]PETSC ERROR: #14 PCSetUp() line 923 in
> > /gpfs/mira-home/fdkong/petsc/src/ksp/pc/interface/precon.c
> > [132]PETSC ERROR: #15 KSPSetUp() line 381 in
> > /gpfs/mira-home/fdkong/petsc/src/ksp/ksp/interface/itfunc.c
> > [136]PETSC ERROR: --------------------- Error Message
> > ----------------------------------------------------------
>