[petsc-users] Problem with large grid size

Fazlul Huq huq2090 at gmail.com
Thu Nov 29 21:06:44 CST 2018


Thanks a lot.

Sincerely,
Huq

On Thu, Nov 29, 2018 at 9:00 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

> No
>
>
> > On Nov 29, 2018, at 8:54 PM, Fazlul Huq <huq2090 at gmail.com> wrote:
> >
> > Sorry! I made a mistake in running the code.
> >
> > It's actually working now up to a matrix size of 9999999!
> >
> > But when I try to extend it by one order of magnitude, it gives me this error message:
> >
> **************************************************************************************************************************
> > Out of memory trying to allocate 799999992 bytes
> > [0]PETSC ERROR:
> ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> > [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> > [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
> > [0]PETSC ERROR: likely location of problem given in stack below
> > [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> > [0]PETSC ERROR:       INSTEAD the line number of the start of the
> function
> > [0]PETSC ERROR:       is given.
> > [0]PETSC ERROR: [0] HYPRE_SetupXXX line 322
> /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
> > [0]PETSC ERROR: [0] PCSetUp_HYPRE line 138
> /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
> > [0]PETSC ERROR: [0] PCSetUp line 894
> /home/huq2090/petsc-3.10.2/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSPSetUp line 304
> /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: [0] KSPSolve line 678
> /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Signal received
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> > [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named
> huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:47:50 2018
> > [0]PETSC ERROR: Configure options --download-hypre --download-mpich
> --with-64-bit-indices
> > [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> > [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
> > :
> > system msg for write_line failure : Bad file descriptor
> >
> **************************************************************************************************************************************
> >
> > I am using a Dell XPS laptop with 32 GB RAM and an Intel 8th-generation
> Core i7 processor.
> > Since the RAM is 32 GB, is there any way to run an even larger
> problem on my machine?
> >
> > Thanks.
> > Sincerely,
> > Huq
> >
> >
> >
> > On Thu, Nov 29, 2018 at 8:26 PM Fazlul Huq <huq2090 at gmail.com> wrote:
> > Thanks.
> >
> > I have configured with 64-bit indices, and when I run, I get the
> following error message:
> >
> ***********************************************************************************************************************
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Out of memory. This could be due to allocating
> > [0]PETSC ERROR: too large an object or bleeding by not properly
> > [0]PETSC ERROR: destroying unneeded objects.
> > [0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152
> > [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> > [0]PETSC ERROR: Memory requested 1614907707076
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> > [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named
> huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018
> > [0]PETSC ERROR: Configure options --download-hypre --download-mpich
> --with-64-bit-indices
> > [0]PETSC ERROR: #1 VecCreate_Seq() line 35 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> > [0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in
> /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c
> > [0]PETSC ERROR: #3 PetscMallocA() line 397 in
> /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c
> > [0]PETSC ERROR: #4 VecCreate_Seq() line 35 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> > [0]PETSC ERROR: #5 VecSetType() line 51 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c
> > [0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> > [0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> > [0]PETSC ERROR: #8 main() line 57 in
> /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c
> > [0]PETSC ERROR: No PETSc Option Table entries
> > [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> > application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
> > [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55
> > :
> > system msg for write_line failure : Bad file descriptor
> >
> ************************************************************************************************************************************
> >
> > Thanks.
> > Sincerely,
> > Huq
> >
> > On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> >
> >
> > > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
> > >
> > > Hello PETSc Developers,
> > >
> > > I am trying to run the code (attached herewith) with the following
> command, and it works up to a matrix size of 99999x99999. But when
> I try to run with 999999x999999, I get a weird result.
> >
> >    What is that "weird result"?
> >
> >    My guess is that for problems that large you need to ./configure PETSc
> with the additional option --with-64-bit-indices, since for such large
> problems 32-bit integers are not large enough to hold the values needed for
> storing and accessing the sparse matrix.
> >
> >    Barry
> >
> > >
> > > The command is:
> > > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg
> -ksp_view_solution
> > > Any suggestions are appreciated.
> > >
> > > Thanks.
> > > Sincerely,
> > > Huq
> > >
> > >
> > > --
> > >
> > > Fazlul Huq
> > > Graduate Research Assistant
> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> > > University of Illinois at Urbana-Champaign (UIUC)
> > > E-mail: huq2090 at gmail.com
> > > <poisson_m.c>
> >
> >
> >
>
>

-- 

Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com