<div dir="ltr"><div class="gmail_default" style="font-family:comic sans ms,sans-serif">Thanks a lot.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif">Sincerely,</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif">Huq</div></div><br><div class="gmail_quote"><div dir="ltr">On Thu, Nov 29, 2018 at 9:00 PM Smith, Barry F. <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">No<br>
<br>
<br>
> On Nov 29, 2018, at 8:54 PM, Fazlul Huq <huq2090@gmail.com> wrote:
> 
> Sorry! I made a mistake in running the code.
> 
> It's actually working now, up to a matrix size of 9999999!
> 
> But when I try to go one order of magnitude larger, it gives me this error message:
> **************************************************************************************************************************
> Out of memory trying to allocate 799999992 bytes
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] HYPRE_SetupXXX line 322 /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
> [0]PETSC ERROR: [0] PCSetUp_HYPRE line 138 /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
> [0]PETSC ERROR: [0] PCSetUp line 894 /home/huq2090/petsc-3.10.2/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 304 /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: [0] KSPSolve line 678 /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:47:50 2018
> [0]PETSC ERROR: Configure options --download-hypre --download-mpich --with-64-bit-indices
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
> :
> system msg for write_line failure : Bad file descriptor
> **************************************************************************************************************************************
> 
> I am using a Dell XPS laptop with 32 GB of RAM and an 8th-generation Intel Core i7 processor.
> Since the RAM is 32 GB, is there any way to fit a problem one order of magnitude larger on my machine?
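> 
> For what it's worth, the failed allocation of 799999992 bytes is exactly 99999999 x 8, i.e. a single double-precision vector of that length is already about 0.8 GB. The stack above shows the failure inside PCSetUp_HYPRE, where BoomerAMG builds a hierarchy of coarser problems, and that setup typically needs several times the storage of the fine-grid matrix and vectors, so the total footprint is many times that 0.8 GB.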
> 
> Thanks.
> Sincerely,
> Huq
> 
> 
> 
> On Thu, Nov 29, 2018 at 8:26 PM Fazlul Huq <huq2090@gmail.com> wrote:
> Thanks.
> 
> I configured with 64-bit indices, and when I ran the code I got the following error message:
> ***********************************************************************************************************************
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Out of memory. This could be due to allocating
> [0]PETSC ERROR: too large an object or bleeding by not properly
> [0]PETSC ERROR: destroying unneeded objects.
> [0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152
> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> [0]PETSC ERROR: Memory requested 1614907707076
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018
> [0]PETSC ERROR: Configure options --download-hypre --download-mpich --with-64-bit-indices
> [0]PETSC ERROR: #1 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> [0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c
> [0]PETSC ERROR: #3 PetscMallocA() line 397 in /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c
> [0]PETSC ERROR: #4 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> [0]PETSC ERROR: #5 VecSetType() line 51 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c
> [0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #8 main() line 57 in /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c
> [0]PETSC ERROR: No PETSc Option Table entries
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55
> :
> system msg for write_line failure : Bad file descriptor
> ************************************************************************************************************************************
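> 
> The key line is "Memory requested 1614907707076": that is about 1.6 x 10^12 bytes, roughly 1.5 TB, some fifty times the 32 GB of physical RAM here, so the allocation cannot succeed no matter how PETSc is built. For -n 999999 it also suggests something in the setup is growing much faster than linearly with n.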
> 
> Thanks.
> Sincerely,
> Huq
> 
> On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. <bsmith@mcs.anl.gov> wrote:
> 
> 
> > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users <petsc-users@mcs.anl.gov> wrote:
> > 
> > Hello PETSc Developers,
> > 
> > I am trying to run the code (attached herewith) with the following command, and it works until the size of the matrix is 99999X99999. But when I try to run with 999999X999999, I get a weird result.
> 
> What is that "weird result"?
> 
> My guess is that for problems that large you need to ./configure PETSc with the additional option --with-64-bit-indices, since 32-bit integers are not large enough to hold the values needed for storing and accessing the sparse matrix.
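> 
> As a quick standalone check (the numbers here are illustrative, not taken from poisson_m.c): with the default build PetscInt is a 32-bit int, and INT32_MAX is 2147483647, so any count or index that must reach values around 10^12, say anything sized like n x n for n near 10^6, cannot be represented; --with-64-bit-indices makes PetscInt 64-bit.
> 
>     #include <stdio.h>
>     #include <stdint.h>
> 
>     int main(void)
>     {
>       int64_t n     = 999999;  /* the -n from the failing run */
>       int64_t count = n * n;   /* e.g. an n x n index space: 999998000001 */
>       printf("count     = %lld\n", (long long)count);
>       printf("INT32_MAX = %d\n", INT32_MAX);
>       printf("fits in 32 bits? %s\n", count <= INT32_MAX ? "yes" : "no");
>       return 0;
>     }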
> 
> Barry
> 
> > 
> > The command is:
> > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg -ksp_view_solution
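> > 
> > The file poisson_m.c is attached rather than pasted. As a rough, illustrative sketch (not the attached file itself, and with error checking omitted), the setup of such a 1D Poisson, i.e. tridiagonal, system in PETSc typically looks like:
> > 
> >     #include <petscksp.h>
> > 
> >     int main(int argc, char **argv)
> >     {
> >       Mat         A;
> >       PetscInt    n = 10, i, col[3];
> >       PetscScalar v[3] = {-1.0, 2.0, -1.0};
> > 
> >       PetscInitialize(&argc, &argv, NULL, NULL);
> >       PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL);
> > 
> >       /* tridiagonal matrix: at most 3 nonzeros per row */
> >       MatCreate(PETSC_COMM_WORLD, &A);
> >       MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
> >       MatSetFromOptions(A);
> >       MatSeqAIJSetPreallocation(A, 3, NULL);
> >       MatMPIAIJSetPreallocation(A, 3, NULL, 1, NULL);
> > 
> >       for (i = 0; i < n; i++) {
> >         col[0] = i - 1; col[1] = i; col[2] = i + 1;
> >         if (i == 0)          MatSetValues(A, 1, &i, 2, &col[1], &v[1], INSERT_VALUES);
> >         else if (i == n - 1) MatSetValues(A, 1, &i, 2, &col[0], &v[0], INSERT_VALUES);
> >         else                 MatSetValues(A, 1, &i, 3, col, v, INSERT_VALUES);
> >       }
> >       MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> >       MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
> > 
> >       /* KSP setup and solve go here */
> > 
> >       MatDestroy(&A);
> >       PetscFinalize();
> >       return 0;
> >     }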
> > Any suggestions are appreciated.
> > 
> > Thanks.
> > Sincerely,
> > Huq
> > 
> > 
> > -- 
> > 
> > Fazlul Huq
> > Graduate Research Assistant
> > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> > University of Illinois at Urbana-Champaign (UIUC)
> > E-mail: huq2090@gmail.com
> > <poisson_m.c>
> 
> 
> 
> -- 
> 
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090@gmail.com
> 
> 
> -- 
> 
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090@gmail.com


-- 
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090@gmail.com