[petsc-users] Problem with large grid size
Fazlul Huq
huq2090 at gmail.com
Fri Nov 30 11:27:20 CST 2018
Thanks.
Now I can run up to n=10^7 with -pc_type cholesky as well as with -pc_type
hypre -pc_hypre_type boomeramg.
Actually, I am trying to find out how much time and how many FLOPs it takes
to solve the problem with multigrid for different problem sizes.
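(For reference, one way to collect both numbers is PETSc's -log_view option,
which prints per-event times and flop counts at the end of the run; a sketch
of the kind of invocation I mean, reusing the executable name from the run
above:

    ./poisson_m -n 10000000 -pc_type hypre -pc_hypre_type boomeramg -log_view

The summary table lists time and flops for events such as KSPSolve and
PCApply, which can then be compared across problem sizes.)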
The problem I am trying to solve is attached herewith. I am not quite sure
whether the way I have set the boundary conditions is correct.
(The BCs are given at x0 and xn, but I am solving only for x1 to x(n-1),
while still setting the BCs at x0 and xn.)
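A minimal sketch of one common way to handle this, assuming a uniform grid
with spacing h, Dirichlet values u0 and uN, and n-1 interior unknowns (all of
the names below are illustrative placeholders):

    /* Interior row i of -u'' = f with second-order differences,
       multiplied through by h^2:
           -x_{i-1} + 2 x_i - x_{i+1} = h^2 f_i,   i = 1 .. n-1.
       The first and last interior rows lose a neighbor, so the known
       boundary values u0 and uN move to the right-hand side. */
    PetscScalar u0 = 1.0, uN = 0.0;          /* assumed boundary values     */
    PetscReal   h  = 1.0 / n;                /* assumed grid spacing        */
    PetscInt    m  = n - 1;                  /* number of interior unknowns */
    for (PetscInt i = 0; i < m; i++) {
      PetscScalar rhs = f[i] * h * h;        /* f[]: assumed source values  */
      if (i == 0)     rhs += u0;             /* left neighbor x0 is known   */
      if (i == m - 1) rhs += uN;             /* right neighbor xn is known  */
      ierr = VecSetValue(b, i, rhs, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = VecAssemblyBegin(b);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(b);CHKERRQ(ierr);

With this convention the matrix keeps the constant (-1, 2, -1) stencil in
every interior row; only the right-hand side changes in the first and last
rows.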
It would be a great help if you could refer me to some books I can go
through to make my understanding clearer.
Thanks.
Sincerely,
Huq
On Fri, Nov 30, 2018 at 10:46 AM Jed Brown <jed at jedbrown.org> wrote:
> "Smith, Barry F. via petsc-users" <petsc-users at mcs.anl.gov> writes:
>
> > You need to run it on more processors; this one processor doesn't
> have enough memory to fit the vectors (which, by the way, are huge:
> 1,614,907,707,076).
>
> This is just a tridiagonal problem; I don't know why the vectors would
> be huge when the problem dimension is only a million.
>
> Fazlul, is this your target problem? The tridiagonal problem can be
> solved trivially using Cholesky (-pc_type cholesky; or LU) -- the n=10^7
> case takes perhaps a second in serial (and this can be made faster).
> The convergence is slow with local preconditioners because information
> travels only one element at a time. Note that I don't get any
> memory-related error messages with a 32-bit build of PETSc, but perhaps
> I'm not running the same way as you are.
>
> The errors reported by your code are wrong because your stated exact
> solution and RHS are not actually compatible.
>
> > Barry
> >
> >
> >> On Nov 29, 2018, at 8:26 PM, Fazlul Huq <huq2090 at gmail.com> wrote:
> >>
> >> Thanks.
> >>
> >> I have configured with 64-bit and then when I run, I got the following
> error message:
> >>
> ***********************************************************************************************************************
> >> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> >> [0]PETSC ERROR: Out of memory. This could be due to allocating
> >> [0]PETSC ERROR: too large an object or bleeding by not properly
> >> [0]PETSC ERROR: destroying unneeded objects.
> >> [0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152
> >> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> >> [0]PETSC ERROR: Memory requested 1614907707076
> >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> >> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> >> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named
> huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018
> >> [0]PETSC ERROR: Configure options --download-hypre --download-mpich
> --with-64-bit-indices
> >> [0]PETSC ERROR: #1 VecCreate_Seq() line 35 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> >> [0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in
> /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c
> >> [0]PETSC ERROR: #3 PetscMallocA() line 397 in
> /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c
> >> [0]PETSC ERROR: #4 VecCreate_Seq() line 35 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
> >> [0]PETSC ERROR: #5 VecSetType() line 51 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c
> >> [0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> >> [0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in
> /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
> >> [0]PETSC ERROR: #8 main() line 57 in
> /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c
> >> [0]PETSC ERROR: No PETSc Option Table entries
> >> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> >> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
> >> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55
> >> :
> >> system msg for write_line failure : Bad file descriptor
> >>
> ************************************************************************************************************************************
> >>
> >> Thanks.
> >> Sincerely,
> >> Huq
> >>
> >> On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> >>
> >>
> >> > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
> >> >
> >> > Hello PETSc Developers,
> >> >
> >> > I am trying to run the code (attached herewith) with the following
> command and it works until the size of the matrix is 99999x99999. But when
> I try to run with 999999x999999, I get a weird result.
> >>
> >> What is that "weird result"?
> >>
> >> My guess is that for problems that large you need to ./configure PETSc
> with the additional option --with-64-bit-indices, since for such large
> problems 32-bit integers are not large enough to hold the values needed for
> storing and accessing the sparse matrix.
> >>
> >> Barry
> >>
> >> >
> >> > The command is:
> >> > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg
> -ksp_view_solution
> >> > Any suggestions are appreciated.
> >> >
> >> > Thanks.
> >> > Sincerely,
> >> > Huq
> >> >
> >> >
> >> > --
> >> >
> >> > Fazlul Huq
> >> > Graduate Research Assistant
> >> > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> >> > University of Illinois at Urbana-Champaign (UIUC)
> >> > E-mail: huq2090 at gmail.com
> >> > <poisson_m.c>
> >>
> >>
> >>
> >> --
> >>
> >> Fazlul Huq
> >> Graduate Research Assistant
> >> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> >> University of Illinois at Urbana-Champaign (UIUC)
> >> E-mail: huq2090 at gmail.com
>
--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
[Attachment: poisson_problem.jpg (image/jpeg, 2,321,815 bytes), available at
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20181130/098967a8/attachment-0001.jpg>]