[petsc-users] [petsc-maint] Out of memory issue related to KSP.

Barry Smith bsmith at petsc.dev
Thu Jul 18 17:57:14 CDT 2024


  Sparse matrix factorizations generally take much less memory than dense factorizations. How much memory they require depends on the number of nonzeros in the original matrix and the nonzero pattern. The amount of memory required for the factorization can easily be 5 to 10 times that of the original matrix. You can start with smaller problems to "get a feeling" for how much memory is required for the factorization and then gradually increase the problem size to see what is achievable. For large problems with a direct solver, you will want to use MUMPS (./configure --download-mumps) and run it on multiple compute nodes.  Iterative solvers generally use much less memory and scale better for larger problems but are problem specific. 
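  A minimal sketch of what that looks like in code (the routine name SolveDirectMUMPS and the passed-in A, b, x are placeholders added here, not taken from this thread); the same selection can also be made entirely from the command line with -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps:

#include <petscksp.h>

/* Sketch: direct solve handing the LU factorization to MUMPS.
   Requires PETSc configured with --download-mumps. */
static PetscErrorCode SolveDirectMUMPS(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));                  /* one application of the (factored) preconditioner */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); /* use MUMPS instead of PETSc's native LU */
  PetscCall(KSPSetFromOptions(ksp));                       /* still honor command-line overrides */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}

Running the smaller test problems with -memory_view (or -log_view) prints how much memory PETSc allocated, which makes it easy to extrapolate the factorization cost before attempting the full-size problem.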

  Barry


> On Jul 18, 2024, at 6:23 PM, neil liu <liufield at gmail.com> wrote:
> 
> Thanks, Mark.
> I am using a sparse matrix from a second-order vector-basis FEM.
> Will this sparse matrix still use memory similar to a dense matrix?
> 
> Thanks,
> 
> Xiaodong 
> 
> 
> On Thu, Jul 18, 2024 at 6:00 PM Mark Adams <mfadams at lbl.gov> wrote:
>> keep on the list.
>> 
>> 
>> 
>> 
>> On Thu, Jul 18, 2024 at 5:06 PM neil liu <liufield at gmail.com> wrote:
>>> The matrix size (complex) is 611,432 x 611,432. 
>> 
>> A dense matrix of that size needs 8*(611,432)^2 bytes, about 3 terabytes, to store.
>> 
>> That is too big.
>> 
>> Mark
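As a quick sanity check of that estimate (a sketch added for illustration, not part of the original exchange; 8 bytes per real double, 16 bytes per complex double as used in this --with-scalar-type=complex build):

#include <stdio.h>

int main(void)
{
  const double n = 611432.0;                                 /* matrix dimension from the thread */
  printf("dense real    : %.2f TB\n",  8.0 * n * n / 1e12);  /* roughly 3 TB */
  printf("dense complex : %.2f TB\n", 16.0 * n * n / 1e12);  /* roughly 6 TB */
  return 0;
}

A sparse FEM matrix, by contrast, only stores its nonzeros plus the factorization fill Barry describes above.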
>>  
>>> 
>>> On Thu, Jul 18, 2024 at 4:58 PM Mark Adams <mfadams at lbl.gov> wrote:
>>>> How big is your matrix?
>>>> 
>>>> On Thu, Jul 18, 2024 at 4:53 PM neil liu <liufield at gmail.com> wrote:
>>>>>  
>>>>> Dear PETSc team,
>>>>> 
>>>>> I am trying to solve a complex linear system with PETSc KSP. When I commented out this piece of code, no errors came out. Could my code be affecting KSP? I am using a direct solver, -pc_type LU.
>>>>> 
>>>>> PetscErrorCode ElementInfo::solveLinearSystem() {
>>>>>   PetscFunctionBegin;
>>>>>   KSP ksp;
>>>>>   KSPCreate(PETSC_COMM_WORLD, &ksp);
>>>>>   KSPSetType(ksp, KSPFGMRES);
>>>>>   KSPSetOperators(ksp, A, A);
>>>>>   KSPSetFromOptions(ksp);
>>>>>   KSPSolve(ksp, b, x);
>>>>>   KSPDestroy(&ksp);
>>>>>   PetscFunctionReturn(PETSC_SUCCESS);
>>>>> }
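An error-checked variant of the snippet above (a sketch, assuming A, b, and x are members of ElementInfo as in the original) would wrap each call in PetscCall(), so the first failing call, e.g. an out-of-memory PetscMalloc(), reports a full error stack at the exact line instead of the failure only surfacing later:

PetscErrorCode ElementInfo::solveLinearSystem()
{
  KSP ksp;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));   /* each PetscCall() returns immediately on error */
  PetscCall(KSPSetType(ksp, KSPFGMRES));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}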
>>>>> 
>>>>> The output with -malloc_test and -malloc_view is attached. It shows the following errors:
>>>>> Line 5 in the attached file:
>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Out of memory. This could be due to allocating
>>>>> [0]PETSC ERROR: too large an object or bleeding by not properly
>>>>> [0]PETSC ERROR: destroying unneeded objects.
>>>>> [0] Maximum memory PetscMalloc()ed 19474398848 maximum size of entire process 740352000 (This only used about 20% of my 64 GB of memory.)
>>>>> Line 111 in the attached file:
>>>>> [0]PETSC ERROR: Memory requested 18446744069642786816  (This is too big.)
>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.21.1, unknown
>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named kirin.remcominc.com by xiaodong.liu Thu Jul 18 16:11:52 2024
>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --download-fblaslapack --download-mpich --with-scalar-type=complex --download-triangle
>>>>> [0]PETSC ERROR: #1 PetscMallocAlign() at /home/xiaod
