[petsc-users] Estimate memory needs for large grids

Jed Brown jed at jedbrown.org
Fri Apr 5 22:52:55 CDT 2019

Memory use will depend on the preconditioner.  This will converge very
slowly (i.e., never) without multigrid unless time steps are small.
Depending on how rough the coefficients are, you may be able to use
geometric multigrid, which has pretty low setup costs and memory
requirements.
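Geometric multigrid can be requested entirely from the command line.  A
sketch (the executable name, process count, and level count are
placeholders; -pc_type mg, -pc_mg_levels, -ksp_monitor, and -log_view are
standard PETSc options):

```shell
# Illustrative only: geometric multigrid preconditioner with 5 levels,
# GMRES as the outer Krylov method, convergence monitoring, and the
# performance/memory summary printed at the end of the run.
mpiexec -n 64 ./solver -pc_type mg -pc_mg_levels 5 \
    -ksp_type gmres -ksp_monitor -log_view
```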

To estimate memory with an arbitrary preconditioner, I would run a
smaller problem with the desired preconditioner and check its memory
use with -log_view.  From that you can extrapolate the total memory
requirements for the target job.

Sajid Ali via petsc-users <petsc-users at mcs.anl.gov> writes:

> Hi,
> I'm solving a simple linear equation [ u_t = A*u_xx + A*u_yy + F_t*u ] on
> a grid of size 55296x55296. I'm reading a vector of that size from an hdf5
> file and have the jacobian matrix as a modified 5-point stencil which is
> preallocated with the following
> ```
>   ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>   ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,M,M);CHKERRQ(ierr);
>   ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);
>   ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>   ierr = MatMPIAIJSetPreallocation(A,5,NULL,5,NULL);CHKERRQ(ierr);
>   ierr = MatSeqAIJSetPreallocation(A,5,NULL);CHKERRQ(ierr);
> ```
> Total number of elements is ~3e9 and the matrix size is ~9e9 (but only 5
> diagonals are non zeros). I'm reading F_t which has ~3e9 elements. I'm
> using double complex numbers and I've compiled with int64 indices.
> Thus, for the vector I need 55296x55296x2x8 bytes ~ 50 GB, and for the F
> vector, another 50 GB. For the matrix I need ~250 GB, plus some overhead
> for the solver.
> How do I estimate this overhead (and estimate how many nodes I would need
> to run this given the maximum memory per node (as specified by slurm's
> --mem option)) ?
> Thanks in advance for the help!
> -- 
> Sajid Ali
> Applied Physics
> Northwestern University
