memory usage of the unstructured grid

Zhifeng Sheng z.sheng at
Thu Nov 1 07:45:57 CDT 2007

Shi Jin wrote:

> Hi,
> I have observed that the memory usage of the petsc mesh is much higher 
> than my previous code, if both were to be run serially.
> For example, for a simple cubic box with 750,000 tetrahedral elements, 
> my old code takes about 200MB for the whole array, including all the 
> mappings  required for later use such as the inverse connectivity  
> table.  For the same mesh, my PETSc code takes about 4GB for the mesh 
> alone.
> The same can be found in the provided examples. I made a few changes 
> to the navierStokes code to output the virtual memory usage and got
> ./navierStokes -dim 3 -generate  -structured 0 -refinement_limit 1e-6
>     109,283 elements,  139,030 edges , 21,523 vertexes
>     [0]:after mesh created:mem=574.46 MB
> This is consistent with my Petsc code.
> I understand that for the mesh to scale in parallel, extra  
> information needs to be stored. But the current  cost seems too 
> expensive. I am wondering if there is a way to cut the memory usage 
> for the mesh.
> Thank you very much.
> Shi


I am no expert on this, but I have used a few FEM libraries besides my own, and I found that some libraries build neighboring element, edge, and node lists for every element. This is unnecessary if you do not use them. Something similar may have happened in your case.
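As a rough back-of-envelope illustration of why this matters (all per-entity sizes and incidence counts below are assumptions for a typical tetrahedral mesh, not the actual layout of PETSc or any particular library), storing full adjacency and inverse-connectivity tables for every element can multiply mesh memory several-fold compared with the bare cell-to-vertex table alone:

```python
# Back-of-envelope memory estimate for an unstructured tetrahedral mesh.
# All sizes and incidence counts are illustrative assumptions, not the
# data layout of PETSc or any specific FEM library.

INT = 4  # bytes per 32-bit index


def bare_connectivity_mb(n_cells):
    """Only the cell-to-vertex table: 4 vertex indices per tetrahedron."""
    return n_cells * 4 * INT / 1e6


def with_adjacency_mb(n_cells, n_edges, n_vertices):
    """Cell-to-vertex plus (hypothetical) per-cell adjacency tables and
    inverse mappings, of the kind some libraries build unconditionally."""
    cell_vertex = n_cells * 4 * INT      # 4 vertices per tet
    cell_neighbors = n_cells * 4 * INT   # one face-neighbor per face
    cell_edges = n_cells * 6 * INT       # 6 edges per tet
    cell_faces = n_cells * 4 * INT       # 4 faces per tet
    vertex_cells = n_vertices * 24 * INT # ~24 incident cells per vertex (assumed)
    edge_cells = n_edges * 5 * INT       # ~5 incident cells per edge (assumed)
    return (cell_vertex + cell_neighbors + cell_edges +
            cell_faces + vertex_cells + edge_cells) / 1e6


# Mesh size from the original post: 750,000 tetrahedra.
print(bare_connectivity_mb(750_000))                      # ~12 MB
print(with_adjacency_mb(750_000, 900_000, 140_000))       # several times larger
```

Even this simple estimate shows the extra tables dominating the bare connectivity; real overheads can be far larger still if each entity carries per-object bookkeeping (pointers, reference counts, small heap allocations) instead of flat index arrays.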


More information about the petsc-users mailing list