[petsc-users] Question about memory usage in Multigrid preconditioner

frank hengjiew at uci.edu
Tue Jul 5 17:23:55 CDT 2016


Hi,

I am using the CG KSP solver with a multigrid preconditioner to solve a
linear system in parallel.
I chose 'Telescope' as the preconditioner on the coarse mesh because of
its good performance.
The PETSc options file is attached.
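
For context, the solver is configured entirely from the attached options
through KSPSetFromOptions. The sketch below (modeled on the PETSc KSP/DMDA
tutorial examples, with a constant-coefficient Laplacian and Dirichlet
boundaries standing in for my actual operator) illustrates the kind of
setup I mean; it is not my production code:

#include <petscksp.h>
#include <petscdmda.h>

/* Right-hand side: a constant forcing, just to make the sketch runnable. */
static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* 7-point Laplacian with homogeneous Dirichlet boundaries (unit spacing).
   The real pressure operator is different; this is only a stand-in. */
static PetscErrorCode ComputeMatrix(KSP ksp, Mat A, Mat B, void *ctx)
{
  PetscErrorCode ierr;
  DM             da;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm, M, N, P;

  PetscFunctionBeginUser;
  ierr = KSPGetDM(ksp, &da);CHKERRQ(ierr);
  ierr = DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  for (k = zs; k < zs + zm; k++) {
    for (j = ys; j < ys + ym; j++) {
      for (i = xs; i < xs + xm; i++) {
        MatStencil  row, col[7];
        PetscScalar v[7];
        PetscInt    n = 0;
        row.i = i; row.j = j; row.k = k; row.c = 0;
        col[n] = row; v[n] = 6.0; n++;
        if (i > 0)     { col[n] = row; col[n].i = i - 1; v[n] = -1.0; n++; }
        if (i < M - 1) { col[n] = row; col[n].i = i + 1; v[n] = -1.0; n++; }
        if (j > 0)     { col[n] = row; col[n].j = j - 1; v[n] = -1.0; n++; }
        if (j < N - 1) { col[n] = row; col[n].j = j + 1; v[n] = -1.0; n++; }
        if (k > 0)     { col[n] = row; col[n].k = k - 1; v[n] = -1.0; n++; }
        if (k < P - 1) { col[n] = row; col[n].k = k + 1; v[n] = -1.0; n++; }
        ierr = MatSetValuesStencil(B, 1, &row, n, col, v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  DM             da;
  KSP            ksp;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* 1536 x 128 x 384 global grid; the process decomposition (96 x 8 x 24
     in my runs) can be forced with -da_processors_x/y/z or left to PETSc. */
  ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 1536, 128, 384,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetDM(ksp, da);CHKERRQ(ierr);
  ierr = KSPSetComputeRHS(ksp, ComputeRHS, NULL);CHKERRQ(ierr);
  ierr = KSPSetComputeOperators(ksp, ComputeMatrix, NULL);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* picks up the attached options */
  ierr = KSPSolve(ksp, NULL, NULL);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}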

The domain is a 3D box.
The solver works well when the grid is 1536*128*384 and the process mesh
is 96*8*24. When I double the size of the grid and keep the same process
mesh and PETSc options, I get an "out of memory" error from the cluster
I am using.

Each process has access to at least 8 GB of memory, which should be more
than enough for my application. I am confident that the rest of my code
(everything except the linear solver) does not use much memory, so I
suspect the problem is in the linear solver.
The error occurs before the linear system is completely solved, so I do
not have the output from -ksp_view. I am not able to reproduce the error
with a smaller problem either.

In addition, I tried block Jacobi as the preconditioner with the same
grid and the same decomposition. The linear solver runs extremely slowly,
but there is no memory error.

How can I diagnose what exactly causes the error?
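
If it would help, I can rerun with extra memory-reporting options. I was
thinking of adding something like the following, though I am not sure
which of these are appropriate, or available in the PETSc build on the
cluster:

-memory_view
-malloc_log
-ksp_view_pre
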
Thank you so much.

Frank
-------------- next part --------------
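# Outer Krylov solver: CG, unpreconditioned residual norm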
-ksp_type        cg 
-ksp_norm_type   unpreconditioned
-ksp_lag_norm
-ksp_rtol        1e-7
-ksp_initial_guess_nonzero  yes
-ksp_converged_reason 
-ppe_max_iter 50
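
# Preconditioner: 4-level multigrid with Galerkin (RAP) coarse operators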
-pc_type mg
-pc_mg_galerkin
-pc_mg_levels 4
-mg_levels_ksp_type richardson 
-mg_levels_ksp_max_it 1
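
# Coarse level handled by Telescope (communicator reduced by a factor of 64)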
-mg_coarse_ksp_type preonly
-mg_coarse_pc_type telescope
-mg_coarse_pc_telescope_reduction_factor 64
-options_left
-log_summary

# Setting dmdarepart on subcomm
-repart_da_processors_x 24
-repart_da_processors_y 2
-repart_da_processors_z 6
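
# Multigrid again on the reduced communicator; its coarsest level uses SVD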
-mg_coarse_telescope_ksp_type preonly
#-mg_coarse_telescope_ksp_constant_null_space
-mg_coarse_telescope_pc_type mg
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 4
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type svd
#-mg_coarse_telescope_mg_coarse_pc_type telescope
#-mg_coarse_telescope_mg_coarse_pc_telescope_reduction_factor 64

# Second subcomm
#-mg_coarse_telescope_mg_coarse_telescope_ksp_type preonly
#-mg_coarse_telescope_mg_coarse_telescope_pc_type mg
#-mg_coarse_telescope_mg_coarse_telescope_pc_mg_galerkin
#-mg_coarse_telescope_mg_coarse_telescope_pc_mg_levels 3
#-mg_coarse_telescope_mg_coarse_telescope_mg_levels_ksp_type richardson
#-mg_coarse_telescope_mg_coarse_telescope_mg_levels_ksp_max_it 1
#-mg_coarse_telescope_mg_coarse_telescope_mg_coarse_ksp_type richardson
#-mg_coarse_telescope_mg_coarse_telescope_mg_coarse_pc_type svd

