[petsc-dev] Help using PETSc

tuane tuane at lncc.br
Tue Jun 28 11:55:56 CDT 2011


Dear colleagues,

My professor and I need some help to use PETSc more effectively.

We are working on numerical simulation of petroleum reservoirs for 
scientific research.

We have experience with parallel computing using MPI, and PETSc seems 
like a good tool for us.

We need to solve a symmetric linear system, arising from a finite 
element discretization, that gives the fluid displacement velocities 
in homogeneous and heterogeneous two-dimensional porous media.

We are using PETSc to reduce the running time and to simulate larger 
models, and we are experimenting with different solvers and 
preconditioners. We are not sure our procedure is correct.
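
For context, our solve is driven by code along these lines (a sketch 
only: the matrix A and the vectors b and x come from our finite 
element assembly, error checking is omitted, and KSPSetOperators is 
written with the MatStructure flag our PETSc version expects):

call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
call KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN, ierr)
call KSPSetFromOptions(ksp, ierr)  ! -ksp_type, -pc_type etc. applied here
call KSPSolve(ksp, b, x, ierr)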

Below are the option lines we used in some experiments.

     * -tol 1.0e-10 -ksp_type cg  -pc_type jacobi -ksp_left_pc


The 200x100 homogeneous mesh is OK with NP >= 1.
The 2000x500 heterogeneous mesh converges, but only after a large 
number of iterations.

     * -tol 1.0e-10 -ksp_type cg  -pc_type redundant -ksp_left_pc


The 200x100 homogeneous mesh is OK.
The 2000x500 heterogeneous mesh converges in fewer iterations, but the 
run time is better when we use only one processor.

CG with the redundant preconditioner gives us the best result, similar 
to our original direct solver, but we are not sure what “redundant” 
actually does.
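
From its manual page we understand the inner solve of the redundant 
preconditioner has its own options prefix, e.g. (not yet verified on 
our build):

     * -pc_type redundant -redundant_pc_type lu

Is that the right way to read it?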

In some cases, we think the previous solution is not being used as the 
initial guess with the Jacobi preconditioner, even though we call:
call KSPSetInitialGuessNonzero(ksp, PETSC_TRUE, ierr)
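
For clarity, this is roughly how that call sits in our time loop (a 
sketch: update_rhs stands in for our own assembly routine, and x still 
holds the previous step's solution when KSPSolve is entered):

call KSPSetInitialGuessNonzero(ksp, PETSC_TRUE, ierr)
do step = 1, nsteps
   call update_rhs(b, step)        ! placeholder for our FE assembly
   call KSPSolve(ksp, b, x, ierr)  ! incoming x should be the initial iterate
end do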

Our results are too dependent on these parameters (see the sample run 
after the list):

    1. iterative solver (CG, GMRES, BCGS)
    2. preconditioners (jacobi, redundant, ILU)
    3. tolerance
    4. number of processors
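
As an example, a typical run looks like this (the executable name 
simulator is ours; the monitoring options are only diagnostics, and 
-ksp_rtol is the PETSc name for the relative tolerance):

mpiexec -n 4 ./simulator -ksp_type cg -pc_type jacobi -ksp_rtol 1.0e-10 \
    -ksp_monitor_true_residual -ksp_converged_reason -log_summary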



We also tried to use algebraic multigrid, but without success: an 
error occurs during execution. We used the routines:
call PCSetType(pc, PCMG, ierr)
call PCMGSetLevels(pc, levels, PETSC_NULL_OBJECT, ierr)
call PCMGSetType(pc, PC_MG_MULTIPLICATIVE, ierr)
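
We suspect we may be missing the grid transfer operators that PCMG 
(geometric multigrid) needs on every level; a sketch of what we believe 
is additionally required (the interpolation matrices interp(l) are 
hypothetical and would have to come from our own mesh hierarchy):

do l = 1, levels-1
   call PCMGSetInterpolation(pc, l, interp(l), ierr)  ! level 0 is coarsest
end do

Or should we instead use a true algebraic multigrid from an external 
package, e.g. -pc_type hypre -pc_hypre_type boomeramg (when PETSc is 
configured with hypre)?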


Would you have any suggestions that could help us?


Tuane Lopes


