Parallel direct solvers with PFLOTRAN

Barry Smith bsmith at mcs.anl.gov
Wed Sep 17 09:51:24 CDT 2008


    Given that constructing the LU factorization will be time-consuming,
you may also want to try lagging the computation of the LU. For example,
to have the LU computed only once for an entire Newton solve you can use

    -flow_ksp_type ibcgs (or bcgs) -flow_snes_lag_preconditioner 100

(where 100 stands in for some large number).
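
Combining that with the direct-solver options Richard describes below, a
full set of options might look like the following (just a sketch; pick
whichever Krylov method and solver package you prefer). These can go on
the command line or, one per line, in a PETSc options file:

    -flow_mat_type mpiaij
    -flow_ksp_type ibcgs
    -flow_pc_type lu
    -flow_pc_factor_mat_solver_package mumps
    -flow_snes_lag_preconditioner 100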

To keep the LU factors the same for several timesteps you can call
SNESSetLagPreconditioner(snes,-1,ierr). When you want the LU recomputed,
call SNESSetLagPreconditioner(snes,largenumber,ierr) to trigger a
recompute, and then, before the next SNESSolve, call again with -1 to
stop the recomputation.
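
For concreteness, here is a minimal sketch of that call sequence from a
Fortran time-stepping loop. The loop structure and the names istep,
nsteps, nlag, b, and x are purely illustrative (not part of PFLOTRAN);
the only PETSc calls involved are SNESSetLagPreconditioner() and
SNESSolve(), and the usual PETSc Fortran include files and object setup
are assumed.

      SNES           snes
      Vec            b, x
      PetscErrorCode ierr
      PetscInt       istep, nsteps, nlag

      do istep = 1, nsteps
         if (mod(istep, nlag) .eq. 1) then
            ! allow the LU to be recomputed during this SNESSolve
            call SNESSetLagPreconditioner(snes, 10000, ierr)
         else
            ! reuse the existing LU factors for this solve
            call SNESSetLagPreconditioner(snes, -1, ierr)
         endif
         call SNESSolve(snes, b, x, ierr)
      enddo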


    Barry

I realize I need to add support for SNESSetLagPreconditioner(snes,-2,ierr)
to trigger a single recomputation of the LU. I will do this as soon as I
can.


On Sep 16, 2008, at 11:00 PM, Richard Tran Mills wrote:

> Peter and Glenn,
>
> You had asked about using parallel direct solvers with PFLOTRAN.  If
> you install a package such as mumps or superlu_dist with the
> corresponding PETSc interfaces, you can easily try a direct solver
> with command-line options such as the following:
>
> -flow_mat_type mpiaij -flow_ksp_type preonly -flow_pc_type lu -flow_pc_factor_mat_solver_package mumps
>
> You'll need to pull and update to include the push I made tonight  
> that allows you to specify the matrix type on the command line.   
> (Things default to BAIJ, which isn't supported for the direct  
> solvers.)
>
> The above looks pretty messy, but we can worry about making a cleaner
> way to do this if it turns out to be something you want to do on a
> routine basis.
>
> I have tested PFLOTRAN with both MUMPS and SuperLU_dist on a Linux  
> workstation using the very simple 'TAO/100_10_10' problem in the  
> examples directory.  This works fine.  I was having some problems  
> getting things to work properly on Jaguar (no surprise) and wasn't  
> quite able to get things resolved before the machine went down for  
> most of today.  I will try again tomorrow.
>
> In your own experiments, I would start with trying out MUMPS, as  
> anecdotal evidence suggests that it is fairly robust.
>
> --Richard
>



