[petsc-users] Suggestions for code with dof >> 1 ?

Jed Brown jedbrown at mcs.anl.gov
Thu Oct 31 08:38:24 CDT 2013

Christophe Ortiz <christophe.ortiz at ciemat.es> writes:

> Hi guys,
> Since I found out that with TSSetEquationType things work better, I have
> made some tests with a system of 2 PDEs (diffusion equations) and 1 ODE, with
> different boundary conditions (Dirichlet and Neumann). I played around with
> extreme values of the fields and diffusion coefficients, and the conclusion
> is that it now works well; I think the code is ready for the
> implementation of dof >> 1.


> So far I have made the tests with 1 proc, but with dof >> 1 I guess it will
> become very time consuming and it would be better to use several
> procs. Therefore, I have a few questions:
> - Basically, what should I change in my code to prepare it for several
> procs?

Just use local indexing.  The example you copied from works in parallel,
so your code might already be correct.
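Concretely, "local indexing" with a DMDA means each rank loops only over the owned range reported by DMDAGetLocalInfo, while ghost values from neighboring ranks arrive through a local vector. Here is a sketch (error checking omitted for brevity; the Field layout, the coefficient D, and the residual formulas are placeholders, not your actual equations):

```c
typedef struct { PetscScalar u, v, w; } Field;  /* hypothetical 3-dof layout */

PetscErrorCode IFunction(TS ts,PetscReal t,Vec U,Vec Udot,Vec F,void *ctx)
{
  DM            da;
  DMDALocalInfo info;
  Vec           Uloc;
  Field         *u,*udot,*f;
  PetscReal     D = 1.0,hx;                     /* placeholder coefficient */

  TSGetDM(ts,&da);
  DMDAGetLocalInfo(da,&info);
  hx = 1.0/(info.mx - 1);
  DMGetLocalVector(da,&Uloc);
  /* Scatter ghost points so u[i-1], u[i+1] are valid at rank boundaries. */
  DMGlobalToLocalBegin(da,U,INSERT_VALUES,Uloc);
  DMGlobalToLocalEnd(da,U,INSERT_VALUES,Uloc);
  DMDAVecGetArray(da,Uloc,&u);
  DMDAVecGetArray(da,Udot,&udot);
  DMDAVecGetArray(da,F,&f);
  /* Loop only over the locally owned index range. */
  for (PetscInt i=info.xs; i<info.xs+info.xm; i++) {
    if (i == 0 || i == info.mx-1) {             /* boundary rows (e.g. Dirichlet) */
      f[i].u = u[i].u; f[i].v = u[i].v; f[i].w = u[i].w;
    } else {                                    /* interior: PDEs for u,v; ODE for w */
      f[i].u = udot[i].u - D*(u[i+1].u - 2*u[i].u + u[i-1].u)/(hx*hx);
      f[i].v = udot[i].v - D*(u[i+1].v - 2*u[i].v + u[i-1].v)/(hx*hx);
      f[i].w = udot[i].w - u[i].u*u[i].v;       /* placeholder local reaction */
    }
  }
  DMDAVecRestoreArray(da,Uloc,&u);
  DMDAVecRestoreArray(da,Udot,&udot);
  DMDAVecRestoreArray(da,F,&f);
  DMRestoreLocalVector(da,&Uloc);
  return 0;
}
```

If your IFunction already has this shape (indices from DMDAGetLocalInfo, ghost data via DMGlobalToLocal), it should run unchanged on multiple processes.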

> - Should I change the matrix type ? In the current version of the code I
> declared the matrix with:
>   ierr = DMCreateMatrix(da,MATAIJ,&J);
> Is it ok ?

Yes, you can also pass NULL for the matrix type and the DM will choose
the type (which will be AIJ in this case).
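That is, in the calling sequence you quoted (PETSc 3.4-era), the matrix type argument can simply be NULL:

```c
Mat J;
ierr = DMCreateMatrix(da,NULL,&J);CHKERRQ(ierr);  /* DM picks AIJ; MPIAIJ on >1 proc */
```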

> - I think I should define the pattern for diagonal and offdiag blocks
>   ierr = DMDASetBlockFills(da,dfill,ofill);
> It's already done, but I understand it only makes sense when using the
> MPIAIJ matrix format.

Yeah, and it only makes sense if the blocks are significantly sparse.
That is a less important optimization.

> - Regarding the solver of the system, what should I change ? For the moment
> I use the defaults: KSPGMRES and PCILU.

The default preconditioner in parallel is Block Jacobi with ILU.  Start
with the default and understand how it scales as you add processes, then
if it's not satisfactory look at other methods (like -pc_type gamg).
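For example, a comparison run might look like the following (the executable name is a placeholder; the options are standard PETSc runtime options):

```shell
# Parallel defaults made explicit: GMRES with block Jacobi, ILU on each block
mpiexec -n 4 ./myts -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu \
        -ksp_monitor -ksp_converged_reason -log_summary

# Alternative to try if block Jacobi/ILU stops scaling as you add processes
mpiexec -n 4 ./myts -pc_type gamg -ksp_monitor -log_summary
```

Comparing iteration counts from -ksp_monitor and timings from -log_summary as you vary -n is the quickest way to see whether the default preconditioner is holding up.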

> - To understand what I'm doing: when using several procs, how is the system
> actually solved? Is the whole system solved in parallel, or are only
> IFunction and IJacobian evaluated in parallel?

Everything is solved in parallel.