[petsc-users] Suggestions for code with dof >> 1 ?

Christophe Ortiz christophe.ortiz at ciemat.es
Thu Oct 31 04:42:24 CDT 2013


Hi guys,

Since I found out that with TSSetEquationType things work better, I have
made some tests with a system of 2 PDEs (diffusion equations) and 1 ODE, with
different boundary conditions (Dirichlet and Neumann). I played around with
extreme values of the fields and diffusion coefficients, and the conclusion
is that it now works well; I think the code is ready for the
implementation of dof >> 1.

So far I have made the tests with 1 proc, but with dof >> 1 I guess it will
become very time consuming and it would be better to use several
procs. Therefore, I have a few questions:

- Basically, what should I change in my code to prepare it for several
procs ?

- Should I change the matrix type ? In the current version of the code I
declared the matrix with:
  ierr = DMCreateMatrix(da,MATAIJ,&J);
Is it ok ?
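
For reference, this is roughly how the DMDA and the matrix are set up in my
code (just a sketch; M and dof stand in for my actual grid size and number of
fields):

  DM  da;
  Mat J;
  /* 1D DMDA: M grid points, dof fields per point, stencil width 1 */
  ierr = DMDACreate1d(PETSC_COMM_WORLD,DMDA_BOUNDARY_NONE,M,dof,1,NULL,&da);CHKERRQ(ierr);
  ierr = DMCreateMatrix(da,MATAIJ,&J);CHKERRQ(ierr);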

- I think I should define the fill pattern for the diagonal and off-diagonal
blocks with
  ierr = DMDASetBlockFills(da,dfill,ofill);
This is already done, but I understand it only makes sense when using the
MPIAIJ matrix format.
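
To be concrete, an illustrative fill pattern for dof = 3 (my 2 diffusing
fields plus the ODE field) would look like this; the actual arrays in my code
are analogous:

  /* 1 = keep the coupling, 0 = drop it (row = equation, column = field) */
  PetscInt dfill[9] = {1,1,1,   /* within a grid point all fields may couple   */
                       1,1,1,
                       1,1,1};
  PetscInt ofill[9] = {1,0,0,   /* only the two diffusing fields couple to the */
                       0,1,0,   /* neighbouring points; the ODE field has no   */
                       0,0,0};  /* spatial coupling                            */
  ierr = DMDASetBlockFills(da,dfill,ofill);CHKERRQ(ierr);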

- Regarding the solver of the system, what should I change ? For the moment
I use the defaults: KSPGMRES and PCILU.
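
For reference, this is how I get at the solver objects from my TS (a sketch;
ts is my TS object, and since I have not changed anything yet the defaults
apply):

  SNES snes;
  KSP  ksp;
  PC   pc;
  ierr = TSGetSNES(ts,&snes);CHKERRQ(ierr);
  ierr = SNESGetKSP(snes,&ksp);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  /* nothing is set explicitly, so in serial I end up with KSPGMRES and PCILU;
     these could also be overridden at run time with -ksp_type and -pc_type */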

- To understand what I'm doing: when using several procs, how is the system
actually solved ? Is the whole system solved in parallel, or are only
IFunction and IJacobian evaluated in parallel ?

Thanks in advance for your help.
Christophe



CIEMAT
Laboratorio Nacional de Fusión por Confinamiento Magnético
Unidad de Materiales
Edificio 2 - Planta 0 - Despacho 28m
Avenida Complutense 40,
28040 Madrid, Spain
Tel: +34 91496 2582
Fax: +34 91346 6442

--
Please consider the environment before printing this email.


On Thu, Oct 24, 2013 at 12:09 PM, Christophe Ortiz <
christophe.ortiz at ciemat.es> wrote:

>
> On Wed, Oct 23, 2013 at 7:14 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> Christophe Ortiz <christophe.ortiz at ciemat.es> writes:
>>
>> >> That does not sound right.  Can you send a reproducible test case (best
>> >> as a patch to PETSc)?
>> >>
>> >
>> > What do you mean by patch to PETSc ? How do I do that ?
>>
>> See "contributing a small patch" (or any of the other workflows).
>>
>> https://bitbucket.org/petsc/petsc/wiki/Home
>
>
> Ok, I will read the instructions and try.
>
>
>>
>>
>> > Sorry about that. Here is the entire error message:
>> >
>> > [0]PETSC ERROR: No support for this operation for this object type!
>> > [0]PETSC ERROR: Mat type mffd!
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
>> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> > [0]PETSC ERROR: See docs/index.html for manual pages.
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: ./diffusion_c on a ifort-icc-nompi-double-debug named
>> > mazinger.ciemat.es by u5751 Wed Oct 23 12:46:04 2013
>> > [0]PETSC ERROR: Libraries linked from
>> > /home/u5751/petsc-3.4.1/ifort-icc-nompi-double-debug/lib
>> > [0]PETSC ERROR: Configure run at Mon Oct 14 05:43:50 2013
>> > [0]PETSC ERROR: Configure options --with-mpi=0 --with-fc=ifort
>> > --with-cc=icc --with-debugging=1 --with-scalar-type=real
>> > --with-precision=double --with-blas-lapack-dir=/opt/intel/mkl
>> > [0]PETSC ERROR:
>> > ------------------------------------------------------------------------
>> > [0]PETSC ERROR: MatZeroEntries() line 5189 in src/mat/interface/matrix.c
>> > [0]PETSC ERROR: FormIJacobian() line 689 in
>> > src/ts/examples/tutorials/diffusion.c
>>
>> This is your code calling MatZeroEntries on the Amat (first Mat
>> argument).  The convention is to assemble into Pmat (the preconditioning
>> matrix; second Mat argument).
>>
>
> I assemble Pmat first, but in my code MatZeroEntries was called on Amat,
> which PETSc did not appreciate...
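>
> Roughly, the offending part of my FormIJacobian now looks like this (a
> sketch in the PETSc 3.4 calling sequence; J and Jpre are my names for the
> Amat and Pmat arguments, and the real assembly code is longer):
>
>   ierr = MatZeroEntries(*Jpre);CHKERRQ(ierr);            /* Pmat, not Amat */
>   /* ... MatSetValuesStencil(...) into *Jpre ... */
>   ierr = MatAssemblyBegin(*Jpre,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>   ierr = MatAssemblyEnd(*Jpre,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>   if (*J != *Jpre) {
>     ierr = MatAssemblyBegin(*J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>     ierr = MatAssemblyEnd(*J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>   }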
> With that change in place, here is the output. Without these options it
> converges, though the solution shows oscillations; with
> -ts_arkimex_fully_implicit -snes_mf_operator it does not converge at all.
> No step is accepted, not even the first one:
>
>       TSAdapt 'basic': step   0 stage rejected t=0          + 1.000e-12
> retrying with dt=2.500e-13
>       TSAdapt 'basic': step   0 stage rejected t=0          + 2.500e-13
> retrying with dt=6.250e-14
>       TSAdapt 'basic': step   0 stage rejected t=0          + 6.250e-14
> retrying with dt=1.562e-14
>       TSAdapt 'basic': step   0 stage rejected t=0          + 1.562e-14
> retrying with dt=3.906e-15
>
> ...  ....
>       TSAdapt 'basic': step   0 stage rejected t=0          +2.888e-287
> retrying with dt=7.221e-288
>       TSAdapt 'basic': step   0 stage rejected t=0          +7.221e-288
> retrying with dt=1.805e-288
>       TSAdapt 'basic': step   0 stage rejected t=0          +1.805e-288
> retrying with dt=4.513e-289
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Floating point exception!
> [0]PETSC ERROR: Vec entry at local location 1103 is not-a-number or
> infinite at end of function: Parameter number 3!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: ./diffusion_c on a ifort-icc-nompi-double-debug named
> mazinger.ciemat.es by u5751 Thu Oct 24 03:37:56 2013
> [0]PETSC ERROR: Libraries linked from
> /home/u5751/petsc-3.4.1/ifort-icc-nompi-double-debug/lib
> [0]PETSC ERROR: Configure run at Mon Oct 14 05:43:50 2013
> [0]PETSC ERROR: Configure options --with-mpi=0 --with-fc=ifort
> --with-cc=icc --with-debugging=1 --with-scalar-type=real
> --with-precision=double --with-blas-lapack-dir=/opt/intel/mkl
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: VecValidValues() line 30 in src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: SNESComputeFunction() line 1998 in
> src/snes/interface/snes.c
> [0]PETSC ERROR: SNESSolve_NEWTONLS() line 162 in src/snes/impls/ls/ls.c
> [0]PETSC ERROR: SNESSolve() line 3636 in src/snes/interface/snes.c
> [0]PETSC ERROR: TSStep_ARKIMEX() line 765 in src/ts/impls/arkimex/arkimex.c
> [0]PETSC ERROR: TSStep() line 2458 in src/ts/interface/ts.c
> [0]PETSC ERROR: TSSolve() line 2583 in src/ts/interface/ts.c
> [0]PETSC ERROR: main() line 457 in src/ts/examples/tutorials/diffusion.c
> ./compile_diffusion: line 17: 23584 Aborted                 ./diffusion_c
> -ts_adapt_monitor -ts_adapt_basic_clip 0.1,1.1 -draw_pause -2
> -ts_arkimex_type 1bee -ts_max_snes_failures -1 -snes_type newtonls
> -snes_linesearch_type bt -ts_arkimex_fully_implicit -snes_mf_operator
>
> Using -snes_converged_reason and -ksp_converged_reason gives:
>
>     Linear solve did not converge due to DIVERGED_DTOL iterations 30
>   Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations
> 0
>       TSAdapt 'basic': step   0 stage rejected t=0          + 1.000e-12
> retrying with dt=2.500e-13
>
> Thus, I allowed more linear solve failures and increased the maximum number
> of KSP iterations with
>   ierr = SNESSetMaxLinearSolveFailures(snes,100);
> and
>   ierr = KSPSetTolerances(ksp,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT,1000);
>
> Then it converges and steps are accepted. However, the solution obtained is
> weird, as if there were no diffusion. Moreover, the values are incorrect: I
> start with a Gaussian peak at 1e19 and end up with a Gaussian whose peak is
> at 1e25.
>
> BTW, why use -ts_arkimex_fully_implicit if all the terms are in F (the LHS) ?
> You said that by definition there is no explicit stage in that case.
>
> However, I could obtain a good solution by adding mesh points (without
> fully implicit and mf_operator). Maybe it was due to large gradients and
> not enough points.
>
> Christophe
>
>
>
>
>