[petsc-users] Stokes problem with DA and MUMPS

Dave May dave.mayhem23 at gmail.com
Fri Feb 18 03:49:09 CST 2011


Hey Loic,
  I think the problem is clear from the error message.
You simply don't have enough memory to perform the LU decomposition.

From the info you provide I can see numerous places where your code
is using significant amounts of memory (without even considering the
LU factorisation).

1) The DA you create uses a stencil width of 2. This is not actually
required for your element type; a stencil width of one is sufficient
(see the sketch after this list).

2) DAGetMatrix is allocating memory for the (2,2) block (the zero
matrix) of your Stokes system.

3) The preconditioner matrix B created with DAGetMatrix is allocating
memory for the off-diagonal blocks (1,2) and (2,1), which you don't use.

4) Fieldsplit (additive) is copying the matrices associated with the
(1,1) and (2,2) blocks from the preconditioner.
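
For point (1), only the stencil-width argument of DACreate3d needs to
change. A minimal sketch, reusing the call from your mail (the "2"
simply becomes a "1"):

  DA da;
  /* same layout as in your code: 4 dof per node (u,v,w,p),
     but stencil width 1 instead of 2 */
  DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_BOX,
             nv, nv, nv, PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
             4,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,&da);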


Representing the complete mixed element (velocity plus pressure) on a
single DA is not a good idea; however, representing the velocity space
and the pressure space on two different DA's is fine. Doing this would
allow a stencil width of one to be used for both velocity and pressure,
which is all that is required. You can connect the two DA's for
velocity and pressure via DMComposite. Unfortunately, DMComposite
cannot create and preallocate the off-diagonal matrices for you, but it
can create and preallocate memory for the diagonal blocks. You would
have to provide the preallocation routine for the off-diagonal blocks.
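
To make that concrete, here is a minimal sketch of the two-DA /
DMComposite setup, assuming a 3+1 splitting (u,v,w on one DA, p on the
other). The DMComposite routine names below (DMCompositeCreate,
DMCompositeAddDM) are written from memory, so please check them against
the man pages for your release:

  DA          dav, dap;   /* velocity and pressure grids          */
  DMComposite pack;       /* couples the two DA's into one object */

  /* velocity: 3 dof per node, stencil width 1 */
  DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_BOX,
             nv, nv, nv, PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
             3,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,&dav);

  /* pressure: 1 dof per node, stencil width 1 */
  DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_BOX,
             nv, nv, nv, PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
             1,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,&dap);

  DMCompositeCreate(PETSC_COMM_WORLD,&pack);
  DMCompositeAddDM(pack,(DM)dav);
  DMCompositeAddDM(pack,(DM)dap);

  /* The diagonal blocks A (velocity) and M (pressure) can still be
     created and preallocated with DAGetMatrix(dav,...) and
     DAGetMatrix(dap,...). The off-diagonal blocks B1/B2 you have to
     preallocate yourself, e.g. with MatCreateMPIAIJ and your own
     d_nnz/o_nnz counts. */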

I would recommend switching to petsc-dev as there is much more support
for this type of "multi-physics" coupling.

I doubt you will ever be able to solve your 128^3 problem using MUMPS
to factor your (1,1) block. The memory required is simply too great;
you will have to consider using a multilevel preconditioner.

Can you solve your problem using ML or BoomerAMG?
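
For a first test you should only need to change the (1,1) solver
options in your run script (you already configured --download-ml and
--download-hypre, so both are built). Something like the following,
keeping your -stokes_fieldsplit_0_ prefix; the option names are from
memory, so run with -help to confirm them:

    -stokes_fieldsplit_0_ksp_type preonly \
    -stokes_fieldsplit_0_pc_type ml \

or, for BoomerAMG via hypre,

    -stokes_fieldsplit_0_ksp_type preonly \
    -stokes_fieldsplit_0_pc_type hypre \
    -stokes_fieldsplit_0_pc_hypre_type boomeramg \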


Cheers,
  Dave


On 18 February 2011 10:22, gouarin <loic.gouarin at math.u-psud.fr> wrote:
> Hi,
>
> I want to solve the 3D Stokes problem with a 4Q1/Q1 finite element discretization.
> I have written a parallel version and it works for very small sizes, but it is
> very slow.
>
> I use DA to create my grid. Here is an example
>
> DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_BOX,
>                   nv, nv, nv, PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
>                   4,2,PETSC_NULL,PETSC_NULL,PETSC_NULL,&da);
>
> The matrix problem has the form
>
> [ A    B1]
> [ B2   0 ]
>
> and the preconditioner is
>
> [ A  0]
> [ 0  M]
>
> I use fgmres to solve my system and I want to use MUMPS to solve the linear
> system Ax=b for the preconditioning step.
> I want to solve this problem multiple times with different right-hand sides.
>
> The first problem is when I call
>
>  Mat A, B;
>  DAGetMatrix(da, MATAIJ, &A);
>  DAGetMatrix(da, MATAIJ, &B);
>
> It takes a long time in 3D and I don't understand why. I am using the debug
> version of PETSc, but I don't think that is the problem.
>
> After that, MUMPS can't do the factorization for nv=33 in my grid definition
> (the case nv=19 works) because there is not enough memory. I get the error
>
> [0]PETSC ERROR: Error reported by MUMPS in numerical factorization phase:
> Cannot allocate required memory 1083974153 megabytes
>
> But my problem is not very big for the moment! I want to be able to solve the
> Stokes problem on a 128x128x128 grid for the velocity field.
>
> Here is my script to launch the program
>
> mpiexec -np 1 ./stokesPart \
>     -stokes_ksp_type fgmres \
>     -stokes_ksp_rtol 1e-6 \
>     -stokes_pc_type fieldsplit \
>     -stokes_pc_fieldsplit_block_size 4 \
>     -stokes_pc_fieldsplit_0_fields 0,1,2 \
>     -stokes_pc_fieldsplit_1_fields 3 \
>     -stokes_fieldsplit_0_ksp_type preonly \
>     -stokes_fieldsplit_0_pc_type lu \
>     -stokes_fieldsplit_0_pc_factor_mat_solver_package mumps \
>     -stokes_fieldsplit_1_ksp_type preonly \
>     -stokes_fieldsplit_1_pc_type jacobi \
>     -stokes_ksp_monitor_short
>
> I compiled PETSc-3.1-p7 with the options
>
> --with-mpi4py=1 --download-mpi4py=yes --with-petsc4py=1
> --download-petsc4py=yes --with-shared --with-dynamic --with-hypre=1
> --download-hypre=yes --with-ml=1 --download-ml=yes --with-mumps=1
> --download-mumps=yes --with-parmetis=1 --download-parmetis=yes
> --with-prometheus=1 --download-prometheus=yes --with-scalapack=1
> --download-scalapack=yes --with-blacs=1 --download-blacs=yes
>
> I think I have to set some MUMPS options, but I don't know exactly which ones.
>
> Could you tell me what I am doing wrong?
>
> Best Regards,
>
> Loic
>

