[petsc-users] Stokes problem with DA and MUMPS

gouarin loic.gouarin at math.u-psud.fr
Fri Feb 18 04:45:08 CST 2011


Hi Dave,

thanks for your quick reply.

On 18/02/2011 10:49, Dave May wrote:
> Hey Loic,
>    I think the problem is clear from the error message.
> You simply don't have enough memory to perform the LU decomposition.
>
> From the info you provide, I can see numerous places where your code
> is using significant amounts of memory (without even considering the
> LU factorisation).
>
> 1) The DA you create uses a stencil width of 2. This is actually not
> required for your element type. Stencil width of one is sufficient.
>
> 2) DAGetMatrix is allocating memory for the (2,2) block (the zero
> matrix) in your Stokes system.
>
> 3) The preconditioner matrix B created with DAGetMatrix is allocating
> memory for the off-diagonal blocks (1,2) and (2,1), which you don't use.
>
> 4) Fieldsplit (additive) is copying the matrices associated with the
> (1,1) and (2,2) blocks from the preconditioner.
>
>
> Representing this complete element type on a single DA is not a good
> idea; however, representing the velocity space and pressure space on
> different DAs is fine. Doing this would allow a stencil width of one
> to be used for both velocity and pressure - which is all that is required.
> You can connect the two DAs for velocity and pressure via
> DMComposite. Unfortunately, DMComposite cannot create and preallocate
> the off-diagonal matrices for you, but it can create and preallocate
> memory for the diagonal blocks. You would have to provide the
> preallocation routine for the off-diagonal blocks.
>
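For reference, a minimal sketch of the two-DA layout described above, assuming an
equal-order velocity/pressure arrangement on the same nv^3 nodal grid (nv as in
the DACreate3d call quoted further down) and the petsc-3.1 DA/DMComposite
interface; the DMComposite call names may differ slightly in petsc-dev, and the
off-diagonal blocks still have to be preallocated by hand (see the sketch after
the question below):

  DA          dav, dap;   /* separate grids: velocity (3 dof/node), pressure (1 dof/node) */
  DMComposite pack;

  /* Velocity space: 3 degrees of freedom per node, stencil width 1 */
  DACreate3d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_BOX,
             nv, nv, nv, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
             3, 1, PETSC_NULL, PETSC_NULL, PETSC_NULL, &dav);

  /* Pressure space: 1 degree of freedom per node, stencil width 1 */
  DACreate3d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_BOX,
             nv, nv, nv, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
             1, 1, PETSC_NULL, PETSC_NULL, PETSC_NULL, &dap);

  /* Couple the two spaces; DMComposite can create and preallocate the
     diagonal blocks, but not the velocity-pressure coupling blocks. */
  DMCompositeCreate(PETSC_COMM_WORLD, &pack);
  DMCompositeAddDM(pack, (DM)dav);
  DMCompositeAddDM(pack, (DM)dap);
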
OK. This is the first time I have used PETSc for a big problem, and I am not 
yet familiar with all of its possibilities.
When you say "You would have to provide the preallocation routine for 
the off-diagonal blocks", do you mean DASetBlockFills, or is it more 
complicated?
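To make the question concrete, here is a minimal sketch of what hand-preallocating
one off-diagonal (velocity-pressure) block might look like; this is an illustration
only, not necessarily what Dave has in mind, and nu, np and the nonzero counts are
placeholders to be filled from the element connectivity:

  Mat      B1;                /* the (1,2) coupling block: nu local rows, np local columns */
  PetscInt *d_nnz, *o_nnz;    /* per-row nonzero counts, diagonal / off-diagonal portions */

  PetscMalloc(nu*sizeof(PetscInt), &d_nnz);
  PetscMalloc(nu*sizeof(PetscInt), &o_nnz);
  /* ... fill d_nnz[i] and o_nnz[i] from the element connectivity ... */

  MatCreate(PETSC_COMM_WORLD, &B1);
  MatSetSizes(B1, nu, np, PETSC_DETERMINE, PETSC_DETERMINE);
  MatSetType(B1, MATMPIAIJ);
  MatMPIAIJSetPreallocation(B1, 0, d_nnz, 0, o_nnz);
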
> I would recommend switching to petsc-dev as there is much more support
> for this type of "multi-physics" coupling.
>
I see that DMDA is more advanced in petsc-dev. I'll try it.
> I doubt you will ever be able to solve your 128^3 problem using MUMPS
> to factor your (1,1) block. The memory required is simply too great;
> you will have to consider using a multilevel preconditioner.
>
> Can you solve your problem using ML or BoomerAMG?
>
I tried different solvers; ML doesn't work for me. I used hypre with this script:

mpiexec -np 4 ./stokesPart \
      -stokes_ksp_type minres \
      -stokes_ksp_rtol 1e-6 \
      -stokes_pc_type fieldsplit \
      -stokes_pc_fieldsplit_block_size 4 \
      -stokes_pc_fieldsplit_type SYMMETRIC_MULTIPLICATIVE \
      -stokes_pc_fieldsplit_0_fields 0,1,2 \
      -stokes_pc_fieldsplit_1_fields 3 \
      -stokes_fieldsplit_0_ksp_type richardson \
      -stokes_fieldsplit_0_ksp_max_it 1 \
      -stokes_fieldsplit_0_pc_type hypre \
      -stokes_fieldsplit_0_pc_hypre_type boomeramg \
      -stokes_fieldsplit_0_pc_hypre_boomeramg_max_iter 1 \
      -stokes_fieldsplit_1_ksp_type preonly \
      -stokes_fieldsplit_1_pc_type jacobi \
      -stokes_ksp_monitor_short

but once again it is either very slow or it runs out of memory. Perhaps my 
options are not good ...

The problem is that I don't have to solve the Stokes problem just once but 
many times, so I need to do this as fast as possible.
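Since only the right-hand side changes between solves, the KSP can be built once
and reused; a minimal sketch, assuming the operator A, the preconditioning matrix
B and the vectors b and x are already assembled, and nrhs is a placeholder for the
number of right-hand sides:

  KSP      ksp;
  PetscInt k;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOptionsPrefix(ksp, "stokes_");
  /* The operators are set once, so the preconditioner setup (factorization,
     AMG hierarchy, ...) happens at the first KSPSolve and is reused afterwards. */
  KSPSetOperators(ksp, A, B, SAME_PRECONDITIONER);
  KSPSetFromOptions(ksp);

  for (k = 0; k < nrhs; k++) {
    /* ... assemble the k-th right-hand side into b ... */
    KSPSolve(ksp, b, x);
  }

  KSPDestroy(ksp);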

Thanks again,
Loic

> Cheers,
>    Dave
>
>
> On 18 February 2011 10:22, gouarin<loic.gouarin at math.u-psud.fr>  wrote:
>> Hi,
>>
>> I want to solve the 3D Stokes problem with a 4Q1/Q1 finite element
>> discretization. I have done a parallel version and it works for very small
>> sizes, but it is very slow.
>>
>> I use DA to create my grid. Here is an example
>>
>> DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_BOX,
>>                    nv, nv, nv, PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
>>                    4,2,PETSC_NULL,PETSC_NULL,PETSC_NULL,&da);
>>
>> The matrix problem has the form
>>
>> [ A    B1]
>> [ B2   0 ]
>>
>> and the preconditioner is
>>
>> [ A  0]
>> [ 0  M]
>>
>> I use fgmres to solve my system and I want to use MUMPS to solve the linear
>> system Ax=b for the preconditioning step.
>> I want to solve this problem multiple times with different right-hand sides.
>>
>> The first problem is when I call
>>
>>   Mat A, B;
>>   DAGetMatrix(da, MATAIJ,&A);
>>   DAGetMatrix(da, MATAIJ,&B);
>>
>> These calls take a long time in 3D and I don't understand why. I am still
>> using the debug version of PETSc, but I don't think that is the problem.
>>
>> After that, MUMPS cannot do the factorization for nv=33 in my grid definition
>> (the case nv=19 works) because there is not enough memory. I get the error:
>>
>> [0]PETSC ERROR: Error reported by MUMPS in numerical factorization phase:
>> Cannot allocate required memory 1083974153 megabytes
>>
>> But my problem is not very big for the moment! I want to be able to solve the
>> Stokes problem on a 128x128x128 grid for the velocity field.
>>
>> Here is my script to launch the program
>>
>> mpiexec -np 1 ./stokesPart \
>>      -stokes_ksp_type fgmres \
>>      -stokes_ksp_rtol 1e-6 \
>>      -stokes_pc_type fieldsplit \
>>      -stokes_pc_fieldsplit_block_size 4 \
>>      -stokes_pc_fieldsplit_0_fields 0,1,2 \
>>      -stokes_pc_fieldsplit_1_fields 3 \
>>      -stokes_fieldsplit_0_ksp_type preonly \
>>      -stokes_fieldsplit_0_pc_type lu \
>>      -stokes_fieldsplit_0_pc_factor_mat_solver_package mumps \
>>      -stokes_fieldsplit_1_ksp_type preonly \
>>      -stokes_fieldsplit_1_pc_type jacobi \
>>      -stokes_ksp_monitor_short
>>
>> I compiled petsc-3.1-p7 with the following options:
>>
>> --with-mpi4py=1 --download-mpi4py=yes --with-petsc4py=1
>> --download-petsc4py=yes --with-shared --with-dynamic --with-hypre=1
>> --download-hypre=yes --with-ml=1 --download-ml=yes --with-mumps=1
>> --download-mumps=yes --with-parmetis=1 --download-parmetis=yes
>> --with-prometheus=1 --download-prometheus=yes --with-scalapack=1
>> --download-scalapack=yes --with-blacs=1 --download-blacs=yes
>>
>> I think I have to set some MUMPS options, but I don't know exactly which ones.
>>
>> Could you tell me what I am doing wrong?
>>
>> Best Regards,
>>
>> Loic
>>


-- 
Loic Gouarin
Laboratoire de Mathématiques
Université Paris-Sud
Bâtiment 425
91405 Orsay Cedex
France
Tel: (+33) 1 69 15 60 14
Fax: (+33) 1 69 15 67 18


