[petsc-users] Stokes problem with DA and MUMPS

gouarin loic.gouarin at math.u-psud.fr
Fri Feb 18 03:22:48 CST 2011


Hi,

I want to solve a 3D Stokes problem with a 4Q1/Q1 finite element 
discretization. I have written a parallel version and it works for very 
small sizes, but it is very slow.

I use a DA to create my grid. Here is an example:

DACreate3d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_BOX,
           nv, nv, nv,                               /* global grid size */
           PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* process layout */
           4,                                        /* dof per node: u,v,w,p */
           2,                                        /* stencil width */
           PETSC_NULL, PETSC_NULL, PETSC_NULL, &da);

The matrix problem has the form

[ A    B1]
[ B2   0 ]

and the preconditioner is

[ A  0]
[ 0  M]

I use FGMRES to solve my system and I want to use MUMPS to solve the 
linear system Ax=b in the preconditioning step.
I want to solve this problem several times with different right-hand 
sides.
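
Here is a minimal sketch of how I organize the solves (S, Spre, b, x 
and nrhs are just placeholders for my matrices, vectors and number of 
right-hand sides). Since the operators are set only once, the MUMPS 
factorization is computed at the first KSPSolve and reused afterwards:

    KSP ksp;
    PetscInt i;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOptionsPrefix(ksp, "stokes_");
    KSPSetOperators(ksp, S, Spre, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp);
    for (i = 0; i < nrhs; i++) {
      /* assemble the i-th right-hand side into b ... */
      KSPSolve(ksp, b, x);  /* factorization done once, then reused */
    }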

The first problem appears when I call

   Mat A, B;
   DAGetMatrix(da, MATAIJ, &A);   /* AIJ matrix preallocated from the DA stencil */
   DAGetMatrix(da, MATAIJ, &B);

It takes a long time in 3D and I don't understand why. I still use the 
debug version of PETSc, but I don't think that is the problem.
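
To see where the time goes, I can wrap these calls in a logging stage 
and run with -log_summary (a minimal sketch):

    PetscLogStage stage;
    PetscLogStageRegister("MatCreation", &stage);
    PetscLogStagePush(stage);
    DAGetMatrix(da, MATAIJ, &A);
    DAGetMatrix(da, MATAIJ, &B);
    PetscLogStagePop();

With -log_summary the time spent in this stage is then reported 
separately.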

After that, MUMPS cannot perform the factorization for nv=33 in my grid 
definition (the case nv=19 works) because there is not enough memory. I 
get the error

[0]PETSC ERROR: Error reported by MUMPS in numerical factorization 
phase: Cannot allocate required memory 1083974153 megabytes

But my problem is not very big for the moment! I want to be able to 
solve the Stokes problem on a 128x128x128 grid for the velocity field.
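
For reference: nv=33 gives 33^3 = 35,937 nodes, i.e. 4 x 35,937 = 
143,748 unknowns with 4 degrees of freedom per node, and a 128^3 grid 
would give about 8.4 million unknowns.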

Here is the script I use to launch the program:

mpiexec -np 1 ./stokesPart \
      -stokes_ksp_type fgmres \
      -stokes_ksp_rtol 1e-6 \
      -stokes_pc_type fieldsplit \
      -stokes_pc_fieldsplit_block_size 4 \
      -stokes_pc_fieldsplit_0_fields 0,1,2 \
      -stokes_pc_fieldsplit_1_fields 3 \
      -stokes_fieldsplit_0_ksp_type preonly \
      -stokes_fieldsplit_0_pc_type lu \
      -stokes_fieldsplit_0_pc_factor_mat_solver_package mumps \
      -stokes_fieldsplit_1_ksp_type preonly \
      -stokes_fieldsplit_1_pc_type jacobi \
      -stokes_ksp_monitor_short

I configured PETSc-3.1-p7 with the options

--with-mpi4py=1 --download-mpi4py=yes --with-petsc4py=1 
--download-petsc4py=yes --with-shared --with-dynamic --with-hypre=1 
--download-hypre=yes --with-ml=1 --download-ml=yes --with-mumps=1 
--download-mumps=yes --with-parmetis=1 --download-parmetis=yes 
--with-prometheus=1 --download-prometheus=yes --with-scalapack=1 
--download-scalapack=yes --with-blacs=1 --download-blacs=yes

I think I need to set some MUMPS options, but I don't know exactly 
which ones.
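
From the MUMPS documentation, ICNTL(14) controls the percentage 
increase of the estimated working space, and PETSc exposes it as 
-mat_mumps_icntl_14. So I imagine something like (I am not sure of the 
exact option prefix in the fieldsplit case):

      -stokes_fieldsplit_0_mat_mumps_icntl_14 30

but I have not found the right settings yet.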

Could you tell me what I am doing wrong?

Best Regards,

Loic