[petsc-users] Non deterministic results with MUMPS?

Tim Steinhoff kandanovian at gmail.com
Tue Mar 13 12:14:53 CDT 2018


Hi all,

I get some randomness when solving certain equation systems with MUMPS.
When I repeatedly solve the attached equation system with KSP tutorial
example 10 (ex10), I get different solution vectors and therefore
different residual norms.

jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps

Nine consecutive runs of this command each report "Number of iterations
=   1", but the residual norm flips between two values:

run 1: Residual norm 4.15502e-12
run 2: Residual norm 4.15502e-12
run 3: Residual norm 4.17364e-12
run 4: Residual norm 4.17364e-12
run 5: Residual norm 4.17364e-12
run 6: Residual norm 4.15502e-12
run 7: Residual norm 4.15502e-12
run 8: Residual norm 4.17364e-12
run 9: Residual norm 4.15502e-12
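
For anyone who wants to reproduce this without retyping the command, a
quick shell loop (assuming the same ex10 build and input files as
above) tallies the distinct residual norms over 20 runs:

for i in $(seq 1 20); do
  mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly \
    -pc_type lu -pc_factor_mat_solver_package mumps
done | grep 'Residual norm' | sort | uniq -c

A deterministic solve should print a single line here; I get two.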

It seems to depend on the combination of the number of processes and
the particular equation system.
I tried GCC 7.2.0 and Intel 16, MUMPS 5.1.1 and 5.1.2 (with and
without METIS/ParMETIS), and Open MPI 2.1.2, all with the same results.
PETSc is built from the current maint branch, configured with:
./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3"
--CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack

Using "--download-fblaslapack --download-scalapack" didnt make a
difference neither.
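
One thing I have not tried yet (so treat this as an assumption, not a
finding): pinning the fill-reducing ordering through PETSc's MUMPS
ICNTL passthrough options, to rule out a run-to-run change of
ordering, e.g.

mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly \
  -pc_type lu -pc_factor_mat_solver_package mumps \
  -mat_mumps_icntl_7 2   # ICNTL(7)=2: fixed sequential AMF ordering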
Can anyone reproduce that issue?

Thanks and kind regards,

Volker
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mumps-eqs.tar.gz
Type: application/x-gzip
Size: 14140 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180313/424403c6/attachment.gz>

