[petsc-users] Non deterministic results with MUMPS?

Stefano Zampini stefano.zampini at gmail.com
Tue Mar 13 12:17:49 CDT 2018


This is expected. In parallel, you cannot assume that the order of
operations is preserved from run to run, and since floating-point
arithmetic is not associative, the computed results can differ in the
last few digits.
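
As a generic illustration (not PETSc or MUMPS code), the small C program
below shows that summing the same values in two different orders can
already change the result; the same thing happens when parallel
reductions receive their contributions in a different order:

#include <stdio.h>

int main(void)
{
    /* The same three values summed in two different orders. */
    double a = 1.0e16, b = -1.0e16, c = 1.0;

    double left  = (a + b) + c;  /* exact: 0 + 1 = 1 */
    double right = a + (b + c);  /* b + c rounds back to b, so the sum is 0 */

    printf("(a + b) + c = %g\n", left);   /* prints 1 */
    printf("a + (b + c) = %g\n", right);  /* prints 0 */
    return 0;
}

The run-to-run differences in the residual norm reported below are of the
same nature.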

On 13 Mar 2018, 8:14 PM, "Tim Steinhoff" <kandanovian at gmail.com> wrote:

> Hi all,
>
> I get some randomness when solving certain equation systems with MUMPS.
> When I repeatedly solve the attached equation system by ksp example
> 10, I get different solution vectors and therefore different residual
> norms.
>
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.15502e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.15502e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.17364e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.17364e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.17364e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.15502e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.15502e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.17364e-12
> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
> -pc_type lu -pc_factor_mat_solver_package mumps
> Number of iterations =   1
> Residual norm 4.15502e-12
>
> It seems to depend on the combination of the number of processes and
> the particular equation system.
> I used GCC 7.2.0, Intel 16, MUMPS 5.1.1 / 5.1.2 (with and without
> METIS/ParMETIS), and Open MPI 2.1.2, all with the same results.
> PETSc configuration is the current maint branch:
> ./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3"
> --CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack
>
> Using "--download-fblaslapack --download-scalapack" didn't make a
> difference either.
> Can anyone reproduce that issue?
>
> Thanks and kind regards,
>
> Volker
>

