<div dir="auto">This is expected. In parallel, you cannot assume the order of operations is preserved</div><div class="gmail_extra"><br><div class="gmail_quote">Il 13 Mar 2018 8:14 PM, "Tim Steinhoff" <<a href="mailto:kandanovian@gmail.com">kandanovian@gmail.com</a>> ha scritto:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi all,<br>
On 13 Mar 2018, 8:14 PM, "Tim Steinhoff" <kandanovian@gmail.com> wrote:

Hi all,

I get some randomness when solving certain equation systems with MUMPS.
When I repeatedly solve the attached equation system with KSP example
ex10, I get different solution vectors and therefore different residual
norms.

jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations = 1
Residual norm 4.15502e-12

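For reference, the runs above boil down to a load-and-solve driver along these lines. This is a minimal sketch rather than the actual ex10 source; it assumes 1.mat and 1.vec are PETSc binary files as in the command lines above, and it computes the reported residual norm as ||b - A x||_2:

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            b, x, r;
  KSP            ksp;
  PetscViewer    fd;
  PetscReal      norm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Load the matrix from the PETSc binary file used above (1.mat) */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "1.mat", FILE_MODE_READ, &fd);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatLoad(A, fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);

  /* Load the right-hand side (1.vec) */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "1.vec", FILE_MODE_READ, &fd);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecSetFromOptions(b);CHKERRQ(ierr);
  ierr = VecLoad(b, fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);

  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &r);CHKERRQ(ierr);

  /* Solve A x = b; -ksp_type, -pc_type and the MUMPS options are
     picked up from the command line via KSPSetFromOptions() */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  /* Residual norm ||b - A x||_2, the number printed in the runs above */
  ierr = MatMult(A, x, r);CHKERRQ(ierr);
  ierr = VecAYPX(r, -1.0, b);CHKERRQ(ierr);
  ierr = VecNorm(r, NORM_2, &norm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Residual norm %g\n", (double)norm);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Built against the same PETSc installation, such a driver would be launched with the same mpiexec line and solver options shown above.
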
It seems to depend on the combination of the number of processes and the
particular equation system.
I used GCC 7.2.0, Intel 16, MUMPS 5.1.1 / 5.1.2 (with and without
METIS/ParMETIS), and Open MPI 2.1.2, all with the same results.
The PETSc configuration is the current maint branch:
./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3" --CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack

Using "--download-fblaslapack --download-scalapack" didn't make a
difference either.
Can anyone reproduce this issue?

Thanks and kind regards,

Volker