[petsc-users] Non deterministic results with MUMPS?
Tim Steinhoff
kandanovian at gmail.com
Tue Mar 13 13:10:11 CDT 2018
Thanks for your fast reply.
I see that I can't expect the same results when changing the number of
processes, but how does MPI change the order of operations when there
are, for example, 2 processes and the partitioning is fixed?
With GMRES I could not reproduce that behavior, no matter how many
processes I used.
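If the additions really happen in a different order, I can see how that
changes the last bits; a minimal C sketch of the effect (arbitrary
numbers, not meant to reflect anything MUMPS does internally):

#include <stdio.h>

int main(void)
{
  double a = 1.0e16, b = 1.0, c = -1.0e16;

  /* The same three numbers added in two different orders: */
  printf("(a + b) + c = %g\n", (a + b) + c); /* b is absorbed into a first: prints 0 */
  printf("(a + c) + b = %g\n", (a + c) + b); /* a and c cancel first: prints 1 */
  return 0;
}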
2018-03-13 18:17 GMT+01:00 Stefano Zampini <stefano.zampini at gmail.com>:
> This is expected. In parallel, you cannot assume the order of operations is
> preserved.
>
> On 13 Mar 2018 8:14 PM, "Tim Steinhoff" <kandanovian at gmail.com> wrote:
>>
>> Hi all,
>>
>> I get some randomness when solving certain equation systems with MUMPS.
>> When I repeatedly solve the attached equation system with KSP example
>> ex10, I get different solution vectors and therefore different residual
>> norms.
>>
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.15502e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.15502e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.17364e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.17364e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.17364e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.15502e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.15502e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.17364e-12
>> jac at jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>> -pc_type lu -pc_factor_mat_solver_package mumps
>> Number of iterations = 1
>> Residual norm 4.15502e-12
>>
>> It seems to depend on the combination of the number of processes and
>> the particular equation system.
>> I used GCC 7.2.0 and Intel 16, MUMPS 5.1.1 / 5.1.2 (with and without
>> METIS/ParMETIS), and Open MPI 2.1.2, all with the same results.
>> PETSc is the current maint branch, configured with:
>> ./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3"
>> --CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack
>>
>> Using "--download-fblaslapack --download-scalapack" didnt make a
>> difference neither.
>> Can anyone reproduce that issue?
>>
>> Thanks and kind regards,
>>
>> Volker