[petsc-users] MPI linear solver reproducibility question

Mark McClure mark at resfrac.com
Sun Apr 2 00:03:48 CDT 2023


In the typical FD implementation, you only set local rows, but with FE and
sometimes FV, you also create values that need to be communicated and
summed on other processors.
Makes sense.

Anyway, in this case, I am certain that I am giving the solver bitwise-identical
matrices from each process. I am using BCGS with no preconditioner, with
PETSc version 3.13.3.

So then, how can I make sure that I am "using an MPI that follows the
suggestion for implementers about determinism"? I am using MPICH version
3.3a2, and I didn't do anything special when installing it. Does that sound
OK? If so, I could upgrade to the latest PETSc, try again, and if I confirm
that the issue persists, I could provide a reproduction scenario.
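(For context on why the MPI implementation matters at all here: floating-point
addition is not associative, so if a reduction sums its contributions in a
run-dependent order, the result can differ bitwise between otherwise identical
runs. A minimal illustrative sketch in plain Python, not PETSc or MPI code:)

```python
# Floating-point addition is not associative: the grouping changes the
# rounding, so the same three numbers can sum to different bit patterns
# depending on the order a reduction happens to use.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # one possible reduction order
right_to_left = a + (b + c)   # another possible reduction order

print(left_to_right == right_to_left)  # False on IEEE-754 doubles
```

This is why an MPI library only gives run-to-run reproducibility if it
commits to a fixed reduction order for a fixed process count and layout.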



On Sat, Apr 1, 2023 at 9:53 PM Jed Brown <jed at jedbrown.org> wrote:

> Mark McClure <mark at resfrac.com> writes:
>
> > Thank you, I will try BCGSL.
> >
> > And good to know that this is worth pursuing, and that it is possible.
> Step
> > 1, I guess I should upgrade to the latest release of PETSc.
> >
> > How can I make sure that I am "using an MPI that follows the suggestion
> for
> > implementers about determinism"? I am using MPICH version 3.3a2.
> >
> > I am pretty sure that I'm assembling the same matrix every time, but I'm
> > not sure how it would depend on 'how you do the communication'. Each
> > process is doing a series of MatSetValues with INSERT_VALUES,
> > assembling the matrix by rows. My understanding of this process is that
> > it'd be deterministic.
>
> In the typical FD implementation, you only set local rows, but with FE and
> sometimes FV, you also create values that need to be communicated and
> summed on other processors.
>
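(A side note on the communicated-and-summed values Jed mentions: one way an
application can defend itself against arrival-order nondeterminism, independent
of what the MPI library guarantees, is to collect all off-process contributions
for a row and sum them in a canonical order. A hedged Python sketch of the idea;
the helper is hypothetical, and real PETSc assembly happens inside
MatAssemblyBegin/End:)

```python
def deterministic_sum(contribs):
    """Sum contributions in a canonical (sorted) order so the result does
    not depend on the order in which messages happened to arrive.
    Note: this makes the sum reproducible, not more accurate."""
    return sum(sorted(contribs))

# Two runs deliver the same off-process contributions for one matrix
# entry, but in different arrival orders.
run1 = [0.1, 1e16, -1e16, 0.2]
run2 = [1e16, -1e16, 0.2, 0.1]

# Naive left-to-right sums differ bitwise...
print(sum(run1) == sum(run2))                              # False
# ...but the canonical-order sums agree exactly.
print(deterministic_sum(run1) == deterministic_sum(run2))  # True
```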
