[petsc-users] petsc compiled without MPI

Long, Jianbo jl7037 at mun.ca
Mon Feb 27 07:41:19 CST 2023


Thanks for the explanations! It turns out the issue with the sequentially
compiled PETSc was the PetscFinalize() call. Since my subroutine that
calls PETSc functions needs to be invoked multiple times in the program,
I had to remove PetscFinalize() from the end of the subroutine; otherwise,
on the next call of the subroutine, PETSc would stop and throw an error
about MPI_Comm_set_errhandler!
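For reference, a minimal C sketch of the pattern that avoids this: initialize and finalize PETSc exactly once, at program scope, rather than inside the repeatedly called routine. Here my_solver_routine is a hypothetical stand-in for the subroutine, and PetscCall assumes PETSc 3.17 or later (older versions use the ierr/CHKERRQ idiom instead):

```c
#include <petsc.h>

/* Hypothetical routine that uses PETSc and is called many times.
   Note: no PetscFinalize() in here. */
static PetscErrorCode my_solver_routine(void)
{
  PetscFunctionBeginUser;
  /* ... create/solve/destroy PETSc objects as needed ... */
  PetscFunctionReturn(PETSC_SUCCESS);
}

int main(int argc, char **argv)
{
  /* Initialize once, at program start */
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  for (int i = 0; i < 3; i++)
    PetscCall(my_solver_routine());  /* safe to call repeatedly */

  /* Finalize once, at program end */
  PetscCall(PetscFinalize());
  return 0;
}
```

If the calling program cannot easily control where initialization happens, PetscInitialized()/PetscFinalized() can be queried to guard against calling PetscInitialize() twice.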

Jianbo

On Sun, Feb 26, 2023 at 4:39 PM Satish Balay <balay at mcs.anl.gov> wrote:

> On Sun, 26 Feb 2023, Pierre Jolivet wrote:
>
> >
> >
> > > On 25 Feb 2023, at 11:44 PM, Long, Jianbo <jl7037 at mun.ca> wrote:
> > >
> > > Hello,
> > >
> > > For some of my applications, I need to use petsc without mpi, or use
> it sequentially. I wonder where I can find examples/tutorials for this ?
> >
> > You can run sequentially with just a single MPI process (-n 1).
>
> even if you build with mpich/openmpi - you can run sequentially without
> mpiexec - i.e:
>
> ./binary
>
> One reason to do this [instead of building PETSc with --with-mpi=0] is
> if you are mixing in multiple pkgs that have MPI dependencies [in which
> case - it's best to build all these pkgs with the same mpich or openmpi -
> but still run sequentially].
>
> Satish
>
> > If you need to run without MPI whatsoever, you’ll need a
> separate PETSc installation configured with --with-mpi=0.
> > In both cases, the same user-code will run, i.e., all PETSc examples
> available with the sources will work (though some are designed purely for
> parallel experiments and may error out early on purpose).
> >
> > Thanks,
> > Pierre
> >
> > > Thanks very much,
> > > Jianbo Long
> >
> >
>
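The two options discussed in the thread can be summarized as shell commands; the configure line and binary name below are illustrative, not taken from the original messages:

```shell
# Option 1: PETSc built against mpich/openmpi, run on a single process.
# No mpiexec is needed for a sequential run:
./binary
# which is equivalent to:
mpiexec -n 1 ./binary

# Option 2: a separate PETSc installation with MPI disabled entirely,
# configured (from the PETSc source directory) with:
./configure --with-mpi=0
```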