[petsc-users] Better solver and preconditioner to use multiple GPU
Ramoni Z. Sedano Azevedo
ramoni.zsedano at gmail.com
Thu Nov 9 12:54:36 CST 2023
We are solving the direct (forward) problem of Controlled Source Electromagnetics
(CSEM) using a finite-difference discretization.
On Wed, Nov 8, 2023 at 1:22 PM, Jed Brown <jed at jedbrown.org> wrote:
> What sort of problem are you solving? Algebraic multigrid like gamg or
> hypre are good choices for elliptic problems. Sparse triangular solves have
> horrific efficiency even on one GPU, so you generally want to do your best
> to stay away from them.
>
> "Ramoni Z. Sedano Azevedo" <ramoni.zsedano at gmail.com> writes:
>
> > Hey!
> >
> > I am using PETSc in a Fortran code, and we use MPI to parallelize it.
> >
> > At the moment, the options that have been used are
> > -ksp_monitor_true_residual
> > -ksp_type bcgs
> > -pc_type bjacobi
> > -sub_pc_type ilu
> > -sub_pc_factor_levels 3
> > -sub_pc_factor_fill 6
> >
> > Now, we want to use multiple GPUs and I would like to know if there is a
> > better solver and preconditioner pair to apply in this case.
> >
> > Yours sincerely,
> > Ramoni Z. S. Azevedo
>
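For reference, a minimal sketch of a GPU-oriented option set along the lines
suggested above (assuming a CUDA-enabled PETSc build, an operator elliptic
enough for algebraic multigrid, and that the code calls MatSetFromOptions()
and VecSetFromOptions() so the type options below are actually picked up)
could be:

-ksp_monitor_true_residual
-ksp_type bcgs
-pc_type gamg
-mat_type aijcusparse
-vec_type cuda

With something like this, the Krylov and multigrid work runs on the GPUs while
MPI still distributes the problem across ranks; the block Jacobi/ILU
combination is dropped because the sparse triangular solves it relies on
perform poorly on GPUs.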