[petsc-dev] Controlling matrix type on different levels of multigrid hierarchy? (Motivation is GPUs)
Matthew Knepley
knepley at gmail.com
Wed Jun 12 11:42:13 CDT 2019
On Wed, Jun 12, 2019 at 12:38 PM Mills, Richard Tran via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> Colleagues,
>
> I think we ought to have a way to control which levels of a PETSc
> multigrid solve happen on the GPU vs. the CPU, as I'd like to keep coarse
> levels on the CPU, but run the calculations for finer levels on the GPU.
> Currently, for a code that is using a DM to manage its grid, one can use
> GPUs inside the application of PCMG by putting something like
>
> -dm_mat_type aijcusparse -dm_vec_type cuda
>
> on the command line. What I'd like to be able to do is to also control
> which levels get plain AIJ matrices and which get a GPU type, maybe via
> something like
>
> -mg_levels_N_dm_mat_type aijcusparse -mg_levels_N_dm_vec_type cuda
>
> for level N. (Being able to specify a range of levels would be even nicer,
> but let's start simple.)
>
> Maybe doing the above is as simple as making sure that DMSetFromOptions()
> gets called for the DM on each level. But I think I may be overlooking
> some additional complications. Can someone who knows
> the PCMG framework better chime in? Or do others have ideas for a more
> elegant way of giving this sort of control to the user?
>
PCMG does not call DMSetFromOptions() on the DMs it creates with
DMCoarsen(), but it should. We would just first check the flag on the
original DM to see whether it was set from options. Search for DMCoarsen()
in mg.c and stick the call in.
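
To make it concrete, here is an untested sketch. The variable names are
guessed from memory of mg.c, not copied from it, and the prefix handling
is just one possible scheme, not settled API:

  #include <petsc/private/dmimpl.h> /* exposes dm->setfromoptionscalled */

  /* Inside the coarsening loop of PCSetUp_MG() in
     src/ksp/pc/impls/mg/mg.c */
  ierr = DMCoarsen(dms[i+1], MPI_COMM_NULL, &dms[i]);CHKERRQ(ierr);
  if (dms[i+1]->setfromoptionscalled) {
    const char *prefix;
    /* Reuse the level smoother's prefix so hypothetical options like
       -mg_levels_1_dm_mat_type aij can target this level's DM. */
    ierr = KSPGetOptionsPrefix(mglevels[i]->smoothd, &prefix);CHKERRQ(ierr);
    ierr = DMSetOptionsPrefix(dms[i], prefix);CHKERRQ(ierr);
    /* Calling DMSetFromOptions() here also sets the coarse DM's flag, so
       the check propagates down the hierarchy on later iterations. */
    ierr = DMSetFromOptions(dms[i]);CHKERRQ(ierr);
  }

Whether the prefix should instead be attached inside DMCoarsen() itself is
a separate question.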
Thanks,
Matt
> Best regards,
> Richard
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/