[petsc-users] Questions regarding nested field split
Barry Smith
bsmith at petsc.dev
Fri Feb 25 15:16:23 CST 2022
For parallel layouts, each MPI rank (parallel process) will generally have roughly the same number of indices for each field, so you would not end up with entire fields on one or a small number of processes. This is done by using interlaced storage of variables (check the docs) rather than by having all of the first type of variable, followed by all of the second type of variable, etc.
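As a rough illustration, here is a minimal Fortran sketch of such an interlaced layout with two fields (u and p) stored as u0,p0,u1,p1,...; the field names, the problem size, and the use of ISCreateStride are assumptions of the sketch, not taken from your code:

      program interlaced_fs
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none

      Vec            x
      KSP            ksp
      PC             pc
      IS             is_u, is_p
      PetscInt       npoints, rstart, rend, nlocal, two
      PetscErrorCode ierr

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

      ! interlaced global vector: two unknowns (u,p) per point
      npoints = 100
      two     = 2
      call VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, two*npoints, x, ierr)

      ! every rank owns a contiguous slice of the interlaced ordering,
      ! so it automatically holds a piece of BOTH fields
      call VecGetOwnershipRange(x, rstart, rend, ierr)
      nlocal = (rend - rstart)/2

      ! each rank lists only its own entries of each field:
      ! u sits at even offsets, p at odd offsets of the owned range
      call ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart,   two, is_u, ierr)
      call ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart+1, two, is_p, ierr)

      call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
      call KSPGetPC(ksp, pc, ierr)
      call PCSetType(pc, PCFIELDSPLIT, ierr)
      call PCFieldSplitSetIS(pc, 'u', is_u, ierr)
      call PCFieldSplitSetIS(pc, 'p', is_p, ierr)

      call ISDestroy(is_u, ierr)
      call ISDestroy(is_p, ierr)
      call VecDestroy(x, ierr)
      call KSPDestroy(ksp, ierr)
      call PetscFinalize(ierr)
      end program interlaced_fs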
For parallel domain-decomposition-based preconditioners one usually uses PCBJACOBI or PCASM, which can automatically split the unknowns by rank.
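Continuing the sketch above (again just an assumed illustration, reusing the ksp, pc, and ierr variables from it), that amounts to:

      ! rank-based domain decomposition: PCBJACOBI (or PCASM) splits the
      ! unknowns by MPI rank automatically, no index sets required
      call KSPGetPC(ksp, pc, ierr)
      call PCSetType(pc, PCBJACOBI, ierr)   ! or PCASM
      call KSPSetFromOptions(ksp, ierr)     ! -pc_type bjacobi / -pc_type asm at run time also works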
Barry
> On Feb 25, 2022, at 2:38 PM, Sundar Namala <solomon.sundar.n at gmail.com> wrote:
>
> Hi, I am currently using fieldsplit and I am creating the fields using ISCreateGeneral. The programming is being carried out in Fortran. I have a couple of questions regarding fieldsplit in parallel.
>
> Do we need to create the index list of all the fields separately for each processor?
>
> For example, say I have 3 fields, and the indices for field_0 are 0-99, field_1 is 100-299, and field_2 is 300-349. In the case of 2 processors, do I have to specify the indices on the first processor as field_0 is 0-99, field_1 is 100-174, and field_2 is null, and on the second processor field_0 is null, field_1 is 175-299, and field_2 is 300-349?
>
> My second question is: if the indices need to be listed separately, how do you assign the null (empty) index list using ISCreateGeneral?
>
> Thanks,
> Sundar.