<div dir="ltr"><div>I'm not sure exactly what you mean, but I'll try to give more details. We have our own DM class (DM_Moose) and we set our own field and domain decomposition routines:</div><div><br></div><div> dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; <br> dm->ops->createdomaindecomposition = DMCreateDomainDecomposition_Moose; <br> </div><div>The field and domain decomposition routines are as follows (can see also at <a href="https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C">https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C</a>):</div><div><br></div><div>static PetscErrorCode<br>DMCreateFieldDecomposition_Moose(<br> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist)<br>{<br> PetscErrorCode ierr;<br> DM_Moose * dmm = (DM_Moose *)(dm->data);<br><br> PetscFunctionBegin;<br> /* Only called after DMSetUp(). */<br> if (!dmm->_splitlocs)<br> PetscFunctionReturn(0);<br> *len = dmm->_splitlocs->size();<br> if (namelist)<br> {<br> ierr = PetscMalloc(*len * sizeof(char *), namelist);<br> CHKERRQ(ierr);<br> }<br> if (islist)<br> {<br> ierr = PetscMalloc(*len * sizeof(IS), islist);<br> CHKERRQ(ierr);<br> }<br> if (dmlist)<br> {<br> ierr = PetscMalloc(*len * sizeof(DM), dmlist);<br> CHKERRQ(ierr);<br> }<br> for (const auto & dit : *(dmm->_splitlocs))<br> {<br> unsigned int d = dit.second;<br> std::string dname = dit.first;<br> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname];<br> if (!dinfo._dm)<br> {<br> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, &dinfo._dm);<br> CHKERRQ(ierr);<br> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, ((PetscObject)dm)->prefix);<br> CHKERRQ(ierr);<br> std::string suffix = std::string("fieldsplit_") + dname + "_";<br> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, suffix.c_str());<br> CHKERRQ(ierr);<br> }<br> ierr = DMSetFromOptions(dinfo._dm);<br> CHKERRQ(ierr);<br> ierr = DMSetUp(dinfo._dm);<br> CHKERRQ(ierr);<br> if (namelist)<br> {<br> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d);<br> CHKERRQ(ierr);<br> }<br> if (islist)<br> {<br> if (!dinfo._rembedding)<br> {<br> IS dembedding, lembedding;<br> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding);<br> CHKERRQ(ierr);<br> if (dmm->_embedding)<br> {<br> // Create a relative embedding into the parent's index space.<br> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, &lembedding);<br> CHKERRQ(ierr);<br> const PetscInt * lindices;<br> PetscInt len, dlen, llen, *rindices, off, i;<br> ierr = ISGetLocalSize(dembedding, &dlen);<br> CHKERRQ(ierr);<br> ierr = ISGetLocalSize(lembedding, &llen);<br> CHKERRQ(ierr);<br> if (llen != dlen)<br> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to embed split %D", d);<br> ierr = ISDestroy(&dembedding);<br> CHKERRQ(ierr);<br> // Convert local embedding to global (but still relative) embedding<br> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices);<br> CHKERRQ(ierr);<br> ierr = ISGetIndices(lembedding, &lindices);<br> CHKERRQ(ierr);<br> ierr = PetscMemcpy(rindices, lindices, llen * sizeof(PetscInt));<br> CHKERRQ(ierr);<br> ierr = ISDestroy(&lembedding);<br> CHKERRQ(ierr);<br> // We could get the index offset from a corresponding global vector, but subDMs don't yet<br> // have global vectors<br> ierr = ISGetLocalSize(dmm->_embedding, &len);<br> CHKERRQ(ierr);<br><br> ierr = MPI_Scan(&len,<br> &off,<br> 1,<br>#ifdef PETSC_USE_64BIT_INDICES<br> MPI_LONG_LONG_INT,<br>#else<br> MPI_INT,<br>#endif<br> MPI_SUM,<br> 
The field and domain decomposition routines are as follows (the full source is also at https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C):

static PetscErrorCode
DMCreateFieldDecomposition_Moose(
    DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist)
{
  PetscErrorCode ierr;
  DM_Moose * dmm = (DM_Moose *)(dm->data);

  PetscFunctionBegin;
  /* Only called after DMSetUp(). */
  if (!dmm->_splitlocs)
    PetscFunctionReturn(0);
  *len = dmm->_splitlocs->size();
  if (namelist)
  {
    ierr = PetscMalloc(*len * sizeof(char *), namelist);
    CHKERRQ(ierr);
  }
  if (islist)
  {
    ierr = PetscMalloc(*len * sizeof(IS), islist);
    CHKERRQ(ierr);
  }
  if (dmlist)
  {
    ierr = PetscMalloc(*len * sizeof(DM), dmlist);
    CHKERRQ(ierr);
  }
  for (const auto & dit : *(dmm->_splitlocs))
  {
    unsigned int d = dit.second;
    std::string dname = dit.first;
    DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname];
    if (!dinfo._dm)
    {
      ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, &dinfo._dm);
      CHKERRQ(ierr);
      ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, ((PetscObject)dm)->prefix);
      CHKERRQ(ierr);
      std::string suffix = std::string("fieldsplit_") + dname + "_";
      ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, suffix.c_str());
      CHKERRQ(ierr);
    }
    ierr = DMSetFromOptions(dinfo._dm);
    CHKERRQ(ierr);
    ierr = DMSetUp(dinfo._dm);
    CHKERRQ(ierr);
    if (namelist)
    {
      ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d);
      CHKERRQ(ierr);
    }
    if (islist)
    {
      if (!dinfo._rembedding)
      {
        IS dembedding, lembedding;
        ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding);
        CHKERRQ(ierr);
        if (dmm->_embedding)
        {
          // Create a relative embedding into the parent's index space.
          ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, &lembedding);
          CHKERRQ(ierr);
          const PetscInt * lindices;
          PetscInt len, dlen, llen, *rindices, off, i;
          ierr = ISGetLocalSize(dembedding, &dlen);
          CHKERRQ(ierr);
          ierr = ISGetLocalSize(lembedding, &llen);
          CHKERRQ(ierr);
          if (llen != dlen)
            SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to embed split %D", d);
          ierr = ISDestroy(&dembedding);
          CHKERRQ(ierr);
          // Convert local embedding to global (but still relative) embedding
          ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices);
          CHKERRQ(ierr);
          ierr = ISGetIndices(lembedding, &lindices);
          CHKERRQ(ierr);
          ierr = PetscMemcpy(rindices, lindices, llen * sizeof(PetscInt));
          CHKERRQ(ierr);
          ierr = ISDestroy(&lembedding);
          CHKERRQ(ierr);
          // We could get the index offset from a corresponding global vector, but subDMs don't yet
          // have global vectors
          ierr = ISGetLocalSize(dmm->_embedding, &len);
          CHKERRQ(ierr);

          ierr = MPI_Scan(&len,
                          &off,
                          1,
#ifdef PETSC_USE_64BIT_INDICES
                          MPI_LONG_LONG_INT,
#else
                          MPI_INT,
#endif
                          MPI_SUM,
                          ((PetscObject)dm)->comm);
          CHKERRQ(ierr);

          off -= len;
          for (i = 0; i < llen; ++i)
            rindices[i] += off;
          ierr = ISCreateGeneral(
              ((PetscObject)dm)->comm, llen, rindices, PETSC_OWN_POINTER, &(dinfo._rembedding));
          CHKERRQ(ierr);
        }
        else
        {
          dinfo._rembedding = dembedding;
        }
      }
      ierr = PetscObjectReference((PetscObject)(dinfo._rembedding));
      CHKERRQ(ierr);
      (*islist)[d] = dinfo._rembedding;
    }
    if (dmlist)
    {
      ierr = PetscObjectReference((PetscObject)dinfo._dm);
      CHKERRQ(ierr);
      (*dmlist)[d] = dinfo._dm;
    }
  }
  PetscFunctionReturn(0);
}

static PetscErrorCode
DMCreateDomainDecomposition_Moose(
    DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** outerislist, DM ** dmlist)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Use DMCreateFieldDecomposition_Moose() to obtain everything but outerislist, which is
   * currently PETSC_NULL. */
  if (outerislist)
    *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */
  ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, innerislist, dmlist);
  CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

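Since the original failure complains about mismatched local sizes, one diagnostic I may add (a hypothetical helper, not part of PetscDMMoose.C; the name CheckSplitLocalSizes, and the assumption that the splits are supposed to partition all local dofs of the system matrix, are mine) is a per-rank consistency check of the split index sets against the matrix layout:

/* Hypothetical diagnostic: confirm that, on each rank, the local sizes of the
 * split ISs sum to the local row count of the full system matrix A. A mismatch
 * here would be consistent with "Local columns of A10 ... do not equal local
 * rows of A00 ...". Assumes the splits together cover every local dof. */
static PetscErrorCode
CheckSplitLocalSizes(DM dm, Mat A)
{
  PetscErrorCode ierr;
  PetscInt nsplits, i, sum = 0, isloc, mloc, nloc;
  char ** names;
  IS * fields;

  PetscFunctionBegin;
  ierr = DMCreateFieldDecomposition(dm, &nsplits, &names, &fields, PETSC_NULL);
  CHKERRQ(ierr);
  ierr = MatGetLocalSize(A, &mloc, &nloc);
  CHKERRQ(ierr);
  for (i = 0; i < nsplits; ++i)
  {
    ierr = ISGetLocalSize(fields[i], &isloc);
    CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(
        ((PetscObject)dm)->comm, "split %s: local IS size %D\n", names[i], isloc);
    CHKERRQ(ierr);
    sum += isloc;
  }
  ierr = PetscSynchronizedFlush(((PetscObject)dm)->comm, PETSC_STDOUT);
  CHKERRQ(ierr);
  if (sum != mloc)
    SETERRQ2(((PetscObject)dm)->comm,
             PETSC_ERR_PLIB,
             "Local sizes of split ISs sum to %D but the matrix has %D local rows",
             sum,
             mloc);
  for (i = 0; i < nsplits; ++i)
  {
    ierr = PetscFree(names[i]);
    CHKERRQ(ierr);
    ierr = ISDestroy(&fields[i]);
    CHKERRQ(ierr);
  }
  ierr = PetscFree(names);
  CHKERRQ(ierr);
  ierr = PetscFree(fields);
  CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

If that check fires only on some ranks, it would at least localize which processes disagree about the split sizes before the fieldsplit setup hits the A10/A00 mismatch.
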
On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley <knepley@gmail.com> wrote:
> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay <alexlindsay239@gmail.com> wrote:
>> I have errors like the following on quite a few (but not all) processes
>>
>> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [1]PETSC ERROR: Nonconforming object sizes
>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of A00 4129
>>
>> when performing field splits. We (MOOSE) have some code for identifying the index sets for each split; however, that code was written by authors who are no longer with us. Normally I would chase this down in a debugger, but the error only seems to crop up for fairly complex and large meshes. If anyone has an idea of what we might be doing wrong, it might help me track this down faster. Intuitively, I'm perplexed that we could get ourselves into this pickle, because it almost appears that we have two different local dof index counts for a given block (0 in this case). More background, if helpful, can be found in https://github.com/idaholab/moose/issues/22359 as well as https://github.com/idaholab/moose/discussions/22468.
>
> How are you specifying the blocks? I would not have thought this was possible.
>
>   Thanks,
>
>      Matt
>
>> I should note that we are currently running with 3.16.6 as our PETSc submodule hash (we are talking about updating to 3.18 soon).
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/