[petsc-users] [petsc4py] Assembly fails

Alejandro Aragon - 3ME A.M.Aragon at tudelft.nl
Sat Mar 28 12:29:55 CDT 2020


Dear Matthew,

Thanks for your email. I first tried what you suggested and it didn’t work. However, I then also tried calling the same function before creating the Section, and that worked! This is the working code:


import numpy as np


def createfields(dm):
    """Set up the solution field"""

    dim = dm.getDimension()
    # The number of solution fields
    numFields = 1
    dm.setNumFields(numFields)
    # numComp - An array of size numFields that holds the number of components for each field
    numComp = np.array([dim], dtype=np.int32)
    # numDof - An array of size numFields*(dim+1) which holds
    # the number of dof for each field on a mesh piece of dimension d
    numDof = np.zeros(numFields*(dim+1), dtype=np.int32)
    numDof[0] = dim # u is defined on vertices

    # Create a PetscSection based upon the dof layout specification provided
    # PetscSection: Mapping from integers in a designated range to contiguous sets of integers
    section = dm.createSection(numComp, numDof)
    # Sets the name of a field in the PetscSection, 0 is the field number and "u" is the field name
    section.setFieldName(0, "u")
    # Set the PetscSection encoding the local data layout for the DM
    dm.setDefaultSection(section)

    return dm
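
For context, here is a minimal sketch of how I drive this helper; the mesh file name and the driver lines below are placeholders rather than my actual script:

import numpy as np
from petsc4py import PETSc

# Hypothetical driver: read a Gmsh mesh, distribute it, and attach the field layout
dm = PETSc.DMPlex().createFromFile('mesh.msh')  # placeholder file name
dm.distribute()
dm = createfields(dm)

# With the section attached, the DM can size the algebraic objects consistently
u = dm.createGlobalVec()    # global solution vector laid out according to the section
ul = dm.createLocalVec()    # local (ghosted) vector for element-level assembly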

The question I now have is whether PETSc 3.13 and the matching petsc4py change this behavior again, to the point that this code no longer works. Would that be the case?
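
For what it’s worth, my understanding (an assumption on my part, not something I have verified against 3.13) is that on the C side DMSetDefaultSection was renamed to DMSetSection and later to DMSetLocalSection, so if the old spelling ever disappears from petsc4py, the last call in the function above would presumably become:

    # assumed newer spelling of dm.setDefaultSection(section)
    dm.setLocalSection(section)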

Best regards,

— Alejandro

On 28 Mar 2020, at 18:14, Matthew Knepley <knepley at gmail.com> wrote:

On Fri, Mar 27, 2020 at 10:09 AM Matthew Knepley <knepley at gmail.com> wrote:
On Fri, Mar 27, 2020 at 3:31 AM Alejandro Aragon - 3ME <A.M.Aragon at tudelft.nl> wrote:
Dear Matthew,

Thanks for your email. I have attached the python code that reproduces the following error in my computer:

I think I see the problem. There were changes in DM in order to support fields which only occupy part of the domain.
Now you need to tell the DM about the fields before it builds a Section. I think in your code, you only need

  f = PetscContainer()
  f.setName("potential")
  dm.addField(field = f)

So Nicolas Barral found a much better way to do this. You only need

  dm.setNumFields(1)

  Thanks,

     Matt

from https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMAddField.html before the createSection().
My Python may not be correct since I never use that interface.

  Thanks,

     Matt

(.pydev) ➜  dmplex_fem mpirun -np 2  python Cpp2Python.py
Traceback (most recent call last):
  File "Cpp2Python.py", line 383, in <module>
    sys.exit(Cpp2Python())
  File "Cpp2Python.py", line 357, in Cpp2Python
    dm = createfields(dm)
  File "Cpp2Python.py", line 62, in createfields
    section.setFieldName(0, "u")
  File "PETSc/Section.pyx", line 59, in petsc4py.PETSc.Section.setFieldName
petsc4py.PETSc.Error: error code 63
[1] PetscSectionSetFieldName() line 427 in /private/tmp/pip-install-laf1l3br/petsc/src/vec/is/section/interface/section.c
[1] Argument out of range
[1] Section field 0 should be in [0, 0)
Traceback (most recent call last):
  File "Cpp2Python.py", line 383, in <module>
    sys.exit(Cpp2Python())
  File "Cpp2Python.py", line 357, in Cpp2Python
    dm = createfields(dm)
  File "Cpp2Python.py", line 62, in createfields
    section.setFieldName(0, "u")
  File "PETSc/Section.pyx", line 59, in petsc4py.PETSc.Section.setFieldName
petsc4py.PETSc.Error: error code 63
[0] PetscSectionSetFieldName() line 427 in /private/tmp/pip-install-laf1l3br/petsc/src/vec/is/section/interface/section.c
[0] Argument out of range
[0] Section field 0 should be in [0, 0)
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[23972,1],0]
  Exit code:    1
--------------------------------------------------------------------------

I’m using Python 3.8, and this is the output of ‘pip freeze’:

(.pydev) ➜  dmplex_fem pip freeze
cachetools==4.0.0
cycler==0.10.0
kiwisolver==1.1.0
llvmlite==0.31.0
matplotlib==3.2.1
mpi4py==3.0.3
numba==0.48.0
numpy==1.18.2
petsc==3.12.4
petsc4py==3.12.0
plexus==0.1.0
pyparsing==2.4.6
python-dateutil==2.8.1
scipy==1.4.1
six==1.14.0

I’m looking forward to getting your insight on the issue.
Best regards,

— Alejandro



On 25 Mar 2020, at 17:37, Matthew Knepley <knepley at gmail.com> wrote:

On Wed, Mar 25, 2020 at 12:29 PM Alejandro Aragon - 3ME <A.M.Aragon at tudelft.nl> wrote:
Dear everyone,

I’m new to petsc4py, and I’m trying to run a simple finite element code that uses DMPLEX to load a .msh file (created by Gmsh). The code was working in version 3.10, but I recently upgraded to 3.12 and now get the following error:

(.pydev) ➜  testmodule git:(e0bc9ae) ✗ mpirun -np 2 python testmodule/__main__.py
{3: <testmodule.constitutive.elastic.Elastic object at 0x10feea520>}
{3: <testmodule.constitutive.elastic.Elastic object at 0x10d96d520>}
Traceback (most recent call last):
  File "testmodule/__main__.py", line 32, in <module>
    sys.exit(main(sys.argv))
  File "testmodule/__main__.py", line 29, in main
    step.solve(m)
  File "/Users/aaragon/Local/testmodule/testmodule/fem/analysis/static.py", line 33, in solve
    self.Amat.assemblyBegin(assembly=0)  # FINAL_ASSEMBLY = 0
  File "PETSc/Mat.pyx", line 1039, in petsc4py.PETSc.Mat.assemblyBegin
petsc4py.PETSc.Error: error code 63
[1] MatAssemblyBegin() line 5182 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/interface/matrix.c
[1] MatAssemblyBegin_MPIAIJ() line 810 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/impls/aij/mpi/mpiaij.c
[1] MatStashScatterBegin_Private() line 462 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/utils/matstash.c
[1] MatStashScatterBegin_BTS() line 931 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/utils/matstash.c
[1] PetscCommBuildTwoSidedFReq() line 555 in /private/tmp/pip-install-zurcx_6k/petsc/src/sys/utils/mpits.c
[1] Argument out of range
[1] toranks[0] 2 not in comm size 2
Traceback (most recent call last):
  File "testmodule/__main__.py", line 32, in <module>
    sys.exit(main(sys.argv))
  File "testmodule/__main__.py", line 29, in main
    step.solve(m)
  File "/Users/aaragon/Local/testmodule/testmodule/fem/analysis/static.py", line 33, in solve
    self.Amat.assemblyBegin(assembly=0)  # FINAL_ASSEMBLY = 0
  File "PETSc/Mat.pyx", line 1039, in petsc4py.PETSc.Mat.assemblyBegin
petsc4py.PETSc.Error: error code 63
[0] MatAssemblyBegin() line 5182 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/interface/matrix.c
[0] MatAssemblyBegin_MPIAIJ() line 810 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/impls/aij/mpi/mpiaij.c
[0] MatStashScatterBegin_Private() line 462 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/utils/matstash.c
[0] MatStashScatterBegin_BTS() line 931 in /private/tmp/pip-install-zurcx_6k/petsc/src/mat/utils/matstash.c
[0] PetscCommBuildTwoSidedFReq() line 555 in /private/tmp/pip-install-zurcx_6k/petsc/src/sys/utils/mpits.c
[0] Argument out of range
[0] toranks[0] 2 not in comm size 2
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[46994,1],0]
  Exit code:    1
--------------------------------------------------------------------------


This is in the call to assembly, which looks like this:


# Begins assembling the matrix. This routine should be called after completing all calls to MatSetValues().
self.Amat.assemblyBegin(assembly=0)  # FINAL_ASSEMBLY = 0
# Completes assembling the matrix. This routine should be called after MatAssemblyBegin().
self.Amat.assemblyEnd(assembly=0)
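
As an aside, I believe the same pair of calls could also be written with petsc4py’s named constant instead of the raw 0, or collapsed into a single call; this is just a sketch of what I understand to be the equivalent form, assuming the standard petsc4py Mat API:

from petsc4py import PETSc

# Equivalent, more explicit spelling of the two calls above (sketch)
self.Amat.assemblyBegin(PETSc.Mat.AssemblyType.FINAL_ASSEMBLY)
self.Amat.assemblyEnd(PETSc.Mat.AssemblyType.FINAL_ASSEMBLY)
# or, in one shot:
self.Amat.assemble()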

I would appreciate it if someone could give me some insight into what has changed in the new version of petsc4py (or PETSc, for that matter), so that I can make this code work again.

It looks like you have an inconsistent build, or a memory overwrite. Since you are in Python, I suspect the former. Can you build
PETSc from scratch and try this? Does it work in serial? Can you send a small code that reproduces this?

  Thanks,

     Matt

Best regards,

— Alejandro



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


