[petsc-users] Partitioning a mesh with petsc4py

Javier Quinteros javier at gfz-potsdam.de
Wed Jul 8 06:53:06 CDT 2015



Hi Matthew.
Thanks for the answer. We've checked again and it is actually working
with Chaco. But we haven't found any example of, or way to, use it with
METIS, which is what we wanted.

I've seen that there is a new Partitioner class, but I could not find
how to pass it to the DMPlex (in petsc4py).

Thanks in advance for any hint.

Javier


On 08.07.2015 at 12:47, Matthew Knepley wrote:
> On Wed, Jul 8, 2015 at 5:16 AM, Javier Quinteros
> <javier at gfz-potsdam.de <mailto:javier at gfz-potsdam.de>> wrote:
> 
> Hi all. We use petsc4py to develop. We are trying to partition a
> mesh (DMPlex), but we receive different errors when calling the
> "distribute" method of the "DMPlex" class.
> 
> We use version 3.6 of both PETSc and petsc4py. We run it on two
> different installations. One was built with
> --download-metis and the other with --download-chaco, but the error
> is always:
> 
> File "PETSc/DMPlex.pyx", line 514, in petsc4py.PETSc.DMPlex.distribute (src/petsc4py.PETSc.c:220066)
>   self.dm.distribute()
> petsc4py.PETSc.Error: error code 56
> [0] DMPlexDistribute() line 1505 in /home/javier/soft/petsc-3.6.0/src/dm/impls/plex/plexdistribute.c
> [0] PetscPartitionerPartition() line 653 in /home/javier/soft/petsc-3.6.0/src/dm/impls/plex/plexpartition.c
> [0] PetscPartitionerPartition_Chaco() line 1048 in /home/javier/soft/petsc-3.6.0/src/dm/impls/plex/plexpartition.c
> [0] No support for this operation for this object type
> [0] Mesh partitioning needs external package support. Please reconfigure with --download-chaco.
> 
> 
>> Can you send the configure.log from the installation using
>> --download-chaco?
>>
>>   Thanks,
>>
>>      Matt
> 
> 
> We build our mesh using "createFromCellList" as in Firedrake (the
> plex is empty on ranks other than 0), where there is the following
> comment:
> 
> # Provide empty plex on other ranks
> # A subsequent call to plex.distribute() takes care of parallel
> # partitioning
> 
> But for us it is not working.
> 
> Does anyone have a working example of how to partition a mesh in 
> petsc4py?
> 
> Thanks in advance!
> 
> 
> 
> 
> 
> -- What most experimenters take for granted before they begin
> their experiments is infinitely more interesting than any results
> to which their experiments lead. -- Norbert Wiener

-- 
Javier Quinteros
-------------------------------------------
2.4/Seismologie
Tel.: +49 (0)331/288-1931
Fax:  +49 (0)331/288-1277
Email: javier at gfz-potsdam.de
___________________________________

Helmholtz-Zentrum Potsdam
Deutsches GeoForschungsZentrum GFZ
Stiftung des öff. Rechts Land Brandenburg
Telegrafenberg, 14473 Potsdam

