MeshDistribute() and Chaco

Shi Jin jinzishuai at yahoo.com
Thu Jun 14 22:24:44 CDT 2007


Hi,

I am trying to study the new unstructured mesh part
provided by the new Petsc. I have one particular
question  with regard to the function call
MeshDistribute(serialMesh, PETSC_NULL, &parallelMesh);

This call clearly needs the Chaco package in order to
run in parallel; otherwise I get a warning telling me to
configure with --download-chaco.
So I ran it with Chaco installed. The code works and the
domain decomposition is valid, but it is far from good.
In the attached image, I show the domain decomposition
of a two-dimensional 1x1 box over two processes,
produced by the following statements:
MeshCreatePCICE(comm, 3, "bratu_2d.nodes", "bratu_2d.lcon", PETSC_FALSE, PETSC_FALSE, &serialMesh);
MeshDistribute(serialMesh, PETSC_NULL, &parallelMesh);

I am wondering if it is possible to improve the domain
decomposition by passing some command-line arguments. I
noticed that there are Chaco options such as
-mat_partitioning_chaco_global (found at
http://www-unix.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/MatOrderings/MAT_PARTITIONING_Chaco.html).
However, when I pass these options to my code, they are
not recognized. I wonder how Chaco is used inside PETSc
and how I can change its behavior.

In addition, is it possible to use another graph
partitioning package, such as ParMETIS, for
MeshDistribute()? I tried configuring with ParMETIS but
without Chaco; the code then refused to run and told me
to install Chaco.

Any advice would be much appreciated. Thank you very much.

Shi


 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: NP2.png
Type: image/png
Size: 41739 bytes
Desc: 88704069-NP2.png
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20070614/d80acd8d/attachment.png>


More information about the petsc-users mailing list