[MOAB-dev] Fwd: About reading vtk in parallel mode.
Iulian Grindeanu
iulian at mcs.anl.gov
Fri Sep 6 11:22:53 CDT 2013
Hi Ilya,
One problem is that the meshes do not cover the same domain. Is this intentional?
Also, you need to use the addfield utility to add the vertex and element tags that are used by default.
After using the addfield utility, you can check your tags with
mbsize -t geom1_p.h5m
So the idea is to transfer one field from one mesh (source) to the other (target). The result will be saved in a new file, which should contain the transferred field.
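If you prefer to create the tags programmatically instead of running addfield, a minimal sketch is below (my assumptions: a dense double tag, and the name vertex_field matching the tag in the test file shown below; the file names are just examples):

  #include "moab/Core.hpp"
  #include <vector>
  using namespace moab;

  int main()
  {
    Core mb;
    // Load the serial source mesh.
    if (MB_SUCCESS != mb.load_file("geom1_c.vtk")) return 1;

    Range verts;
    mb.get_entities_by_dimension(0, 0, verts);

    // Create a dense double tag with default 0.0, like vertex_field below.
    Tag vtag;
    double def_val = 0.0;
    mb.tag_get_handle("vertex_field", 1, MB_TYPE_DOUBLE, vtag,
                      MB_TAG_DENSE | MB_TAG_CREAT, &def_val);

    // Fill in per-vertex values (constant here, just for illustration).
    std::vector<double> vals(verts.size(), 1.0);
    mb.tag_set_data(vtag, verts, &vals[0]);

    // Write the mesh out with the tag attached.
    return MB_SUCCESS == mb.write_file("geom1_field.h5m") ? 0 : 1;
  }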
The files used in the test look like this (source):
mbsize -t ../../../moab/MeshFiles/unittest/64bricks_1khex.h5m
File ../../../moab/MeshFiles/unittest/64bricks_1khex.h5m:
Tag Name              Vertex     Edge     Quad      Hex EntitySet
------------------ -------- -------- -------- -------- ---------
MATERIAL_SET               0        0        0        0        64
NEUMANN_SET                0        0        0        0         3
DIRICHLET_SET              0        0        0        0         3
GEOM_DIMENSION             0        0        0        0       729
GLOBAL_ID               2197     3780     2160     1728      1056
ATTRIB_VECTOR              0        0        0        0       721
CATEGORY                   0        0        0        0       799
PARALLEL_PARTITION         0        0        0        0       256
UNIQUE_ID                  0        0        0        0       729
element_field              0        0        0     1728         0
vertex_field            2197        0        0        0         0
You can check the test in parallel (on 2 processors) in test/parallel/par_coupler_test.
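Stripped to its essence, the parallel load step in those tests looks roughly like this (a sketch, not the exact test code; the option string mirrors the ones discussed below):

  #include "moab/Core.hpp"
  #include "moab/ParallelComm.hpp"
  #include <mpi.h>
  using namespace moab;

  int main(int argc, char** argv)
  {
    MPI_Init(&argc, &argv);
    Core mb;
    // The parallel reader picks up this ParallelComm.
    ParallelComm pcomm(&mb, MPI_COMM_WORLD);

    // Each rank reads its own parts, shared entities are resolved, and
    // one layer of 3d ghost elements is exchanged across interfaces
    // (PARALLEL_GHOSTS=3.0.1 means ghost_dim 3, bridge_dim 0, 1 layer).
    const char* opts =
        "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;"
        "PARALLEL_RESOLVE_SHARED_ENTS;PARALLEL_GHOSTS=3.0.1";

    if (MB_SUCCESS != mb.load_file("64bricks_1khex.h5m", 0, opts))
      MPI_Abort(MPI_COMM_WORLD, 1);

    // ... build a Coupler over the local elements, locate the target
    // points, and interpolate vertex_field ...

    MPI_Finalize();
    return 0;
  }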
Best Regards,
Iulian
----- Original Message -----
| I found some information about MBZoltan
| http://lists.mcs.anl.gov/pipermail/moab-dev/2012/004439.html
| Maybe I should use this approach for loading vtk files?
| _______________________________
| Best regards, Ilya Romero
| Nuclear Safety Institute
| Russian Academy of Sciences
| -------- Forwarded message --------
| From: Ilya <ilacai at mail.ru>
| To: MOAB-dev at mcs.anl.gov
| Date: Friday, September 6, 2013, 8:53 +04:00
| Subject: About reading vtk in parallel mode.
| I am new to MOAB and don't have much experience with your software,
| so I apologize if my questions are naive.
| Could you please help me with the following situation? The main idea
| is to use solution coupling in my research. I found a working
| example in the MOAB package:
| {$MOAB_dir}/tools/mbcoupler/mbcoupler_test.cpp. As a first step, I
| decided simply to change the meshes in this example, and I tried to
| understand how to use meshes in parallel mode.
| 1) I had the geometry of both new meshes in vtk format: geom1.vtk,
| geom2.vtk. Using mbconvert I converted them into the vtk format
| supported by MOAB:
| ./mbconvert -f vtk geom1.vtk geom1_c.vtk
| 2) Using MBZoltan I converted the new vtk files into h5m format,
| partitioning each into 4 parts:
| ./mbpart 4 -T geom1_c.vtk geom1_p.h5m
| 3) I read geom1_p.h5m with Tecplot and checked that the
| PARALLEL_PARTITION tag exists (according to MOAB_UG.doc). It has the
| values 0,1,2,3, which is correct. So I tried 2 variants of the read
| options:
| "PARALLEL=READ_PART; PARTITION=PARALLEL_PARTITION;
| PARTITION_DISTRIBUTE;CPUTIME"
| and
| "PARALLEL=READ_PART; PARTITION=PARALLEL_PARTITION;
| PARTITION_VAL=0,1,2,3;CPUTIME"
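| These strings go straight into the load call; a stripped-down sketch
| of how I pass them (illustrative names; MPI setup omitted since it
| is done earlier in mbcoupler_test):
|
|   #include "moab/Core.hpp"
|
|   int main()
|   {
|     moab::Core mb;
|     // First of the two option strings quoted above.
|     const char* readOpts =
|         "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;"
|         "PARTITION_DISTRIBUTE;CPUTIME";
|     // READ_PART requires an MPI build; mbcoupler_test calls MPI_Init
|     // and creates a ParallelComm before loading.
|     moab::ErrorCode rval = mb.load_file("geom1_p.h5m", 0, readOpts);
|     return (moab::MB_SUCCESS == rval) ? 0 : 1;
|   }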
| 4) Then I used these meshes with mbcoupler_test:
| mpirun -as intel -np 2 -q test ./mbcoupler_test -meshes geom1_p.h5m
| geom2_p.h5m -itag vertex_field -outfile test.h5m
| But the program failed to read these meshes. The listing file
| contains the following:
| Parallel Read times:
| 0.030983 PARALLEL READ PART
| 0.030983 PARALLEL TOTAL
| Parallel Read times:
| 0.00436592 PARALLEL READ PART
| 0.00436592 PARALLEL TOTAL
| Proc 0 iface entities:
| 0 0d iface entities.
| 0 1d iface entities.
| 0 2d iface entities.
| 0 3d iface entities.
| (0 verts adj to other iface ents)
| Proc 1 iface entities:
| 0 0d iface entities.
| 0 1d iface entities.
| 0 2d iface entities.
| 0 3d iface entities.
| (0 verts adj to other iface ents)
| Failure; message:
| No tag value for root set
| Failure; message:
| No tag value for root set
| application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
| application called MPI_Abort(MPI_COMM_WORLD, 6) - process 1
| rank 1 in job 1 t60-2.parallel.ru_43452 caused collective abort of
| all ranks
| exit status of rank 1: return code 6
| 5) If I set readOpts =
| "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;PARTITION_DISTRIBUTE;PARALLEL_RESOLVE_SHARED_ENTS;PARALLEL_GHOSTS=3.0.1;CPUTIME";
| the meshes are read, but the PARALLEL_RESOLVE_SHARED_ENTS option
| gives an error:
| Parallel Read times:
| 0.198586 PARALLEL READ PART
| 0.026906 PARALLEL RESOLVE_SHARED_ENTS
| 0.00115585 PARALLEL EXCHANGE_GHOSTS
| 9.60827e-05 PARALLEL RESOLVE_SHARED_SETS
| 0.226157 PARALLEL TOTAL
| Parallel Read times:
| 0.00666714 PARALLEL READ PART
| 0.00862002 PARALLEL RESOLVE_SHARED_ENTS
| 0.00228691 PARALLEL EXCHANGE_GHOSTS
| 4.41074e-05 PARALLEL RESOLVE_SHARED_SETS
| 0.0157781 PARALLEL TOTAL
| Proc 0 iface entities:
| 96 0d iface entities.
| 154 1d iface entities.
| 60 2d iface entities.
| 0 3d iface entities.
| (96 verts adj to other iface ents)
| Proc 1 iface entities:
| 96 0d iface entities.
| 154 1d iface entities.
| 60 2d iface entities.
| 0 3d iface entities.
| (96 verts adj to other iface ents)
| application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
| application called MPI_Abort(MPI_COMM_WORLD, 6) - process 1
| Failure; message:
| No sparse tag __PARALLEL_SHARED_PROCS value for Vertex 969
| Failure; message:
| No sparse tag __PARALLEL_SHARED_PROCS value for Vertex 800
| rank 1 in job 1 t60-2.parallel.ru_41693 caused collective abort of
| all ranks
| exit status of rank 1: return code 6
| 6) Using this information I checked my h5m files with mbconvert:
| mpirun -np 2 -q test -as intel ./mbconvert -O PARALLEL=READ_PART -O
| PARTITION=PARALLEL_PARTITION -O PARALLEL_RESOLVE_SHARED_ENTS -O
| PARALLEL_GHOSTS=3.0.1 -o PARALLEL=WRITE_PART geom1_p.h5m
| geom1_last.h5m
| The result is successful:
| Read "geom1_p.h5m"
| Wrote "geom1_last.h5m"
| 7) So could you please explain how I should correctly read meshes in
| parallel, going from vtk to a partitioned h5m mesh (I would like to
| use the READ_PART method)? What am I doing wrong? (All the files
| used are attached to this letter.)
| Thank you.
| _______________________________
| Best regards, Ilya Romero
| Nuclear Safety Institute
| Russian Academy of Sciences
[Attachment: meshes0000.png (image/png): <http://lists.mcs.anl.gov/pipermail/moab-dev/attachments/20130906/d5970c7e/attachment-0001.png>]