[MOAB-dev] Fwd: About reading vtk in parallel mode.
Ilya
ilacai at mail.ru
Fri Sep 6 00:01:21 CDT 2013
I found some information about MBZoltan:
http://lists.mcs.anl.gov/pipermail/moab-dev/2012/004439.html
Maybe I should use this approach for loading vtk files?
_______________________________
Best regards, Ilya Romero
Nuclear Safety Institute
Russian Academy of Sciences
-------- Forwarded message --------
From: Ilya <ilacai at mail.ru>
To: MOAB-dev at mcs.anl.gov
Date: Friday, 6 September 2013, 8:53 +04:00
Subject: About reading vtk in parallel mode.
I am new to MOAB and do not yet have much experience with your software, so I apologize if some of my questions are naive.
Could you please help me with the following situation? The main goal is to use solution coupling in my research. I found a working example in the MOAB package, {$MOAB_dir}/tools/mbcoupler/mbcoupler_test.cpp, so as a first step I decided simply to substitute my own meshes into this example, and I tried to understand how to use meshes in parallel mode.
1) I had the geometry of both new meshes in vtk format: geom1.vtk, geom2.vtk. Using mbconvert I converted them into the VTK format supported by MOAB:
./mbconvert -f vtk geom1.vtk geom1_c.vtk
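(As I understand it, this step is equivalent to the following through the MOAB API; my own sketch, with the writer chosen from the .vtk extension, error handling shortened:)

    #include "moab/Core.hpp"

    int main()
    {
      moab::Core mb;
      // Read the original file and rewrite it; MOAB picks the VTK writer
      // from the .vtk extension, as mbconvert -f vtk does.
      if (moab::MB_SUCCESS != mb.load_file("geom1.vtk")) return 1;
      if (moab::MB_SUCCESS != mb.write_file("geom1_c.vtk")) return 1;
      return 0;
    }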
2) Using MBZoltan (mbpart) I converted the new vtk files to h5m format, dividing each into 4 parts:
./mbpart 4 -T geom1_c.vtk geom1_p.h5m
3) I opened geom1_p.h5m in Tecplot and checked that the PARALLEL_PARTITION tag exists (following MOAB_UG.doc). It has the values 0, 1, 2, 3, which is correct. So I tried two variants of read options (see the load_file sketch after them):
"PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;PARTITION_DISTRIBUTE;CPUTIME"
and
"PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;PARTITION_VAL=0,1,2,3;CPUTIME"
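In my test code I pass these option strings directly to load_file; a minimal sketch of what I do (file name and options as above, error handling shortened):

    #include "moab/Core.hpp"
    #include <mpi.h>
    #include <iostream>

    int main(int argc, char** argv)
    {
      MPI_Init(&argc, &argv);

      moab::Core mb;
      const char* opts =
          "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;"
          "PARTITION_DISTRIBUTE;CPUTIME";

      // Each rank reads only its own parts of the partitioned file.
      moab::ErrorCode rval = mb.load_file("geom1_p.h5m", 0, opts);
      if (moab::MB_SUCCESS != rval)
        std::cerr << "load_file failed" << std::endl;

      MPI_Finalize();
      return 0;
    }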
4) Then I used these meshes with mbcoupler_test
mpirun -as intel -np 2 -q test ./mbcoupler_test -meshes geom1_p.h5m geom2_p.h5m -itag vertex_field -outfile test.h5m
But the program could not read these meshes. The listing file contains the following information:
Parallel Read times:
0.030983 PARALLEL READ PART
0.030983 PARALLEL TOTAL
Parallel Read times:
0.00436592 PARALLEL READ PART
0.00436592 PARALLEL TOTAL
Proc 0 iface entities:
0 0d iface entities.
0 1d iface entities.
0 2d iface entities.
0 3d iface entities.
(0 verts adj to other iface ents)
Proc 1 iface entities:
0 0d iface entities.
0 1d iface entities.
0 2d iface entities.
0 3d iface entities.
(0 verts adj to other iface ents)
Failure; message:
No tag value for root set
Failure; message:
No tag value for root set
application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
application called MPI_Abort(MPI_COMM_WORLD, 6) - process 1
rank 1 in job 1 t60-2.parallel.ru_43452 caused collective abort of all ranks
exit status of rank 1: return code 6
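To understand the "No tag value for root set" message, I also checked from the API (in serial) whether the partition sets and their PARALLEL_PARTITION values are actually present in the file; my own sketch, error handling shortened:

    #include "moab/Core.hpp"
    #include "moab/Range.hpp"
    #include <iostream>

    int main()
    {
      moab::Core mb;
      if (moab::MB_SUCCESS != mb.load_file("geom1_p.h5m")) return 1; // serial read

      // Look up the partition tag that the READ_PART reader expects.
      moab::Tag ptag;
      if (moab::MB_SUCCESS !=
          mb.tag_get_handle("PARALLEL_PARTITION", 1, moab::MB_TYPE_INTEGER, ptag)) {
        std::cerr << "no PARALLEL_PARTITION tag in file" << std::endl;
        return 1;
      }

      // Count the entity sets carrying that tag (should be 4 for a 4-part mesh).
      moab::Range psets;
      mb.get_entities_by_type_and_tag(0, moab::MBENTITYSET, &ptag, 0, 1, psets);
      std::cout << psets.size() << " partition sets found" << std::endl;
      return 0;
    }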
5) If I set readOpts = "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;PARTITION_DISTRIBUTE;PARALLEL_RESOLVE_SHARED_ENTS;PARALLEL_GHOSTS=3.0.1;CPUTIME"; the meshes are read, but the PARALLEL_RESOLVE_SHARED_ENTS option produces an error:
Parallel Read times:
0.198586 PARALLEL READ PART
0.026906 PARALLEL RESOLVE_SHARED_ENTS
0.00115585 PARALLEL EXCHANGE_GHOSTS
9.60827e-05 PARALLEL RESOLVE_SHARED_SETS
0.226157 PARALLEL TOTAL
Parallel Read times:
0.00666714 PARALLEL READ PART
0.00862002 PARALLEL RESOLVE_SHARED_ENTS
0.00228691 PARALLEL EXCHANGE_GHOSTS
4.41074e-05 PARALLEL RESOLVE_SHARED_SETS
0.0157781 PARALLEL TOTAL
Proc 0 iface entities:
96 0d iface entities.
154 1d iface entities.
60 2d iface entities.
0 3d iface entities.
(96 verts adj to other iface ents)
Proc 1 iface entities:
96 0d iface entities.
154 1d iface entities.
60 2d iface entities.
0 3d iface entities.
(96 verts adj to other iface ents)
application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
application called MPI_Abort(MPI_COMM_WORLD, 6) - process 1
Failure; message:
No sparse tag __PARALLEL_SHARED_PROCS value for Vertex 969
Failure; message:
No sparse tag __PARALLEL_SHARED_PROCS value for Vertex 800
rank 1 in job 1 t60-2.parallel.ru_41693 caused collective abort of all ranks
exit status of rank 1: return code 6
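I also tried to look at the shared entities directly after the resolve step (a sketch; I assume the reader registers its ParallelComm under index 0, and that -1 means "any processor" / "any dimension" in get_shared_entities):

    #include "moab/Core.hpp"
    #include "moab/ParallelComm.hpp"
    #include <mpi.h>
    #include <iostream>

    int main(int argc, char** argv)
    {
      MPI_Init(&argc, &argv);

      moab::Core mb;
      const char* opts =
          "PARALLEL=READ_PART;PARTITION=PARALLEL_PARTITION;"
          "PARTITION_DISTRIBUTE;PARALLEL_RESOLVE_SHARED_ENTS;"
          "PARALLEL_GHOSTS=3.0.1;CPUTIME";
      if (moab::MB_SUCCESS != mb.load_file("geom1_p.h5m", 0, opts))
        MPI_Abort(MPI_COMM_WORLD, 1);

      // The ParallelComm created by the reader is registered on the instance.
      moab::ParallelComm* pcomm = moab::ParallelComm::get_pcomm(&mb, 0);
      if (!pcomm)
        MPI_Abort(MPI_COMM_WORLD, 1);

      // Entities shared with any other processor (-1), any dimension (-1).
      moab::Range shared;
      pcomm->get_shared_entities(-1, shared, -1);
      std::cout << "Proc " << pcomm->proc_config().proc_rank() << ": "
                << shared.size() << " shared entities" << std::endl;

      MPI_Finalize();
      return 0;
    }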
6) Using this information, I checked my h5m files with mbconvert:
mpirun -np 2 -q test -as intel ./mbconvert -O PARALLEL=READ_PART -O PARTITION=PARALLEL_PARTITION -O PARALLEL_RESOLVE_SHARED_ENTS -O PARALLEL_GHOSTS=3.0.1 -o PARALLEL=WRITE_PART geom1_p.h5m geom1_last.h5m
The result is successful:
Read "geom1_p.h5m"
Wrote "geom1_last.h5m" 6) So could you please explain how I should correct read meshes in parallel? Form vtk to h5m parallel mesh (I would like to use READ_PART method ). What do I do wrong? (All using files are applied with this letter)
Thank you.
_______________________________
Best regards, Ilya Romero
Nuclear Safety Institute
Russian Academy of Sciences
[Attachments: geom1_c.vtk, mbcoupler_test.cpp, geom1_p.h5m, geom2_c.vtk, geom2_p.h5m, mbcoupler_test.lising]