<div dir="ltr"><div dir="ltr">On Sun, Nov 5, 2023 at 9:54 PM Gong Yujie <<a href="mailto:yc17470@connect.um.edu.mo">yc17470@connect.um.edu.mo</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div class="msg-3026526457212217057">
<div dir="ltr">
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
Dear PETSc developers,</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
<br>
</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
I'm trying to output a result data in vtk format and find that it is quite slow. Then I try to check this issue by a simple test code:</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
<br>
</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
<i> PetscCall(PetscInitialize(&argc,&argv,(char*)0,NULL));</i>
<div><i> DM dm,dmParallel,dmAux;</i></div>
<div><i> PetscBool interpolate=PETSC_TRUE;</i></div>
<div><i> PetscCall(DMPlexCreateExodusFromFile(PETSC_COMM_WORLD,"artery_plaque.exo",interpolate,&dm));</i></div>
<div><i> PetscCall(DMViewFromOptions(dm,NULL,"-dm_view"));</i></div>
<i> PetscCall(PetscFinalize());</i><br>
</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
and run with <i>./dm_test -dm_view vtk:./ksp_data/abc.vtk -log_view</i></div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
<br>
</div>
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
It took about 600s to output the mesh. I'm not sure if there is something wrong in my code or my configuration of PETSc. Could you please give me some advice on this?</div></div></div></blockquote><div><br></div><div>VTK is an ASCII format, and the mesh is not small. The file size may be causing problems on your system. What if you choose VTU instead? I now mostly use HDF5, and the utility that creates an XDMF to match it.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div class="msg-3026526457212217057"><div dir="ltr">
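P.S. Concretely, if I remember the option syntax correctly, something like the following should work (an untested sketch; the output paths are just placeholders). The viewer is chosen from the prefix and the output format from the file extension:

  # binary VTU (XML VTK) instead of the legacy ASCII .vtk writer
  ./dm_test -dm_view vtk:./ksp_data/abc.vtu -log_view

  # HDF5 output, then generate a matching XDMF file so ParaView/VisIt can read it
  ./dm_test -dm_view hdf5:./ksp_data/abc.h5 -log_view
  $PETSC_DIR/lib/petsc/bin/petsc_gen_xdmf.py ./ksp_data/abc.h5

Both write binary data, so for a mesh of this size they should be far faster than the ASCII .vtk output.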
<div style="font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif;font-size:12pt;color:rgb(0,0,0)">
Best Regards,
<div>Yujie</div>
<div><br>
</div>
<div>P.S. The result for log_view</div>
> ****************************************************************************************************************************************************************
> ***                     WIDEN YOUR WINDOW TO 160 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document                                            ***
> ****************************************************************************************************************************************************************
>
> ------------------------------------------------------------------ PETSc Performance Summary: ------------------------------------------------------------------
>
> ./dm_test on a arch-linux-c-opt named DESKTOP-0H8HCOD with 1 processor, by qingfeng Mon Nov  6 10:43:31 2023
> Using Petsc Release Version 3.19.5, unknown
>
>                          Max       Max/Min     Avg       Total
> Time (sec):           6.286e+02     1.000   6.286e+02
> Objects:              1.400e+02     1.000   1.400e+02
> Flops:                0.000e+00     0.000   0.000e+00  0.000e+00
> Flops/sec:            0.000e+00     0.000   0.000e+00  0.000e+00
> MPI Msg Count:        0.000e+00     0.000   0.000e+00  0.000e+00
> MPI Msg Len (bytes):  0.000e+00     0.000   0.000e+00  0.000e+00
> MPI Reductions:       0.000e+00     0.000
>
> Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
>                             e.g., VecAXPY() for real vectors of length N --> 2N flops
>                             and VecAXPY() for complex vectors of length N --> 8N flops
>
> Summary of Stages:   ----- Time ------  ----- Flop ------  --- Messages ---  -- Message Lengths --  -- Reductions --
>                         Avg     %Total     Avg     %Total    Count   %Total     Avg         %Total    Count   %Total
>  0:      Main Stage: 6.2859e+02 100.0%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  0.000e+00   0.0%
>
> ------------------------------------------------------------------------------------------------------------------------
> See the 'Profiling' chapter of the users' manual for details on interpreting output.
> Phase summary info:
>    Count: number of times phase was executed
>    Time and Flop: Max - maximum over all processors
>                   Ratio - ratio of maximum to minimum over all processors
>    Mess: number of messages sent
>    AvgLen: average message length (bytes)
>    Reduct: number of global reductions
>    Global: entire computation
>    Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
>       %T - percent time in this phase         %F - percent flop in this phase
>       %M - percent messages in this phase     %L - percent message lengths in this phase
>       %R - percent reductions in this phase
>    Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)
> ------------------------------------------------------------------------------------------------------------------------
> Event                Count      Time (sec)     Flop                              --- Global ---  --- Stage ----  Total
>                        Max Ratio  Max     Ratio   Max  Ratio  Mess   AvgLen  Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
>
> --- Event Stage 0: Main Stage
>
> DMPlexInterp           1 1.0 3.1186e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> DMPlexStratify         3 1.0 4.2802e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> DMPlexSymmetrize       3 1.0 1.0806e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> ------------------------------------------------------------------------------------------------------------------------
>
> Object Type          Creations   Destructions. Reports information only for process 0.
>
> --- Event Stage 0: Main Stage
>
>            Container     2              1
>     Distributed Mesh     5              3
>             DM Label    20              8
>            Index Set    64             52
>              Section    17             12
>    Star Forest Graph    10              7
>      Discrete System     7              5
>            Weak Form     7              5
>     GraphPartitioner     3              2
>               Matrix     2              1
>               Vector     1              0
>               Viewer     2              1
> ========================================================================================================================
> Average time to get PetscTime(): 1.8e-08
> #PETSc Option Table entries:
> -dm_view vtk:./ksp_data/abc.vtk # (source: command line)
> -log_view # (source: command line)
> #End of PETSc Option Table entries
> Compiled without FORTRAN kernels
> Compiled with full precision matrices (default)
> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
> Configure options: --with-debugging=0 --with-strict-petscerrorcode --download-openmpi --download-metis --download-exodusii --download-parmetis --download-netcdf --download-pnetcdf --download-hdf5 --download-zlib --download-superlu --download-superlu_dist --download-triangle --download-cmake --download-fblaslapack --download-slepc
> -----------------------------------------
> Libraries compiled on 2023-09-15 02:34:25 on DESKTOP-0H8HCOD
> Machine characteristics: Linux-5.10.16.3-microsoft-standard-WSL2-x86_64-with-glibc2.29
> Using PETSc directory: /home/qingfeng/petsc/optpetsc3-19-5/petsc
> Using PETSc arch: arch-linux-c-opt
> -----------------------------------------
>
> Using C compiler: /home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g -O
> Using Fortran compiler: /home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/bin/mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O
> -----------------------------------------
>
> Using include paths: -I/home/qingfeng/petsc/optpetsc3-19-5/petsc/include -I/home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/include
> -----------------------------------------
>
> Using C linker: /home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/bin/mpicc
> Using Fortran linker: /home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/bin/mpif90
> Using libraries: -Wl,-rpath,/home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/lib -L/home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/lib -lpetsc -Wl,-rpath,/home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/lib -L/home/qingfeng/petsc/optpetsc3-19-5/petsc/arch-linux-c-opt/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 -L/usr/lib/gcc/x86_64-linux-gnu/9 -lsuperlu -lsuperlu_dist -lflapack -lfblas -lexoIIv2for32 -lexodus -lnetcdf -lpnetcdf -lhdf5_hl -lhdf5 -lparmetis -lmetis -ltriangle -lm -lz -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
> -----------------------------------------
>
> The mesh information:
> DM Object: Created by ICEMCFD - EXODUS II Interface 1 MPI process
>   type: plex
> Created by ICEMCFD - EXODUS II Interface in 3 dimensions:
>   Number of 0-cells per rank: 134549
>   Number of 1-cells per rank: 841756
>   Number of 2-cells per rank: 1366008
>   Number of 3-cells per rank: 658801
> Labels:
>   celltype: 4 strata with value/size (0 (134549), 6 (658801), 3 (1366008), 1 (841756))
>   depth: 4 strata with value/size (0 (134549), 1 (841756), 2 (1366008), 3 (658801))
>   Cell Sets: 2 strata with value/size (1 (604426), 2 (54375))
>   Vertex Sets: 5 strata with value/size (3 (481), 4 (27248), 5 (20560), 6 (653), 7 (2370))
>   Face Sets: 5 strata with value/size (8 (740), 9 (54206), 10 (40857), 11 (999), 12 (4534))
>   SMALLER: 1 strata with value/size (8 (740))
>   OUTER: 1 strata with value/size (9 (54206))
>   INNER: 1 strata with value/size (10 (40857))
>   BIGGER: 1 strata with value/size (11 (999))

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/