[petsc-users] Can PETSc import an HDF5 mesh file in parallel?

程奔 ctchengben at mail.scut.edu.cn
Wed May 28 11:36:47 CDT 2025


Hello,

Recently I created an unstructured mesh with Gmsh, in its .msh format. However, the mesh file contains around 100 million nodes, so when I use DMPlexCreateFromFile
it is read on a single MPI process and runs out of memory.


I emailed you earlier to report this situation, and you told me that I could use an HDF5 (.h5) mesh file instead. Thanks for the advice; I have tried to generate the .h5 file.


To load the very large .msh file, I first used a single compute node (64 MPI processes) with a large amount of memory. I ran the following code on one MPI process to generate the .h5 mesh file:




-----------------------------------------------------------------------------------------------

//Set up or input the mesh (the Gmsh file is read serially)
ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "hjtest", PETSC_TRUE, &dm_nodes);CHKERRQ(ierr);

//Distribute the mesh onto the different processors
DM dm_nodes_Dist;      //DM that temporarily holds the distributed mesh
PetscPartitioner part; //partitioner used to distribute the mesh
ierr = DMPlexGetPartitioner(dm_nodes, &part);CHKERRQ(ierr);                   //get the partitioner of the mesh we just created
ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr); //set the partitioner type (overridable from the command line)
ierr = DMPlexDistribute(dm_nodes, 0, NULL, &dm_nodes_Dist);CHKERRQ(ierr);     //distribute the mesh with zero overlap
if (dm_nodes_Dist) {ierr = DMDestroy(&dm_nodes);CHKERRQ(ierr); dm_nodes = dm_nodes_Dist;} //destroy the original mesh and keep the distributed one

//Write the mesh to HDF5
PetscViewer viewer;
ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
ierr = DMView(dm_nodes, viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

ierr = DMDestroy(&dm_nodes);CHKERRQ(ierr);
-----------------------------------------------------------------------------------------------
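
From reading the DMPlex examples, I believe the write step can pin the on-disk format explicitly: PETSC_VIEWER_HDF5_PETSC is the native format that DMLoad reads back. A minimal sketch of what I mean (I am not sure whether plain DMView already writes this format by default):

-----------------------------------------------------------------------------------------------

//sketch: write the mesh in the native PETSc/HDF5 format so DMLoad can read it back in parallel
ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);CHKERRQ(ierr);
ierr = DMView(dm_nodes, viewer);CHKERRQ(ierr);
ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

-----------------------------------------------------------------------------------------------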




Having generated the large .h5 mesh, I then want to use another machine (equipped with 20 compute nodes, 1280 MPI processes, with much less memory per node) to perform the parallel computation. So I run the following code on 20 compute nodes with 1280 MPI processes:




*************************************************************************************************

ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);CHKERRQ(ierr); //native DMPlex format; note PETSC_VIEWER_HDF5 is not a valid PetscViewerFormat

ierr = DMPlexCreate(PETSC_COMM_WORLD, &dm_nodes);CHKERRQ(ierr);
ierr = DMSetType(dm_nodes, DMPLEX);CHKERRQ(ierr);

PetscPartitioner part;
ierr = DMPlexGetPartitioner(dm_nodes, &part);CHKERRQ(ierr);
ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr);

ierr = DMLoad(dm_nodes, viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "# Loaded mesh from HDF5\n");CHKERRQ(ierr);

*************************************************************************************************
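
For comparison, here is the load sequence as I understand it from the DMPlex HDF5 examples. It is only a sketch, and the two things my code above may be missing are guesses on my part: setting the DM name to match the name used at save time ("hjtest", the plexname I passed to DMPlexCreateFromFile), and redistributing after DMLoad, since as I understand it DMLoad only gives each rank a naive contiguous chunk of the file:

*************************************************************************************************

//sketch of a parallel load; assumes the mesh was saved under the name "hjtest"
DM          dm_nodes, dm_dist;
PetscViewer viewer;

ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);CHKERRQ(ierr);
ierr = DMPlexCreate(PETSC_COMM_WORLD, &dm_nodes);CHKERRQ(ierr);
ierr = DMSetType(dm_nodes, DMPLEX);CHKERRQ(ierr);
ierr = PetscObjectSetName((PetscObject)dm_nodes, "hjtest");CHKERRQ(ierr); //must match the name used when saving
ierr = DMLoad(dm_nodes, viewer);CHKERRQ(ierr);                            //each rank reads its own chunk
ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

//redistribute the naively partitioned mesh with ParMETIS
PetscPartitioner part;
ierr = DMPlexGetPartitioner(dm_nodes, &part);CHKERRQ(ierr);
ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr);
ierr = DMPlexDistribute(dm_nodes, 0, NULL, &dm_dist);CHKERRQ(ierr);
if (dm_dist) {ierr = DMDestroy(&dm_nodes);CHKERRQ(ierr); dm_nodes = dm_dist;}

*************************************************************************************************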








I don't know whether, written this way, PETSc will load the .h5 mesh file in parallel so that it no longer runs on a single MPI process and out of memory. I have tried several times, but it seems the .h5 file is always loaded on one MPI process, and I don't know whether I have made a mistake somewhere.
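
One way I can think of to check whether the load is really distributed is to print the number of cells each rank owns right after DMLoad; a small sketch (DMPlexGetHeightStratum at height 0 gives the local cell range):

-----------------------------------------------------------------------------------------------

//sketch: report the local cell count on every rank after DMLoad
PetscInt    cStart, cEnd;
PetscMPIInt rank;
ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
ierr = DMPlexGetHeightStratum(dm_nodes, 0, &cStart, &cEnd);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] local cells: %" PetscInt_FMT "\n", rank, cEnd - cStart);CHKERRQ(ierr);
ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);

-----------------------------------------------------------------------------------------------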




I am using PETSc version 3.21.6; the configure log is attached, and HDF5 was downloaded as part of the configuration.




So I am writing this email to ask for help.




Looking forward to your reply!


Sincerely,
Cheng.



-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: configure.log
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20250529/55f381f1/attachment-0001.ksh>

