[petsc-users] Can PETSc support parallelly import HDF5 mesh file?
Matthew Knepley
knepley at gmail.com
Wed May 28 11:56:03 CDT 2025
On Wed, May 28, 2025 at 12:38 PM 程奔 via petsc-users <petsc-users at mcs.anl.gov>
wrote:
> Hello,
>
> Recently I created an unstructured mesh from Gmsh, in the .msh format.
> However, the mesh file contains around 100 million nodes, so when I use
> DMPlexCreateFromFile
> <https://petsc.org/release/manualpages/DMPlex/DMPlexCreateFromFile/>
> it runs on only a single CPU process and runs out of memory.
>
> I emailed you earlier to report this situation, and you told me that I
> could use an .h5 mesh file. Thanks for your advice; I tried to generate
> the .h5 file.
>
> In order to load the very large .msh file, I first use a single compute
> node (64 CPU processes) that has a large amount of memory. I use just one
> CPU process to run the following code to generate the .h5 mesh file:
>
>
> -----------------------------------------------------------------------------------------------
>
>
> // Set up or input the mesh
> ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "hjtest", PETSC_TRUE, &dm_nodes);CHKERRQ(ierr);
>
> // Distribute the mesh onto the different processes
> DM dm_nodes_Dist;      // DM that temporarily holds the distributed mesh
> PetscPartitioner part; // for distributing the mesh
> ierr = DMPlexGetPartitioner(dm_nodes, &part);CHKERRQ(ierr);                   // get the partitioner of the mesh we just created
> ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr); // set the partitioner type
> ierr = DMPlexDistribute(dm_nodes, 0, NULL, &dm_nodes_Dist);CHKERRQ(ierr);     // distribute the mesh onto the different processes
> if (dm_nodes_Dist) {DMDestroy(&dm_nodes); dm_nodes = dm_nodes_Dist;}          // drop the original mesh and use the distributed one
>
This distribution is unnecessary.
>
> PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer);
> DMView(dm_nodes, viewer);
> PetscViewerDestroy(&viewer);
>
It would be easier to use
  PetscCall(DMViewFromOptions(dm_nodes, NULL, "-dm_view"));
so that we can easily customize things from the command line. Here you will
need to write the file in an updated storage format:
  -dm_view hdf5:mesh.h5
  -dm_plex_view_hdf5_storage_version 3.1.0
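Putting those two pieces together, a minimal conversion program could look
roughly like the sketch below (the plex name "mesh" and the executable name
are just placeholders, not anything PETSc requires):

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM dm;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    /* Read the Gmsh file; "mesh" is a placeholder plex name */
    PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "mesh", PETSC_TRUE, &dm));
    /* Write wherever -dm_view points; no DMPlexDistribute() is needed just to convert */
    PetscCall(DMViewFromOptions(dm, NULL, "-dm_view"));
    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }

run on the large-memory node as, say,

  ./convert_mesh -dm_view hdf5:mesh.h5 -dm_plex_view_hdf5_storage_version 3.1.0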
>
> DMDestroy(&dm_nodes);
>
> -----------------------------------------------------------------------------------------------
>
>
> I got the large .h5 mesh, and now I want to use another computer (equipped
> with 20 compute nodes, 1280 CPU processes, with less memory per node) to
> perform the parallel computation. So I run the following code on 20 compute
> nodes (1280 CPU processes):
>
>
>
> *************************************************************************************************
>
> ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
> ierr = PetscViewerSetFormat(viewer, PETSC_VIEWER_HDF5);CHKERRQ(ierr);
>
> ierr = DMPlexCreate(PETSC_COMM_WORLD, &dm_nodes);CHKERRQ(ierr);
> ierr = DMSetType(dm_nodes, DMPLEX);CHKERRQ(ierr);
>
> PetscPartitioner part;
> ierr = DMPlexGetPartitioner(dm_nodes, &part);CHKERRQ(ierr);
> ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr);
>
> ierr = DMLoad(dm_nodes, viewer);CHKERRQ(ierr);
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>
I think it is simpler to just use
PetscCall(DMCreate(comm, &dm));
PetscCall(DMSetType(dm, DMPLEX));
PetscCall(DMSetFromOptions(dm));
and then use
-dm_plex_filename mesh.h5
to load it.
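A minimal loader along those lines could be sketched as follows (only
DMCreate/DMSetType/DMSetFromOptions and -dm_plex_filename are the essential
parts; the rest is placeholder scaffolding):

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM dm;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
    PetscCall(DMSetType(dm, DMPLEX));
    /* Picks up -dm_plex_filename, distribution, and partitioner options from the command line */
    PetscCall(DMSetFromOptions(dm));
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "# Loaded mesh from HDF5\n"));
    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }

and run with something like

  mpiexec -n 1280 ./my_app -dm_plex_filename mesh.h5 -petscpartitioner_type parmetis

Here -petscpartitioner_type parmetis replaces the hard-coded
PetscPartitionerSetType() call; if the mesh does not end up distributed,
check -dm_distribute against the defaults of your PETSc version.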
Thanks,
Matt
> PetscPrintf(PETSC_COMM_WORLD,"# Loaded mesh from HDF5\n");
>
>
> *************************************************************************************************
>
>
>
> I don't know whether, in this way, PETSc can load the .h5 mesh file in
> parallel so that it does not run on only a single CPU process and run out
> of memory. I have tried several times, but it seems that it always loads
> the .h5 file on one CPU process; I don't know if I have made a mistake
> somewhere.
>
>
> I am using PETSc version 3.21.6 and my configure options are attached; I
> did download HDF5.
>
>
> So I am writing this email to ask for help.
>
>
> Looking forward to your reply!
>
> Sincerely,
> Cheng.
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/