[petsc-dev] DMPlex VecView to HDF5

Matthew Knepley knepley at gmail.com
Mon Feb 19 17:32:57 CST 2018


On Mon, Feb 19, 2018 at 4:33 PM, Adrian Croucher <a.croucher at auckland.ac.nz>
wrote:

> hi Matt
>
> I tried what you suggested and modified DMPlexGetFieldType_Internal() so
> that all processes have the same field type (ft):
>
Yep, this is going to require specifying in the interface that it is
collective. It looks like it might currently be used as if it were not. I will
get to it as soon as I can.
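
In the meantime, here is a minimal sketch of the caller-side contract that
collectivity implies (the field loop and names below are hypothetical, not
the actual viewer code): every rank in the DM's communicator must enter the
call for every field, even a rank whose local section is empty, otherwise
the MPI_Allreduce inside cannot match up.

PetscInt                nf, f, sStart, sEnd;
PetscViewerVTKFieldType ft;
PetscErrorCode          ierr;

ierr = PetscSectionGetNumFields(section, &nf);CHKERRQ(ierr);
for (f = 0; f < nf; ++f) {
  /* Collective: no rank may skip this call based on local data (e.g. pEnd == 0) */
  ierr = DMPlexGetFieldType_Internal(dm, section, f, &sStart, &sEnd, &ft);CHKERRQ(ierr);
  /* ... write this field using sStart, sEnd and ft ... */
}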

  Thanks,

     Matt

> PetscErrorCode DMPlexGetFieldType_Internal(DM dm, PetscSection section, PetscInt field, PetscInt *sStart, PetscInt *sEnd, PetscViewerVTKFieldType *ft)
> {
>   PetscInt       dim, pStart, pEnd, vStart, vEnd, cStart, cEnd, cEndInterior;
>   PetscInt       vcdof[2] = {0,0}, globalvcdof[2];
>   PetscErrorCode ierr;
>
>   PetscFunctionBegin;
>   *ft  = PETSC_VTK_POINT_FIELD;
>   ierr = DMGetDimension(dm, &dim);CHKERRQ(ierr);
>   ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr);
>   ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
>   ierr = DMPlexGetHybridBounds(dm, &cEndInterior, NULL, NULL, NULL);CHKERRQ(ierr);
>   cEnd = cEndInterior < 0 ? cEnd : cEndInterior;
>   ierr = PetscSectionGetChart(section, &pStart, &pEnd);CHKERRQ(ierr);
>   if (field >= 0) {
>     if ((vStart >= pStart) && (vStart < pEnd)) {ierr = PetscSectionGetFieldDof(section, vStart, field, &vcdof[0]);CHKERRQ(ierr);}
>     if ((cStart >= pStart) && (cStart < pEnd)) {ierr = PetscSectionGetFieldDof(section, cStart, field, &vcdof[1]);CHKERRQ(ierr);}
>   } else {
>     if ((vStart >= pStart) && (vStart < pEnd)) {ierr = PetscSectionGetDof(section, vStart, &vcdof[0]);CHKERRQ(ierr);}
>     if ((cStart >= pStart) && (cStart < pEnd)) {ierr = PetscSectionGetDof(section, cStart, &vcdof[1]);CHKERRQ(ierr);}
>   }
>   ierr = MPI_Allreduce(vcdof, globalvcdof, 2, MPIU_INT, MPI_MAX, PetscObjectComm((PetscObject) dm));CHKERRQ(ierr);
>   if (globalvcdof[0]) {
>     *sStart = vStart;
>     *sEnd   = vEnd;
>     if (globalvcdof[0] == dim) *ft = PETSC_VTK_POINT_VECTOR_FIELD;
>     else                       *ft = PETSC_VTK_POINT_FIELD;
>   } else if (globalvcdof[1]) {
>     *sStart = cStart;
>     *sEnd   = cEnd;
>     if (globalvcdof[1] == dim) *ft = PETSC_VTK_CELL_VECTOR_FIELD;
>     else                       *ft = PETSC_VTK_CELL_FIELD;
>   } else SETERRQ(PetscObjectComm((PetscObject) dm), PETSC_ERR_ARG_WRONG, "Could not classify input Vec for VTK");
>   PetscFunctionReturn(0);
> }
>
> This succeeded in preventing the error I was originally getting. However,
> it just crashes further down the line. The error message is below; any
> suggestions on what else might need changing to fix this?
>
> Cheers, Adrian
>
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: MPI_Allreduce() called in different locations (code lines) on different processors
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2236-gcb4b83bf76  GIT Date: 2018-02-13 12:48:33 -0700
> [0]PETSC ERROR: waiwera on a linux-gnu-c-opt named en-354401 by acro018 Tue Feb 20 10:18:55 2018
> [0]PETSC ERROR: Configure options --with-x --download-hdf5 --download-netcdf --download-pnetcdf --download-exodusii --download-triangle --download-ptscotch --download-chaco --download-hypre
> [0]PETSC ERROR: #1 VecSetBlockSize() line 1351 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #2 VecSetBlockSize() line 1351 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #3 VecCreateMPIWithArray() line 610 in /home/acro018/software/PETSc/code/src/vec/vec/impls/mpi/pbvec.c
> [0]PETSC ERROR: #4 VecGetSubVector() line 1301 in /home/acro018/software/PETSc/code/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: #5 PetscSectionGetField_Internal() line 276 in /home/acro018/software/PETSc/code/src/vec/vec/utils/vsection.c
> [0]PETSC ERROR: #6 VecView_Plex_Local_HDF5_Internal() line 169 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [0]PETSC ERROR: #7 VecView_Plex_Local() line 306 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: #8 VecView_Plex_HDF5_Internal() line 214 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [0]PETSC ERROR: #9 VecView_Plex() line 354 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: #10 VecView() line 586 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #11 User provided function() line 0 in User file
>
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Petsc has generated inconsistent data
> [1]PETSC ERROR: MPI_Allreduce() called in different locations (code lines) on different processors
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2236-gcb4b83bf76  GIT Date: 2018-02-13 12:48:33 -0700
> [1]PETSC ERROR: waiwera on a linux-gnu-c-opt named en-354401 by acro018 Tue Feb 20 10:18:55 2018
> [1]PETSC ERROR: Configure options --with-x --download-hdf5 --download-netcdf --download-pnetcdf --download-exodusii --download-triangle --download-ptscotch --download-chaco --download-hypre
> [1]PETSC ERROR: #1 PetscSplitOwnership() line 88 in /home/acro018/software/PETSc/code/src/sys/utils/psplit.c
> [1]PETSC ERROR: #2 PetscSplitOwnership() line 88 in /home/acro018/software/PETSc/code/src/sys/utils/psplit.c
> [1]PETSC ERROR: #3 PetscLayoutSetUp() line 137 in /home/acro018/software/PETSc/code/src/vec/is/utils/pmap.c
> [1]PETSC ERROR: #4 VecCreate_MPI_Private() line 489 in /home/acro018/software/PETSc/code/src/vec/vec/impls/mpi/pbvec.c
> [1]PETSC ERROR: #5 VecCreateMPIWithArray() line 611 in /home/acro018/software/PETSc/code/src/vec/vec/impls/mpi/pbvec.c
> [1]PETSC ERROR: #6 VecGetSubVector() line 1301 in /home/acro018/software/PETSc/code/src/vec/vec/interface/rvector.c
> [1]PETSC ERROR: #7 PetscSectionGetField_Internal() line 276 in /home/acro018/software/PETSc/code/src/vec/vec/utils/vsection.c
> [1]PETSC ERROR: #8 VecView_Plex_Local_HDF5_Internal() line 169 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [1]PETSC ERROR: #9 VecView_Plex_Local() line 306 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [1]PETSC ERROR: #10 VecView_Plex_HDF5_Internal() line 214 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [1]PETSC ERROR: #11 VecView_Plex() line 354 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [1]PETSC ERROR: #12 VecView() line 586 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [1]PETSC ERROR: #13 User provided function() line 0 in User file
>
> [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [2]PETSC ERROR: Petsc has generated inconsistent data
> [2]PETSC ERROR: MPI_Allreduce() called in different locations (code lines) on different processors
> [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [2]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2236-gcb4b83bf76  GIT Date: 2018-02-13 12:48:33 -0700
> [2]PETSC ERROR: waiwera on a linux-gnu-c-opt named en-354401 by acro018 Tue Feb 20 10:18:55 2018
> [2]PETSC ERROR: Configure options --with-x --download-hdf5 --download-netcdf --download-pnetcdf --download-exodusii --download-triangle --download-ptscotch --download-chaco --download-hypre
> [2]PETSC ERROR: #1 PetscSplitOwnership() line 88 in /home/acro018/software/PETSc/code/src/sys/utils/psplit.c
> [2]PETSC ERROR: #2 PetscSplitOwnership() line 88 in /home/acro018/software/PETSc/code/src/sys/utils/psplit.c
> [2]PETSC ERROR: #3 PetscLayoutSetUp() line 137 in /home/acro018/software/PETSc/code/src/vec/is/utils/pmap.c
> [2]PETSC ERROR: #4 VecCreate_MPI_Private() line 489 in /home/acro018/software/PETSc/code/src/vec/vec/impls/mpi/pbvec.c
> [2]PETSC ERROR: #5 VecCreateMPIWithArray() line 611 in /home/acro018/software/PETSc/code/src/vec/vec/impls/mpi/pbvec.c
> [2]PETSC ERROR: #6 VecGetSubVector() line 1301 in /home/acro018/software/PETSc/code/src/vec/vec/interface/rvector.c
> [2]PETSC ERROR: #7 PetscSectionGetField_Internal() line 276 in /home/acro018/software/PETSc/code/src/vec/vec/utils/vsection.c
> [2]PETSC ERROR: #8 VecView_Plex_Local_HDF5_Internal() line 169 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [2]PETSC ERROR: #9 VecView_Plex_Local() line 306 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [2]PETSC ERROR: #10 VecView_Plex_HDF5_Internal() line 214 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plexhdf5.c
> [2]PETSC ERROR: #11 VecView_Plex() line 354 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [2]PETSC ERROR: #12 VecView() line 586 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [2]PETSC ERROR: #13 User provided function() line 0 in User file
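
A note on the traces above: rank 0 reaches the consistency check from
VecSetBlockSize() (via line 610 of VecCreateMPIWithArray()), while ranks 1
and 2 reach it from PetscSplitOwnership() (via line 611), so the ranks
appear to take different branches inside VecCreateMPIWithArray() before
meeting different collectives. A toy sketch of that failure pattern, with
all variable names hypothetical:

/* Sketch of what triggers "MPI_Allreduce() called in different locations":
   a collective reached only on some ranks because it sits behind a
   rank-dependent test. In a debug build, PETSc's MPIU_Allreduce() checks
   that all ranks call it from the same source line. */
if (bs > 0) {   /* bs differs across ranks, e.g. 0 on an empty rank */
  ierr = MPIU_Allreduce(&bs, &gbs, 1, MPIU_INT, MPI_MAX, comm);CHKERRQ(ierr);
}
ierr = MPIU_Allreduce(&n, &N, 1, MPIU_INT, MPI_SUM, comm);CHKERRQ(ierr);
/* Empty ranks pair their SUM reduction with the other ranks' guarded MAX:
   the debug check then aborts with the line-number mismatch seen above. */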
>
>
> On 13/02/18 17:13, Matthew Knepley wrote:
>
> On Mon, Feb 12, 2018 at 10:57 PM, Adrian Croucher <
> a.croucher at auckland.ac.nz> wrote:
>
>> hi,
>>
>> I have a DMPlex which I pass in to DMCreateGlobalVector() to create a
>> vector, and then use VecView() to view the vector.
>>
>> If the DM happens to contain no points on one process, this still works
>> fine if I view the vector using PETSC_VIEWER_STDOUT_WORLD.
>>
>> However if I create an HDF5 viewer using PetscViewerHDF5Open() and try to
>> view the vector using that, it crashes with the error message "Could not
>> classify input Vec for VTK".
>>
>> It goes into VecView_Plex_Local_HDF5_Internal(), which calls
>> DMPlexGetFieldType_Internal() (plex.c:93). This has some conditionals
>> testing if vStart < pEnd or cStart < pEnd, which are never satisfied
>> because pEnd = 0. This is what leads to it raising the error.
>>
>> This function returns the point range sStart, sEnd and field type ft. To
>> handle the case where there are no points on the process, should it perhaps
>> just return sStart, sEnd = 0,0 and a default (or maybe null) value for ft?
>
>
> Crap, this is a problem. I will have to put some parallel code in there
> because everyone must agree on the ft.
>
>    Matt
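>
For concreteness, a standalone toy program (plain MPI, not the actual PETSc
code) showing why the MPI_MAX reduction in the patch above lets every rank
agree on the classification even when one rank owns no points:

#include <mpi.h>
#include <stdio.h>

/* Rank 0 owns a vertex field with 3 dofs per vertex; rank 1 owns no
   points, so its local counts are {0,0}. MPI_MAX gives every rank the
   same {3,0}, so every rank classifies the field as a point field. */
int main(int argc, char **argv)
{
  int rank, vcdof[2] = {0, 0}, globalvcdof[2];

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  if (rank == 0) vcdof[0] = 3;  /* vertex dofs exist only on rank 0 */
  MPI_Allreduce(vcdof, globalvcdof, 2, MPI_INT, MPI_MAX, MPI_COMM_WORLD);
  printf("[%d] globalvcdof = {%d, %d}\n", rank, globalvcdof[0], globalvcdof[1]);
  MPI_Finalize();
  return 0;
}

Run with two ranks, both print {3, 0}, so both take the point-field branch.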
>
>
>
> --
> Dr Adrian Croucher
> Senior Research Fellow
> Department of Engineering Science
> University of Auckland, New Zealand
> email: a.croucher at auckland.ac.nz
> tel: +64 (0)9 923 4611
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

