[petsc-users] DMView and DMLoad
Matthew Knepley
knepley at gmail.com
Thu Apr 2 12:03:53 CDT 2015
I have fixed this bug. The fix is merged to next, and will make it to
master soon.
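For reference, the save/load round trip being exercised looks roughly like
this (a minimal sketch, not the merged test; dm is an existing DMPlex and
the file name is illustrative):

  PetscViewer viewer;
  DM          dmNew;

  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "dm.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = DMView(dm, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "dm.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = DMCreate(PETSC_COMM_WORLD, &dmNew);CHKERRQ(ierr);
  ierr = DMLoad(dmNew, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);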
Thanks,
Matt
On Tue, Mar 10, 2015 at 5:05 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Mar 10, 2015 at 5:02 PM, Ataollah Mesgarnejad <
> amesga1 at tigers.lsu.edu> wrote:
>
>> Thanks Matt,
>>
>> I changed your file to reproduce the error. If you run it on more than
>> one processor you will get something similar to:
>>
>> [0]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>>
>> [0]PETSC ERROR: Invalid argument
>>
>> [0]PETSC ERROR: Number of coordinates loaded 9 does not match number of
>> vertices 13
>>
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>>
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2186-g052b31f GIT
>> Date: 2015-03-04 11:15:59 -0600
>>
>> [0]PETSC ERROR: /work/01624/amesga/Code/dmcomplex/ex5 on a
>> sandybridge-cxx-dbg named c560-303.stampede.tacc.utexas.edu by amesga
>> Tue Mar 10 16:57:01 2015
>>
>> [0]PETSC ERROR: Configure options --COPTFLAGS= --CXXOPTFLAGS=
>> --FOPTFLAGS= --download-chaco=1 --download-exodusii=1 --download-hdf5=1
>> --download-metis=1 --download-netcdf=1 --download-parmetis=1 --with-blacs=1
>> --with-blas-lapack-dir=/opt/apps/intel/13/composer_xe_2013_sp1.1.106/mkl/lib/intel64
>> --with-clanguage=C++ --with-debugging=yes --with-hdf5=1 --with-metis=1
>> --with-valgrind=1 --download-triangle=1
>> --with-valgrind-dir=/opt/apps/valgrind/3.8.1/ --with-mpi-compilers=1
>> --with-mpi-dir=/opt/apps/intel13/impi/4.1.3.049/intel64
>> --with-scalar-type=real --with-shared-libraries=1
>> --with-vendor-compilers=intel --with-x11=1 -with-pic
>> PETSC_ARCH=sandybridge-cxx-dbg
>>
>> [0]PETSC ERROR: #1 DMPlexLoad_HDF5() line 741 in
>> /work/01624/amesga/Software/petsc/src/dm/impls/plex/plexhdf5.c
>>
>> [0]PETSC ERROR: #2 DMLoad_Plex() line 605 in
>> /work/01624/amesga/Software/petsc/src/dm/impls/plex/plex.c
>>
>> [0]PETSC ERROR: #3 DMLoad() line 2939 in
>> /work/01624/amesga/Software/petsc/src/dm/interface/dm.c
>>
>> [0]PETSC ERROR: #4 main() line 31 in
>> /work/01624/amesga/Code/dmcomplex/ex5.c
>>
>> [0]PETSC ERROR: No PETSc Option Table entries
>>
>> [0]PETSC ERROR: ----------------End of Error Message -------
>>
>>
>> It seems like the HDF5 versions of DMView and DMLoad work fine for a
>> serial DMPlex, but DMLoad breaks in parallel.
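>>
>> To reproduce, run the attached code in parallel to write and then read
>> the mesh, e.g. (the mesh file name here is a placeholder):
>>
>>   mpiexec -n 2 ./ex5 -i mesh.exo -save_files
>>   mpiexec -n 2 ./ex5 -i mesh.exo -load_files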
>>
>
> Cool, will get it fixed. I have only been loading Vecs since DMLoad()
> cannot recreate hierarchies, and thus is not
> compatible with the FAS tests I am mostly running.
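>
> For concreteness, the Vec save/load pattern with the HDF5 viewer is just
> (a sketch; the object name and file name are illustrative, and VecLoad
> looks the Vec up in the file by its object name):
>
>   /* Vec v and PetscViewer viewer declared elsewhere */
>   ierr = PetscObjectSetName((PetscObject) v, "solution");CHKERRQ(ierr);
>   ierr = PetscViewerHDF5Open(comm, "v.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
>   ierr = VecView(v, viewer);CHKERRQ(ierr);
>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>   /* ... later ... */
>   ierr = PetscViewerHDF5Open(comm, "v.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
>   ierr = VecLoad(v, viewer);CHKERRQ(ierr);
>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);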
>
> Thanks,
>
> Matt
>
>
>> Many thanks,
>>
>> Ata
>>
>> On Tue, Mar 10, 2015 at 3:15 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Mon, Mar 9, 2015 at 11:28 AM, Ataollah Mesgarnejad <
>>> amesga1 at tigers.lsu.edu> wrote:
>>>
>>>> Dear all,
>>>>
>>>> I'm trying to save and load a distributed DMPlex in HDF5 format with
>>>> DMView and DMLoad. The problem is that even though loading the DM
>>>> produces no errors, the DM loaded from the file is not equal to the
>>>> DM that was saved. Also, when loading the DM, if DMSetType is called
>>>> before DMLoad, DMLoad produces errors (like line 316
>>>> <https://bitbucket.org/petsc/petsc/src/236f5a4d227fe9d0affddb8701edb9509ad39525/src/snes/examples/tutorials/ex12.c?at=master#cl-316>
>>>> in src/snes/examples/tutorials/ex12.c).
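>>>>
>>>> Condensed, the load path that runs without errors in the attached
>>>> code is:
>>>>
>>>>   ierr = DMCreate(comm, &distDMold);CHKERRQ(ierr);
>>>>   ierr = DMLoad(distDMold, viewer);CHKERRQ(ierr);   /* no DMSetType() first */
>>>>   ierr = DMSetType(distDMold, DMPLEX);CHKERRQ(ierr);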
>>>>
>>>> I'm including a small code that replicates the problem, plus an input
>>>> mesh file.
>>>>
>>>
>>> Here is a simpler test, which I am going to make DMPlex ex5. The file
>>> you sent does not make sense
>>> to me since it has cells with only two vertices. Run it with
>>>
>>> -dm_view ::ascii_info_detail
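>>>
>>> for example (assuming the mesh is passed with -i as in your code):
>>>
>>>   mpiexec -n 2 ./ex5 -i mesh.exo -dm_view ::ascii_info_detail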
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>> Best,
>>>> Ata
>>>>
>>>> /**
>>>>  * @file
>>>>  *
>>>>  * Created by Ataollah Mesgarnejad on 3/6/15.
>>>>  *
>>>>  * This is a test to read and write a distributed DM.
>>>>  */
>>>>
>>>> static char help[] = "An example of the usage of PetscSF to scatter data back and forth between a serial DM and its parallel counterpart.\n";
>>>>
>>>> #include <petscdmplex.h>
>>>> #include <petscsnes.h>
>>>> #include <petscsf.h>
>>>> #include <exodusII.h>
>>>> #include <petsc-private/dmimpl.h> /*I "petscdm.h" I*/
>>>> #include <petscviewerhdf5.h>
>>>>
>>>> #undef __FUNCT__
>>>> #define __FUNCT__ "main"
>>>> int main(int argc, char **argv)
>>>> {
>>>>   PetscErrorCode ierr;
>>>>   DM             dm, distDM, distDMold;
>>>>   char           ifilename[PETSC_MAX_PATH_LEN];
>>>>   PetscBool      flg;
>>>>   int            CPU_word_size = 0, IO_word_size = 0, exoid;
>>>>   float          version;
>>>>   PetscMPIInt    numproc, rank;            /* MPI_Comm_size/rank expect an int-sized type */
>>>>   PetscViewer    stdoutViewer, DMPlexHDF5View;
>>>>   PetscReal      *VArray;
>>>>   PetscInt       dim;
>>>>   PetscInt       numFields   = 2;
>>>>   PetscInt       numComp[2]  = {1, 1};
>>>>   PetscInt       numDof[6]   = {1, 0, 0,
>>>>                                 0, 0, 1};  /* {Vertex, Edge, Cell} */
>>>>   PetscInt       bcFields[1] = {0}, numBC = 0;
>>>>   char           **namelist;
>>>>   PetscInt       off;
>>>>   IS             bcPoints[1] = {NULL};
>>>>   IS             *ISlist;
>>>>   PetscSection   seqSection, distSection;
>>>>   Vec            distV, V;
>>>>   PetscSF        pointSF = NULL; /*, pointSFold */ /* only created by DMPlexDistribute() below */
>>>>   PetscInt       pStart, pEnd, dof;
>>>>   PetscBool      load_files = PETSC_FALSE, save_files = PETSC_FALSE;
>>>>   MPI_Comm       comm;
>>>>
>>>>   ierr = PetscInitialize(&argc, &argv, (char*)0, help);CHKERRQ(ierr);
>>>>   ierr = PetscViewerASCIIGetStdout(PETSC_COMM_SELF, &stdoutViewer);CHKERRQ(ierr);
>>>>   comm = PETSC_COMM_WORLD;
>>>>
>>>>   ierr = PetscOptionsBegin(PETSC_COMM_WORLD, "", "PetscSF Test Options", "none");CHKERRQ(ierr);
>>>>   ierr = PetscOptionsBool("-save_files", "Save the distributed vector in HDF5 format", "", PETSC_FALSE, &save_files, NULL);CHKERRQ(ierr);
>>>>   ierr = PetscOptionsBool("-load_files", "Load the distributed vector in HDF5 format", "", PETSC_FALSE, &load_files, NULL);CHKERRQ(ierr);
>>>>   ierr = PetscOptionsEnd();CHKERRQ(ierr);
>>>>   ierr = PetscOptionsGetString(PETSC_NULL, "-i", ifilename, sizeof ifilename, &flg);CHKERRQ(ierr);
>>>>
>>>>   ierr = MPI_Comm_size(PETSC_COMM_WORLD, &numproc);CHKERRQ(ierr);
>>>>   ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
>>>>   if (!rank) {
>>>>     exoid = ex_open(ifilename, EX_READ, &CPU_word_size, &IO_word_size, &version);
>>>>     if (exoid <= 0) SETERRQ1(PETSC_COMM_SELF, PETSC_ERR_LIB, "ex_open(\"%s\",...) did not return a valid file ID", ifilename);
>>>>   } else {
>>>>     exoid = -1; /* Not used */
>>>>   }
>>>>   ierr = DMPlexCreateExodus(PETSC_COMM_WORLD, exoid, PETSC_FALSE, &dm);CHKERRQ(ierr);
>>>>   ierr = PetscObjectSetName((PetscObject) dm, "sequential-DM");CHKERRQ(ierr);
>>>>   ierr = DMGetDimension(dm, &dim);CHKERRQ(ierr);
>>>>
>>>>   if (numproc < 2) {
>>>>     distDM = dm;
>>>>     ierr = PetscObjectReference((PetscObject) dm);CHKERRQ(ierr); /* dm and distDM alias; bump the refcount so both DMDestroy() calls below are safe */
>>>>   } else {
>>>>     ierr = DMPlexDistribute(dm, 0, &pointSF, &distDM);CHKERRQ(ierr);
>>>>   }
>>>>   ierr = PetscObjectSetName((PetscObject) distDM, "Distributed-DM");CHKERRQ(ierr);
>>>>
>>>>   ierr = DMPlexCreateSection(dm, dim, numFields, numComp, numDof,
>>>>                              numBC, bcFields, bcPoints, PETSC_NULL, &seqSection);CHKERRQ(ierr);
>>>>   ierr = DMPlexCreateSection(distDM, dim, numFields, numComp, numDof,
>>>>                              numBC, bcFields, bcPoints, PETSC_NULL, &distSection);CHKERRQ(ierr);
>>>>   ierr = DMSetDefaultSection(dm, seqSection);CHKERRQ(ierr);
>>>>   ierr = DMSetDefaultSection(distDM, distSection);CHKERRQ(ierr);
>>>>
>>>>   ierr = DMCreateGlobalVector(dm, &V);CHKERRQ(ierr);
>>>>   ierr = VecCreate(comm, &distV);CHKERRQ(ierr);
>>>>   ierr = PetscObjectSetName((PetscObject) distV, "Distributed-V");CHKERRQ(ierr);
>>>>
>>>>   /* fill vertex data */
>>>>   ierr = VecGetArray(V, &VArray);CHKERRQ(ierr);
>>>>   ierr = DMPlexGetDepthStratum(dm, 0, &pStart, &pEnd);CHKERRQ(ierr);
>>>>   for (int p = pStart; p < pEnd; p++) {
>>>>     ierr = PetscSectionGetDof(seqSection, p, &dof);CHKERRQ(ierr);
>>>>     ierr = PetscSectionGetOffset(seqSection, p, &off);CHKERRQ(ierr);
>>>>     for (int d = 0; d < dof; d++) VArray[off + d] = p;
>>>>   }
>>>>   /* fill cell data */
>>>>   ierr = DMPlexGetDepthStratum(dm, 1, &pStart, &pEnd);CHKERRQ(ierr);
>>>>   for (int p = pStart; p < pEnd; p++) {
>>>>     ierr = PetscSectionGetDof(seqSection, p, &dof);CHKERRQ(ierr);
>>>>     ierr = PetscSectionGetOffset(seqSection, p, &off);CHKERRQ(ierr);
>>>>     for (int d = 0; d < dof; d++) VArray[off + d] = -p - 1;
>>>>   }
>>>>   ierr = VecRestoreArray(V, &VArray);CHKERRQ(ierr);
>>>>
>>>>   ierr = DMCreateFieldIS(dm, &numFields, &namelist, &ISlist);CHKERRQ(ierr);
>>>>   /* note: pointSF is NULL in a serial run, so this test is meant to be run in parallel */
>>>>   ierr = DMPlexDistributeField(dm, pointSF, seqSection, V, distSection, distV);CHKERRQ(ierr);
>>>>
>>>> #if defined(PETSC_HAVE_HDF5)
>>>>   if (save_files) {
>>>>     /* view, then save the distributed DM */
>>>>     ierr = PetscPrintf(PETSC_COMM_WORLD, "==== Distributed DM\n");CHKERRQ(ierr);
>>>>     ierr = DMView(distDM, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>>>>     ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
>>>>     ierr = PetscViewerHDF5Open(PetscObjectComm((PetscObject) distDM), "distDM.h5", FILE_MODE_WRITE, &DMPlexHDF5View);CHKERRQ(ierr);
>>>>     ierr = PetscPrintf(PETSC_COMM_WORLD, "Writing dist DM ...\n");CHKERRQ(ierr);
>>>>     ierr = DMView(distDM, DMPlexHDF5View);CHKERRQ(ierr);
>>>>     ierr = PetscViewerDestroy(&DMPlexHDF5View);CHKERRQ(ierr);
>>>>   } else if (load_files) {
>>>>     ierr = PetscPrintf(PETSC_COMM_WORLD, "Loading dist vectors ...\n");CHKERRQ(ierr);
>>>>     ierr = PetscViewerHDF5Open(PETSC_COMM_SELF, "distDM.h5", FILE_MODE_READ, &DMPlexHDF5View);CHKERRQ(ierr);
>>>>     ierr = DMCreate(comm, &distDMold);CHKERRQ(ierr);
>>>>     ierr = DMLoad(distDMold, DMPlexHDF5View);CHKERRQ(ierr);
>>>>     ierr = DMSetType(distDMold, DMPLEX);CHKERRQ(ierr); /* calling this before DMLoad() triggers errors, see above */
>>>>     /*ierr = DMPlexCreateSection(distDMold, dim, numFields, numComp, numDof,
>>>>                                  numBC, bcFields, bcPoints, PETSC_NULL, &distSectionold);CHKERRQ(ierr);
>>>>       ierr = DMSetDefaultSection(distDMold, distSectionold);CHKERRQ(ierr);
>>>>       ierr = DMGetPointSF(distDMold, &pointSFold);CHKERRQ(ierr);*/
>>>>     ierr = PetscViewerDestroy(&DMPlexHDF5View);CHKERRQ(ierr);
>>>>     ierr = PetscPrintf(PETSC_COMM_WORLD, "==== Read DM\n");CHKERRQ(ierr);
>>>>     ierr = DMView(distDMold, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>>>>     ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
>>>>     ierr = DMPlexEqual(distDM, distDMold, &flg);CHKERRQ(ierr);
>>>>     if (flg) {
>>>>       ierr = PetscPrintf(PETSC_COMM_WORLD, "\n DMs equal\n");CHKERRQ(ierr);
>>>>     } else {
>>>>       ierr = PetscPrintf(PETSC_COMM_WORLD, "\n DMs are not equal\n\n");CHKERRQ(ierr);
>>>>     }
>>>>   }
>>>> #endif
>>>>
>>>>   ierr = VecDestroy(&V);CHKERRQ(ierr);
>>>>   ierr = VecDestroy(&distV);CHKERRQ(ierr);
>>>>   ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>>>   ierr = DMDestroy(&distDM);CHKERRQ(ierr);
>>>>   ierr = PetscFinalize();
>>>>   return 0;
>>>> }
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener