[petsc-users] Argument out of range error only in certain mpi sizes

Smith, Barry F. bsmith at mcs.anl.gov
Wed Apr 10 22:09:21 CDT 2019


   Sajid,

      By default when you save/load vectors from DMDA to HDF5 files it 

1) converts them to the natural ordering in the file (inside PETSc programs they are numbered by process; see the discussion of DMDA orderings in the users manual), and

2) stores them as a 2d array in the HDF5 file (as you discovered).

This makes it easy for other programs, such as visualization tools, to read the 2d HDF5 arrays and process them directly as 2d arrays, and the file has the same ordering no matter how many MPI processes PETSc used to write it.
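
   For reference, here is a minimal sketch of writing and reading a DMDA global vector through the HDF5 viewer (this assumes PETSc was configured with HDF5; the file name "u.h5", the dataset name "u", and the 64 x 64 grid are just placeholders):

#include <petscdmda.h>
#include <petscviewerhdf5.h>

int main(int argc, char **argv)
{
  DM             da;
  Vec            u;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* 2d DMDA with dof = 1; the 64 x 64 grid is just an illustrative size */
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da, &u);CHKERRQ(ierr);
  ierr = PetscObjectSetName((PetscObject)u, "u");CHKERRQ(ierr); /* becomes the HDF5 dataset name */
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);

  /* Write: the dataset is stored in natural ordering as a 2d array,
     with a trailing dimension of 2 when PETSc uses complex scalars */
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "u.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = VecView(u, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Read it back; the file layout is independent of the number of MPI ranks */
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "u.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = VecLoad(u, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}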

   Barry


> On Apr 10, 2019, at 6:12 PM, Sajid Ali <sajidsyed2021 at u.northwestern.edu> wrote:
> 
> 
> 
> Thanks a lot for the advice Matt and Barry. 
> 
> One thing I wanted to confirm: when I switch from a regular Vec to a Vec created with DMDACreateGlobalVector and fill it with data from HDF5, I have to change the dimensions of the HDF5 datasets from (dim_x*dim_y) to (dim_x, dim_y), right?
> 
> I ask because if I write a complex vector created with DMDA to HDF5, I get a dataset with dimensions (dim_x, dim_y, 2), whereas before the same data had dimensions (dim_x*dim_y, 2).
> 
> -- 
> Sajid Ali
> Applied Physics
> Northwestern University
