[petsc-users] Using PFFT within PETSc

Barry Smith bsmith at mcs.anl.gov
Tue Nov 24 10:41:28 CST 2015


  I don't understand exactly what you mean, but the issue likely comes from the fact that DMDA vectors are automatically saved to disk in the natural ordering (unlike other vectors you may create), independent of the parallel layout of the vector.
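
  For what it is worth, you can perform this mapping explicitly yourself. A minimal sketch (here "da" and "u" stand for your DMDA and its global vector, error checking omitted):

    Vec unat;
    /* create a vector laid out in the natural (single-process lexicographic) ordering */
    DMDACreateNaturalVector(da,&unat);
    /* scatter the distributed global vector into natural ordering; this is
       essentially what VecView() does for DMDA vectors before writing to disk */
    DMDAGlobalToNaturalBegin(da,u,INSERT_VALUES,unat);
    DMDAGlobalToNaturalEnd(da,u,INSERT_VALUES,unat);
    /* ... view or inspect unat ... */
    VecDestroy(&unat);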

   Barry

> On Nov 24, 2015, at 8:00 AM, Giuseppe Pitton <gpitton at sissa.it> wrote:
> 
> Thanks Barry. Indeed, the problem is the different vector distribution between PETSc and FFTW/PFFT.
> This, however, still leaves open the issue of how to deal with an array coming from FFTW.
> In the attached code, the vectors u and uy are saved correctly in HDF5, but for some reason the vector ux is not (at least not in parallel). I cannot find the error; to me, the three vectors look as if they are written in exactly the same way.
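> 
> To illustrate what I mean by dealing with an array coming from FFTW: one possibility is wrapping the locally owned block in a PETSc vector. A sketch (not the attached code; "data" is the PFFT-allocated local array, "local_ni" the local input size from pfft_local_size_dft(), and PetscScalar is assumed to match the array's element type):
> 
>   /* wrap the PFFT-distributed local block in a PETSc Vec without copying;
>      whether this layout matches the DMDA's is exactly the open question */
>   Vec uwrap;
>   PetscInt nlocal = (PetscInt)(local_ni[0]*local_ni[1]);
>   VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nlocal,PETSC_DECIDE,
>                         (const PetscScalar*)data,&uwrap);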
> 
> Giuseppe
> 
> 
> 
> On 11/23/2015 11:03 PM, Barry Smith wrote:
>>    Your issues are likely due to a difference in how PETSc and PFFT think the "vectors" are laid out across processes.
>> 
>> You seem to assume that DMDACreate2d() and pfft_create_procmesh_1d() will make the same decisions about parallel layout. You print some information from the PFFT side:
>> 
>>   pfft_init();
>>   pfft_create_procmesh_1d(PETSC_COMM_WORLD,ncpus,&comm_cart_1d);
>>   alloc_local = pfft_local_size_dft(2,n,comm_cart_1d,PFFT_TRANSPOSED_NONE,local_no,local_o_start,local_ni,local_i_start);
>> 
>>   PetscPrintf(PETSC_COMM_SELF,"alloc_local: %d\n",(PetscInt)alloc_local);
>>   PetscPrintf(PETSC_COMM_SELF,"local_no: %d %d\n",(PetscInt)local_no[0],(PetscInt)local_no[1]);
>>   PetscPrintf(PETSC_COMM_SELF,"local_ni: %d %d\n",(PetscInt)local_ni[0],(PetscInt)local_ni[1]);
>>   PetscPrintf(PETSC_COMM_SELF,"local_o_start: %d %d\n",(PetscInt)local_o_start[0],(PetscInt)local_o_start[1]);
>>   PetscPrintf(PETSC_COMM_SELF,"local_i_start: %d %d\n",(PetscInt)local_i_start[0],(PetscInt)local_i_start[1]);
>> 
>> but you do not check anything about the layout DMDACreate2d() selected. First you need to call DMDAGetInfo() and DMDAGetLocalInfo() to see the layout PETSc is using and make sure it matches PFFT's.
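>> 
>> For example, something along these lines (a sketch; "da" stands for the DMDA from your DMDACreate2d() call):
>> 
>>   DMDALocalInfo info;
>>   DMDAGetLocalInfo(da,&info);
>>   /* owned corner (xs,ys) and size (xm,ym) of this process's piece;
>>      compare against PFFT's local_i_start and local_ni above */
>>   PetscPrintf(PETSC_COMM_SELF,"DMDA corner: %d %d size: %d %d\n",
>>               (int)info.xs,(int)info.ys,(int)info.xm,(int)info.ym);
>> 
>> Note also that FFTW-style arrays are stored in row-major (C) order, so PFFT's first dimension is the slowest-varying one, while in a DMDA the x direction is the fastest-varying; that alone can make the two layouts disagree.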
>> 
>> 
>>    Barry
>> 
>> 
>>> On Nov 23, 2015, at 1:31 AM, Giuseppe Pitton <gpitton at sissa.it> wrote:
>>> 
>>> Dear users and developers,
>>> I am trying to interface PETSc with the parallel fast Fourier transform library PFFT (https://www-user.tu-chemnitz.de/~potts/workgroup/pippig/software.php.en), which is in turn based on FFTW. My plan is to build a spectral differentiation code, and the attached files contain a simple example.
>>> The code works correctly in serial, but in parallel there are some problems with the output of the results. I think this is due to some difference in the way PETSc and PFFT store data, but I am not sure this is really the issue.
>>> In the attached code, the number of processors should be specified at compile time in the variable "ncpus". As long as ncpus = 1, everything works fine, but if ncpus = 2 or a higher power of 2, the code terminates correctly yet the results show some artifacts, as you can see from the generated HDF5 file, named "output-pfft.h5".
>>> In the makefile the variables PFFTINC, PFFTLIB, FFTWINC and FFTWLIB should be set correctly.
>>> Thank you,
>>> Giuseppe
>>> 
>>> <makefile.txt><test-pfft.c>
> 
> <test-fftw.c>


