[petsc-users] How to view petsc array?

Dave May dave.mayhem23 at gmail.com
Wed Aug 19 04:56:35 CDT 2015


On 19 August 2015 at 11:08, TAY wee-beng <zonexo at gmail.com> wrote:

>
> On 19/8/2015 4:58 PM, Dave May wrote:
>
>
>
> On 19 August 2015 at 10:54, TAY wee-beng <zonexo at gmail.com> wrote:
>
>>
>> On 21/7/2015 7:28 PM, Matthew Knepley wrote:
>>
>> On Tue, Jul 21, 2015 at 1:35 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I need to check the contents of the array which was declared using:
>>>
>>> PetscScalar,pointer :: u_array(:,:,:),v_array(:,:,:),w_array(:,:,:),p_array(:,:,:)
>>>
>>> I tried to use :
>>>
>>> call PetscViewerASCIIOpen(MPI_COMM_WORLD,"pres.txt",viewer,ierr)
>>>
>>> call VecView(p_array,viewer,ierr)
>>>
>>> or
>>>
>>> call MatView(p_array,viewer,ierr)
>>>
>>> call PetscViewerDestroy(viewer,ierr)
>>>
>>> but I got a segmentation error. So is there a PETSc routine I can use?
>>
>>
>> No. Those routines work only on PETSc objects (a Vec for VecView(), a
>> Mat for MatView()), not on raw arrays. You could
>>
>>  a) Declare a DMDA of the same size
>>
>>  b) Use DMDAVecGetArrayF90() to get out the multidimensional array
>>
>>  c) Use that in your code
>>
>>  d) Use VecView() on the original vector (see the sketch below)
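>>
>> A minimal Fortran sketch of steps (a)-(d), assuming a 3D scalar field of
>> global size size_x*size_y*size_z with one degree of freedom; variable
>> names such as da_p and p_global are illustrative, not from the original
>> code:
>>
>> DM :: da_p
>> Vec :: p_global
>> PetscScalar, pointer :: p_array(:,:,:)
>> PetscViewer :: viewer
>> PetscErrorCode :: ierr
>>
>> ! (a) create a DMDA matching the dimensions of the array
>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE, &
>>      DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,size_z, &
>>      PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,1,1, &
>>      PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da_p,ierr)
>> call DMCreateGlobalVector(da_p,p_global,ierr)
>>
>> ! (b) access the Vec through a multidimensional Fortran array
>> call DMDAVecGetArrayF90(da_p,p_global,p_array,ierr)
>> ! (c) ... use p_array(i,j,k) in your code ...
>> call DMDAVecRestoreArrayF90(da_p,p_global,p_array,ierr)
>>
>> ! (d) view the original vector, not the array
>> call PetscViewerASCIIOpen(MPI_COMM_WORLD,"pres.txt",viewer,ierr)
>> call VecView(p_global,viewer,ierr)
>> call PetscViewerDestroy(viewer,ierr)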
>>
>>
>> Hi,
>>
>> Supposed I need to check the contents of the u_array which was declared
>> using:
>>
>> PetscScalar,pointer :: u_array(:,:,:)
>>
>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
>>      size_z,1,PETSC_DECIDE,PETSC_DECIDE,1,stencil_width,lx,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da_u,ierr)
>>
>> call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
>>
>> call PetscViewerASCIIOpen(MPI_COMM_WORLD,"u.txt",viewer,ierr)
>>
>> call VecView(u_array,viewer,ierr)
>>
>
>
> The first argument of VecView() must be of type Vec (as Matt noted).
> It looks like you are passing in an array of PetscScalars.
>
>
> Oh, so should it be:
>
> Vec u_global,u_local
>
> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
>      size_z,1,PETSC_DECIDE,PETSC_DECIDE,1,stencil_width,lx,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da_u,ierr)
>
> call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
>
> call PetscViewerASCIIOpen(MPI_COMM_WORLD,"u.txt",viewer,ierr)
>
> call VecView(u_local,viewer,ierr)
>
> call PetscViewerDestroy(viewer,ierr)
>


Yes, the argument types now match.

However, if you run this in parallel there will be two issues:

(1) You have a different local vector on each process, so you will need to
use a unique file name per process, e.g. "u-rankXXX.txt", to avoid the
processes overwriting each other's data.

(2) You need to make sure that the communicators used for the viewer and
the vector are the same.

To implement (1) and (2), you could do something like this:

MPI_Comm comm;
PetscMPIInt rank;
char filename[PETSC_MAX_PATH_LEN];
PetscViewer viewer;
PetscErrorCode ierr;

/* build a unique per-process file name */
ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
ierr = PetscSNPrintf(filename,PETSC_MAX_PATH_LEN-1,"u-%d.txt",rank);CHKERRQ(ierr);
/* open the viewer on the same communicator as the vector */
ierr = PetscObjectGetComm((PetscObject)u_local,&comm);CHKERRQ(ierr);
ierr = PetscViewerASCIIOpen(comm,filename,&viewer);CHKERRQ(ierr);
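
For reference, a rough Fortran equivalent of the snippet above might look
like the following; the write format used to build the filename is one
possible choice, not from the original message:

PetscMPIInt :: rank
MPI_Comm :: comm
character(len=PETSC_MAX_PATH_LEN) :: filename
PetscViewer :: viewer
PetscErrorCode :: ierr

call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
! build a unique per-process file name, e.g. u-0.txt, u-1.txt, ...
write(filename,'(A,I0,A)') 'u-',rank,'.txt'
! open the viewer on the same communicator as the local vector
call PetscObjectGetComm(u_local,comm,ierr)
call PetscViewerASCIIOpen(comm,trim(filename),viewer,ierr)
call VecView(u_local,viewer,ierr)
call PetscViewerDestroy(viewer,ierr)

Since u_local is a sequential (per-process) vector, its communicator is
effectively PETSC_COMM_SELF, so each process writes its own file.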




>
>
>
>
>>
>> call PetscViewerDestroy(viewer,ierr)
>>
>> Is this the correct way?
>>
>>
>>    Matt
>>
>>
>>>
>>> --
>>> Thank you
>>>
>>> Yours sincerely,
>>>
>>> TAY wee-beng
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>>
>>
>
>