[petsc-users] Meaning of the order of PETSC_VIEWER_ASCII_INDEX format in PETSc
Smith, Barry F.
bsmith at mcs.anl.gov
Wed Feb 7 13:29:51 CST 2018
Here is the relevant code from VecView_MPI_ASCII():
    if (format == PETSC_VIEWER_ASCII_INDEX) {
      ierr = PetscViewerASCIIPrintf(viewer,"%D: ",cnt++);CHKERRQ(ierr);
    }
#if defined(PETSC_USE_COMPLEX)
    if (PetscImaginaryPart(xarray[i]) > 0.0) {
      ierr = PetscViewerASCIIPrintf(viewer,"%g + %g i\n",(double)PetscRealPart(xarray[i]),(double)PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
    } else if (PetscImaginaryPart(xarray[i]) < 0.0) {
      ierr = PetscViewerASCIIPrintf(viewer,"%g - %g i\n",(double)PetscRealPart(xarray[i]),-(double)PetscImaginaryPart(xarray[i]));CHKERRQ(ierr);
    } else {
      ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)PetscRealPart(xarray[i]));CHKERRQ(ierr);
    }
#else
    ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)xarray[i]);CHKERRQ(ierr);
#endif
  }
  /* receive and print messages */
  for (j=1; j<size; j++) {
    ierr = MPI_Recv(values,(PetscMPIInt)len,MPIU_SCALAR,j,tag,PetscObjectComm((PetscObject)xin),&status);CHKERRQ(ierr);
    ierr = MPI_Get_count(&status,MPIU_SCALAR,&n);CHKERRQ(ierr);
    if (format != PETSC_VIEWER_ASCII_COMMON) {
      ierr = PetscViewerASCIIPrintf(viewer,"Process [%d]\n",j);CHKERRQ(ierr);
    }
    for (i=0; i<n; i++) {
      if (format == PETSC_VIEWER_ASCII_INDEX) {
        ierr = PetscViewerASCIIPrintf(viewer,"%D: ",cnt++);CHKERRQ(ierr);
      }
#if defined(PETSC_USE_COMPLEX)
      if (PetscImaginaryPart(values[i]) > 0.0) {
        ierr = PetscViewerASCIIPrintf(viewer,"%g + %g i\n",(double)PetscRealPart(values[i]),(double)PetscImaginaryPart(values[i]));CHKERRQ(ierr);
      } else if (PetscImaginaryPart(values[i]) < 0.0) {
        ierr = PetscViewerASCIIPrintf(viewer,"%g - %g i\n",(double)PetscRealPart(values[i]),-(double)PetscImaginaryPart(values[i]));CHKERRQ(ierr);
      } else {
        ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)PetscRealPart(values[i]));CHKERRQ(ierr);
      }
#else
      ierr = PetscViewerASCIIPrintf(viewer,"%g\n",(double)values[i]);CHKERRQ(ierr);
#endif
    }
  }
So each process ships its values to process zero, which prints them in order; the single counter cnt is what produces the consecutive global indices across all the per-process chunks.
Note that printing out vectors and matrices as ASCII is only for toy problems and debugging. For large runs one should always use some variant of binary output.
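A minimal sketch of binary output with the PETSc C API (not from the thread; the vector size, file name, and VecSet value are placeholders, and the program must be compiled and linked against PETSc):

```c
#include <petscvec.h>

int main(int argc,char **argv)
{
  Vec            x;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);CHKERRQ(ierr);
  ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,100,&x);CHKERRQ(ierr);
  ierr = VecSet(x,1.0);CHKERRQ(ierr);

  /* write in PETSc's portable binary format instead of ASCII */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"output.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(x,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```

A vector written this way can later be read back with VecLoad on any number of processes.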
Barry
> On Feb 7, 2018, at 1:08 PM, Paula Sanematsu <paulasan at gmail.com> wrote:
>
> I am using PETSc 3.7.6 and Fortran.
>
> I am trying to output a PETSc vector that contains the solution of a linear system. I am using VecView with the PETSC_VIEWER_ASCII_INDEX format as follows:
>
> call PetscViewerASCIIOpen(PETSC_COMM_WORLD,"output.dat",viewer,ierr)
> call PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INDEX,ierr)
> call VecView(myVec,viewer,ierr)
>
> When I run with 4 processors, my output file looks like:
>
> Vec Object: 4 MPI processes
> type: mpi
> Process [0]
> 0: 30.7501
> 1: 164.001
> 2: 41.0001
> 3: 164.001
> .
> .
> .
> Process [1]
> 4988: 60.1443
> 4989: 157.257
> 4990: 271.518
> 4991: 366.669
> .
> .
> .
> Process [2]
> 9977: 114.948
> 9978: -77.2896
> 9979: 823.142
> 9980: -1096.19
> .
> .
> .
> Process [3]
> 14916: 0.
> 14917: 4.4056
> 14918: 2.08151
> 14919: -0.110862
> .
> .
> .
> 19843: 0.
>
> My question is: does each processor output the part of the vector that it owns? Or does PETSc collect each processor's part and then processor 0 sequentially outputs the 1st quarter of the global vector, processor 1 outputs the 2nd quarter of the global vector, processor 2 outputs the 3rd quarter of the global vector, and so on? Or does PETSc do something else?
>
> Thank you!
>
> Paula
>