[petsc-users] MatView question
Matthew Knepley
knepley at gmail.com
Mon Dec 5 06:57:55 CST 2011
On Mon, Dec 5, 2011 at 6:48 AM, Treue, Frederik <frtr at risoe.dtu.dk> wrote:
> Hi,
>
> I’m confused about the MatView command, or rather, its output. The
> following lines of code:
>
> int rank;
> MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>
> /* print the whole matrix */
> MatView((*FD).ddx,PETSC_VIEWER_STDOUT_WORLD);
>
> /* lr is one past the last locally owned row */
> int fr,lr;
> MatGetOwnershipRange((*FD).ddx,&fr,&lr);
> printf("process %d has rows %d to %d\n",rank,fr,lr);
>
> produce the following output:
>
> >mpirun -np 4 ./petsclapl
>
> Matrix Object: 1 MPI processes
>   type: mpiaij
> row 0: (0, 5) (1, 5) (12, 0) (13, 0) (108, 0) (109, 0)
> row 1: (0, -5) (1, 0) (2, 5) (12, 0) (13, 0) (14, 0) (108, 0) (109, 0) (110, 0)
>
> [snip]
>
> process 0 has rows 0 to 30
> process 1 has rows 30 to 60
> process 2 has rows 60 to 90
> process 3 has rows 90 to 120
>
> My question is this: how come the MatView command states 1 MPI processes,
> while GetOwnershipRange reports the (expected) division of ¼ of the rows
> to each process? Is the “processes” in MatView the process of the viewer?
> And more importantly, where does my matrix actually live: on 1 process or
> on 4?
>
This is a little confusing. To print the matrix, we gather it onto process
0 first, which is why the viewer reports 1 process. The matrix itself stays
distributed across all 4 processes, exactly as the ownership ranges show.
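
If you want to see the layout explicitly, here is a minimal sketch (the
120x120 size, the mpiaij type, and the preallocation guess of 9 nonzeros
per row are assumptions chosen to match your output; error checking is
omitted for brevity):

  #include <petscmat.h>

  int main(int argc,char **argv)
  {
    Mat         A;
    PetscInt    fr,lr;
    PetscMPIInt rank;

    PetscInitialize(&argc,&argv,NULL,NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD,&rank);

    MatCreate(PETSC_COMM_WORLD,&A);
    MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,120,120);
    MatSetType(A,MATMPIAIJ);
    MatMPIAIJSetPreallocation(A,9,PETSC_NULL,9,PETSC_NULL);
    /* ... set entries with MatSetValues() here ... */
    MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

    /* Gathers the entries onto rank 0 and prints them once, hence the
       "1 MPI processes" header, even though A stays distributed. */
    MatView(A,PETSC_VIEWER_STDOUT_WORLD);

    /* Each rank owns the contiguous row block [fr,lr); synchronized
       printf makes the lines come out in rank order. */
    MatGetOwnershipRange(A,&fr,&lr);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD,
      "process %d has rows %d to %d\n",rank,(int)fr,(int)lr);
    PetscSynchronizedFlush(PETSC_COMM_WORLD); /* later PETSc versions take
                                                 PETSC_STDOUT as a second
                                                 argument */

    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }

If you only want the distribution information (local sizes, nonzero counts)
rather than the entries, you can set the viewer format to
PETSC_VIEWER_ASCII_INFO before the MatView call, or run with -mat_view_info.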
Matt
>
> ---
> yours sincerely
> Frederik Treue
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener