[petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
Matthew Knepley
knepley at gmail.com
Fri Nov 23 12:25:13 CST 2018
The other possibility is that you are using the "wrong" mpiexec.
Matt
On Fri, Nov 23, 2018 at 1:04 PM Klaus Burkart via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> The output is:
>
> local_size = 25, on process 0
> rstart = 0, on process 0
> rend = 25, on process 0
> local_size = 25, on process 1
> rstart = 0, on process 1
> rend = 25, on process 1
> local_size = 25, on process 2
> rstart = 0, on process 2
> rend = 25, on process 2
> local_size = 25, on process 3
> rstart = 0, on process 3
> rend = 25, on process 3
>
> Using PETSC_COMM_SELF has no effect.
>
> On Friday, November 23, 2018, 18:02:45 CET, zakaryah <
> zakaryah at gmail.com> wrote:
>
>
> What does your output look like? With PETSC_COMM_WORLD, I think you will
> only get the output from the rank 0 process. Try with PETSC_COMM_SELF.
>
> On Fri, Nov 23, 2018, 11:56 AM Matthew Knepley via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> On Fri, Nov 23, 2018 at 9:54 AM Klaus Burkart via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hello,
>
> I am trying to compute the local row ranges allocated to the processes
> i.e. rstart and rend of each process, needed as a prerequisite for
> MatMPIAIJSetPreallocation using d_nnz and o_nnz.
>
> I tried the following:
>
> ...
>
> PetscInitialize(0,0,PETSC_NULL,PETSC_NULL);
>
> MPI_Comm_size(PETSC_COMM_WORLD,&size);
> MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>
> MatCreate(PETSC_COMM_WORLD,&A);
> MatSetType(A,MATMPIAIJ);
> PetscInt local_size = PETSC_DECIDE;
> PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
> MPI_Scan(&local_size, &rend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
>
>
> This looks right to me. Not sure what your problem is. However, you can
> always use PetscLayout to do
> this automatically.
>
> Thanks,
>
> Matt
>
>
> rstart = rend - local_size;
> PetscInt d_nnz[local_size], o_nnz[local_size];
> /*
>
> compute d_nnz and o_nnz here
>
> MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);
> */
>
> for (rank = 0; rank < size; rank++) {
>     PetscPrintf(PETSC_COMM_WORLD, "local_size = %d, on process %d\n", local_size, rank);
>     PetscPrintf(PETSC_COMM_WORLD, "rstart = %d, on process %d\n", rstart, rank);
>     PetscPrintf(PETSC_COMM_WORLD, "rend = %d, on process %d\n", rend, rank);
> }
>
> PetscFinalize();
>
> The local size is 25 rows on each process, but rstart and rend are 0 and 25
> on all processes. I expected 0 and 25, 25 and 50, 50 and 75, and 75 and 100.
> N = 100
>
> I can't spot the error. Any ideas what the problem is?
>
> Klaus
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/