[petsc-users] unable to distribute rows of a matrix across processors while loading
Matthew Knepley
knepley at gmail.com
Sun Aug 27 08:04:51 CDT 2023
On Sun, Aug 27, 2023 at 8:30 AM maitri ksh <maitri.ksh at gmail.com> wrote:
> Hi,
> I am using MatSetSizes() followed by MatLoad() to distribute the rows of a
> *sparse* matrix (*480000x480000*) across the processors. But it seems
> like the entire matrix is being loaded on each processor instead of
> being distributed. What am I missing here?
>
> *code snippet:*
> Mat Js;
> MatType type = MATMPIAIJ;
> PetscViewer viewerJ;
> PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "Js.dat", FILE_MODE_READ, &viewerJ));
> PetscCall(MatCreate(PETSC_COMM_WORLD, &Js));
> PetscCall(MatSetSizes(Js, PETSC_DECIDE, PETSC_DECIDE, N, N));
> PetscCall(MatSetType(Js, type));
> PetscCall(MatLoad(Js, viewerJ));
> PetscCall(PetscViewerDestroy(&viewerJ));
> PetscCall(MatGetLocalSize(Js, &m, &n));
> PetscCall(MatGetSize(Js, &M, &N));
> PetscPrintf(PETSC_COMM_WORLD, "Js,Local rows: %d, Local columns: %d\n", m, n);
>
If this were really running on PETSC_COMM_WORLD, the PetscPrintf above would
produce output only once, but you have 4 lines. Therefore, MPI is misconfigured:
each of the 4 processes is running as its own single-rank job, so each one loads
the entire matrix. I am guessing you launched with an 'mpirun' or 'mpiexec' from
a different MPI than the one PETSc was built with, and therefore the launch did
not work correctly. If you had PETSc install MPI, then the correct launcher is
${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec
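
A quick way to confirm which MPI you are actually launching with is a tiny
check like the following (a minimal sketch, not part of your code): with the
matching mpiexec and -n 4 you should see four lines all reporting size 4,
while a mismatched launcher makes every process report rank 0 of size 1.

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscMPIInt rank, size;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* If the launcher matches the MPI PETSc was built with,
     all ranks share one PETSC_COMM_WORLD */
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d of %d\n", rank, size));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
  PetscCall(PetscFinalize());
  return 0;
}
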
Thanks,
Matt
>
> *Output* of 'mpiexec -n 4 ./check':
> Js,Local rows: 480000, Local columns: 480000
> Js,Local rows: 480000, Local columns: 480000
> Js,Local rows: 480000, Local columns: 480000
> Js,Local rows: 480000, Local columns: 480000
>
>
>
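With a matching mpiexec, that PETSC_COMM_WORLD print should appear exactly
once, and with PETSC_DECIDE each of the 4 ranks should own 120000 of the
480000 rows. If you want each rank to report its share explicitly, here is a
minimal sketch (assuming the same "Js.dat" file; MatLoad can take the global
size from the file, so MatSetSizes is optional):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         Js;
  PetscViewer viewerJ;
  PetscInt    rstart, rend, m, n;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "Js.dat", FILE_MODE_READ, &viewerJ));
  PetscCall(MatCreate(PETSC_COMM_WORLD, &Js));
  PetscCall(MatSetType(Js, MATMPIAIJ));
  PetscCall(MatLoad(Js, viewerJ));   /* global size comes from the file; rows are split evenly by default */
  PetscCall(PetscViewerDestroy(&viewerJ));
  PetscCall(MatGetLocalSize(Js, &m, &n));
  PetscCall(MatGetOwnershipRange(Js, &rstart, &rend));
  /* Each rank prints its own contiguous block of rows */
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
            "owns rows [%" PetscInt_FMT ", %" PetscInt_FMT "), local size %" PetscInt_FMT " x %" PetscInt_FMT "\n",
            rstart, rend, m, n));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
  PetscCall(MatDestroy(&Js));
  PetscCall(PetscFinalize());
  return 0;
}

Run it with the mpiexec from your PETSc build and you should see four
different, contiguous row ranges.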
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/