[petsc-users] Fwd: Building the same petsc matrix with different numprocs gives different results!

Jed Brown jedbrown at mcs.anl.gov
Wed Sep 25 06:33:15 CDT 2013


Analabha Roy <hariseldon99 at gmail.com> writes:

> Progress Report:
>
>  So I modified my code to remove all the incorrect uses of MatCopy, where
> I was trying to AllScatter/AllGather a parallel matrix, and replaced them
> with MatGetRedundantMatrix(). The highlighted diff is here:
> <http://goo.gl/yyJVcV>

Looks fine.

> Now, running it with 1 proc crashes. I guess this is because the
> documentation
> <http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetRedundantMatrix.html>
> says that only parallel matrices are supported by MatGetRedundantMatrix(),
> and running with 1 proc makes the matrices sequential (from what I read in
> the PETSc users manual).

You can use the following so the code also works on a single process:

  PetscMPIInt size;
  Mat         rmat;

  /* Find out how many processes share the matrix's communicator. */
  MPI_Comm_size(PetscObjectComm((PetscObject)mat),&size);
  if (size > 1) {
    /* Parallel: gather a redundant sequential copy on each process
       (remaining arguments elided; see the MatGetRedundantMatrix man page). */
    MatGetRedundantMatrix(mat,&rmat);
  } else {
    /* Serial: the matrix is already sequential, so just take a reference. */
    rmat = mat;
    PetscObjectReference((PetscObject)rmat);
  }

(In parallel, rmat is a copy, but in serial, rmat is the same matrix.)
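Since the serial branch takes an extra reference, both cases can be released
the same way once you are done with rmat; a minimal sketch:

  /* In parallel this frees the redundant copy; in serial it only drops
     the extra reference, so the original mat stays alive. */
  MatDestroy(&rmat);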

