[petsc-users] Fwd: Building the same petsc matrix with different numprocs gives different results!
Analabha Roy
hariseldon99 at gmail.com
Wed Sep 25 22:25:49 CDT 2013
On Sep 25, 2013 7:28 PM, "Jed Brown" <jedbrown at mcs.anl.gov> wrote:
>
> Analabha Roy <hariseldon99 at gmail.com> writes:
>
> > Progress Report:
> >
> > So I modified my code to remove all the incorrect uses of MatCopy where I
> > was trying to AllScatter/AllGather a parallel matrix, and replaced them
> > with MatGetRedundantMatrix(). The highlighted diff is
> > here<http://goo.gl/yyJVcV>
>
> Looks fine.
>
> > Now, running it with 1 proc crashes, I guess because the documentation
> > says <http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetRedundantMatrix.html>
> > that only parallel matrices are supported by MatGetRedundantMatrix(),
> > and running with 1 proc means the parallel matrices are sequential
> > (from what I read in the PETSc users manual).
>
> You can use this to run on a single process.
>
> MPI_Comm_size(PetscObjectComm((PetscObject)mat),&size);
> if (size > 1) {
>   MatGetRedundantMatrix(mat,&rmat);
> } else {
>   rmat = mat;
>   PetscObjectReference((PetscObject)rmat);
> }
>
> (In parallel, rmat is a copy, but in serial, rmat is the same matrix.)
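Fleshed out, Jed's branch might look like the sketch below. This is illustrative only: the function name GetSerialCopy is invented, and the MatGetRedundantMatrix() arguments assume the petsc-3.4-era signature (nsubcomm, subcomm, mlocal_red, reuse); check the manual page for the release you build against, since later releases renamed it MatCreateRedundantMatrix() with different arguments.

```c
/* Sketch (hypothetical helper): obtain a sequential copy of a
   possibly-parallel matrix. Assumes the petsc-3.4-era signature of
   MatGetRedundantMatrix(); verify against your PETSc version. */
#include <petscmat.h>

PetscErrorCode GetSerialCopy(Mat mat, Mat *rmat)
{
  PetscErrorCode ierr;
  PetscMPIInt    size;

  PetscFunctionBegin;
  ierr = MPI_Comm_size(PetscObjectComm((PetscObject)mat), &size);CHKERRQ(ierr);
  if (size > 1) {
    /* One redundant sequential copy of mat on every process. */
    ierr = MatGetRedundantMatrix(mat, size, PETSC_COMM_SELF, PETSC_DECIDE,
                                 MAT_INITIAL_MATRIX, rmat);CHKERRQ(ierr);
  } else {
    /* Already sequential: alias mat and bump its reference count, so the
       caller can MatDestroy() *rmat safely in both branches. */
    *rmat = mat;
    ierr  = PetscObjectReference((PetscObject)*rmat);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}
```

Either way the caller ends up owning one reference and should eventually call MatDestroy() on rmat; in parallel that frees the copy, in serial it just drops the extra reference.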
Thanks very much.