[petsc-users] Petsc Matrix Redistribution
Matthew Knepley
knepley at gmail.com
Thu Dec 19 17:44:31 CST 2013
On Thu, Dec 19, 2013 at 5:20 PM, James A Charles <charlesj at purdue.edu> wrote:
> My original matrix is serial. I then want to distribute this across MPI
> processes for use with Arpack. MatGetSubMatrix wouldn't change
> communicators, right?
>
When you create the matrix, use PETSC_COMM_WORLD and 0 sizes on the other
procs.
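
A minimal sketch of that setup, assuming a square AIJ matrix with the
global size of roughly 20,000 mentioned in the question below; error
checking (CHKERRQ) is omitted:

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    PetscMPIInt rank;
    PetscInt    N = 20000;   /* global size, as in the question below */
    PetscInt    nlocal;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* All N rows on rank 0, zero local rows everywhere else. The
       matrix still lives on PETSC_COMM_WORLD, so it can later be
       redistributed without changing communicators. */
    nlocal = (rank == 0) ? N : 0;

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, nlocal, nlocal, N, N);
    MatSetType(A, MATAIJ);
    MatSetUp(A);

    /* ... rank 0 fills the block-tridiagonal entries with MatSetValues() ... */

    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }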
Matt
> ----- Original Message -----
> From: "Matthew Knepley" <knepley at gmail.com>
> To: "James A Charles" <charlesj at purdue.edu>
> Cc: petsc-users at mcs.anl.gov
> Sent: Thursday, December 19, 2013 5:58:25 PM
> Subject: Re: [petsc-users] Petsc Matrix Redistribution
>
> On Thu, Dec 19, 2013 at 4:44 PM, James A Charles <charlesj at purdue.edu>
> wrote:
>
>
> Hello,
>
> I want to redistribute a matrix across MPI processes (block tridiagonal of
> around size 20,000) for use with Arpack so that I can solve for eigenpairs
> in parallel. Is this possible, and if so, what is the best way to do this?
>
> The easiest way is probably
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSubMatrix.html
> (see the sketch below the quoted message), but there is also
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatPermute.html.
>
> Matt
>
> Thanks,
> James Charles
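For the MatGetSubMatrix route suggested above, a minimal sketch,
assuming A is the square N x N matrix built on PETSC_COMM_WORLD with
all rows on rank 0 as above; error checking is again omitted, and note
that in PETSc 3.8 and later this call is named MatCreateSubMatrix:

  Mat      Adist;
  IS       isrow, iscol;
  PetscInt N = 20000, nlocal = PETSC_DECIDE, rstart;

  /* Let PETSc choose an even split of the N rows across the ranks. */
  PetscSplitOwnership(PETSC_COMM_WORLD, &nlocal, &N);

  /* First global row this rank owns under the new layout. */
  MPI_Scan(&nlocal, &rstart, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
  rstart -= nlocal;

  /* Index sets describing the new row and column ownership. */
  ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart, 1, &isrow);
  ISCreateStride(PETSC_COMM_WORLD, nlocal, rstart, 1, &iscol);

  /* Extract the whole matrix with the new parallel layout; Adist has
     the same communicator as A. */
  MatGetSubMatrix(A, isrow, iscol, MAT_INITIAL_MATRIX, &Adist);

  ISDestroy(&isrow);
  ISDestroy(&iscol);

Adist then holds an even share of the rows on every process and can be
handed to the parallel eigensolver.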
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener