[petsc-users] unstructured finite volume method matrix assembly for partitioned mesh

Rahul Praghanmor praghanmor at gmail.com
Wed Jan 18 23:52:59 CST 2012


my apologies...!!
I am using METIS for partitioning the domain.

On Wed, Jan 18, 2012 at 12:11 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Jan 17, 2012 at 12:26 PM, Rahul Praghanmor <praghanmor at gmail.com> wrote:
>
>> Dear Sir,
>>           I am working on a parallel unstructured finite volume
>> solver. The solver runs efficiently in parallel with a Gauss-Seidel
>> linear solver, and the sparse matrix is stored in CSR format. Now I
>> want to use the PETSc library to make the convergence faster, but I
>> am facing a major problem, as discussed below.
>>           Suppose I partition a big domain, say a rectangular duct,
>> into 4 zones using ParMetis. Each zone is solved on a separate
>> processor, and this works well with the Gauss-Seidel solver. But I
>> want to solve these zones with PETSc. How do I do that? How do I
>> form a matrix with the global numbering that PETSc requires? Is it
>> necessary to form a global matrix? Very little information is
>> available on assembling a matrix from an unstructured finite volume
>> method in PETSc.
>>
>
> If you already run ParMetis, just number the rows it puts on each process
> consecutively.
>
>    Matt
>
>
>> Thanks and regards,
>> Rahul.
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
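A minimal sketch of the numbering Matt describes, in PETSc C (this code is
not from the thread; nlocal and the inserted diagonal value are
placeholders): each process declares only how many rows, i.e. owned cells,
it has, PETSc assigns those rows consecutive global indices per process,
and each process inserts only its own rows. No single global matrix is
ever stored on one process.

#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A;
  PetscInt       nlocal = 100;   /* cells owned by this rank (placeholder) */
  PetscInt       rstart,rend,i;
  PetscScalar    v = 1.0;        /* placeholder coefficient */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* Each rank gives only its local row count; PETSc numbers the rows
     consecutively: rank 0 owns rows 0..n0-1, rank 1 owns n0..n0+n1-1, ... */
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,nlocal,nlocal,PETSC_DETERMINE,PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);   /* in practice, preallocate instead */
  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);

  for (i = rstart; i < rend; i++) {
    /* For local cell i-rstart, copy its CSR row here, translating each
       neighbor cell id to that neighbor's global number (its owner's
       row offset plus its local index). Off-process entries are allowed. */
    ierr = MatSetValues(A,1,&i,1,&i,&v,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Couplings across partition boundaries are set with the same MatSetValues
call using the neighbor's global row number; PETSc communicates those
off-process entries during MatAssemblyBegin/MatAssemblyEnd.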



-- 
Zeus Numerix Pvt. Ltd.
I2IT Campus, P-14
Rajiv Gandhi Infotech Park, Phase-1
Hinjewadi Pune -57
Off.   +91 64731511/9965

