[petsc-users] coupling with Matlab and parallel solution
Benjamin Sanderse
B.Sanderse at cwi.nl
Fri Sep 10 17:00:02 CDT 2010
Hi Jed,
I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The matrix that I want to solve results from discretizing the Laplacian, so in 3D it basically consists of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners?
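To make the "7 diagonals" structure concrete, here is a minimal sketch in Python/NumPy (not PETSc code; a dense stand-in purely to show the sparsity pattern of the 7-point Laplacian on an nx*ny*nz Cartesian grid, with uniform spacing assumed for brevity -- non-uniform spacing only changes the entry values, not the pattern):

```python
import numpy as np

def laplacian_7pt(nx, ny, nz, hx=1.0, hy=1.0, hz=1.0):
    """Dense 7-point Laplacian on an nx*ny*nz Cartesian grid.

    Uniform spacing is assumed here for brevity; a non-uniform mesh
    changes the coefficient values but not the 7-diagonal structure.
    """
    n = nx * ny * nz
    A = np.zeros((n, n))

    def idx(i, j, k):
        # Lexicographic ordering: i fastest, then j, then k.
        return i + nx * (j + ny * k)

    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                r = idx(i, j, k)
                A[r, r] = 2.0 / hx**2 + 2.0 / hy**2 + 2.0 / hz**2
                if i > 0:
                    A[r, idx(i - 1, j, k)] = -1.0 / hx**2
                if i < nx - 1:
                    A[r, idx(i + 1, j, k)] = -1.0 / hx**2
                if j > 0:
                    A[r, idx(i, j - 1, k)] = -1.0 / hy**2
                if j < ny - 1:
                    A[r, idx(i, j + 1, k)] = -1.0 / hy**2
                if k > 0:
                    A[r, idx(i, j, k - 1)] = -1.0 / hz**2
                if k < nz - 1:
                    A[r, idx(i, j, k + 1)] = -1.0 / hz**2
    return A

A = laplacian_7pt(4, 4, 4)
# Nonzeros lie on exactly 7 diagonals: offsets 0, +-1, +-nx, +-nx*ny.
offsets = sorted({c - r for r, c in zip(*np.nonzero(A))})
print(offsets)              # → [-16, -4, -1, 0, 1, 4, 16]
print(np.allclose(A, A.T))  # symmetric → True
```

The symmetry check is what makes CG applicable; the banded structure is also why an incomplete factorization preconditioner tends to work well for this operator.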
Furthermore, I get these messages:
[1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines.
Does PETSc give a hint here to use other routines, or does it indicate what it is doing?
Ben
On 10 Sep 2010, at 10:38, Jed Brown wrote:
> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab:
>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>
>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>
>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas?
>
> Please provide the output, "doesn't work" is not much information.
>
>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete Cholesky), but apparently this is not implemented in parallel in PETSc. Suggestions on what to take as preconditioner?
>
> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM
> + ICC (-pc_type asm -sub_pc_type icc).
>
> Jed
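Jed's request for the actual output points at the usual failure mode when launching a solver in the background from a host process: stderr goes nowhere and the error is lost. A hedged Python sketch of the idea (the `echo` command below is a hypothetical stand-in for the `petscmpiexec -n 2 ./petsc_poisson_par ...` invocation from the thread; MATLAB's `system` would need the equivalent redirection, e.g. appending `> log.txt 2>&1` before the `&`):

```python
import subprocess

def launch_background(cmd):
    """Start cmd without blocking, but keep stdout/stderr pipes so
    failures are not silently discarded (the likely reason a
    backgrounded launch appears to simply 'not work')."""
    return subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True
    )

# Hypothetical stand-in for:
#   petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600
proc = launch_background(["echo", "solver started"])
out, err = proc.communicate()
print(out.strip())      # → solver started
print(proc.returncode)  # → 0
```

With the output captured, "doesn't work" turns into a concrete message (missing executable, wrong PATH or library environment inside the host process, port already in use) that can actually be debugged.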