[petsc-users] coupling with Matlab and parallel solution
Barry Smith
bsmith at mcs.anl.gov
Fri Sep 10 19:25:32 CDT 2010
On Sep 10, 2010, at 5:00 PM, Benjamin Sanderse wrote:
> Hi Jed,
>
> I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The system that I want to solve results from discretizing the Laplacian, so in 3D the matrix consists basically of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners?
> Furthermore, I get these messages:
> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines.
>
> Does PETSc give a hint here to use other routines, or does it indicate what it is doing?
No, it is just reporting what PETSc is doing internally; you can ignore this message.
I recommend you use BoomerAMG as your preconditioner: -pc_type hypre -pc_hypre_type boomeramg. You must first have configured PETSc with --download-hypre.
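For example, combined with CG, a full run line would look like this (the executable name and process count are just the ones from your earlier message):

petscmpiexec -n 2 ./petsc_poisson_par -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg -ksp_monitor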
BTW: your matrix is so simple it doesn't seem to make sense to generate it in Matlab and ship it over to PETSc; you should generate the matrix in PETSc (and likely the vectors also) and use Matlab only for visualization and the like. If you use the PETSc DA to parallelize the PETSc code and generate the matrix in PETSc, you can use geometric multigrid to solve the system and it will scream in parallel. A sketch of that approach is below.
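Here is a minimal sketch of that DA-based assembly, written against the DMDA form of the interface; the 64^3 grid, unit spacing (so the stencil entries are just 6 and -1), and simplest boundary treatment are placeholders, not your actual non-uniform discretization:

/* Sketch: assemble the 3D 7-point Laplacian on a DMDA-managed grid;
   the matrix is then ready for a parallel KSP solve. */
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM            da;
  Mat           A;
  DMDALocalInfo info;
  PetscInt      i, j, k;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* 7-point (star) stencil, one unknown per grid point; PETSc picks
     the parallel decomposition */
  DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, 64, 64, 64,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
               1, 1, NULL, NULL, NULL, &da);
  DMSetFromOptions(da);
  DMSetUp(da);
  DMCreateMatrix(da, &A);        /* parallel matrix, preallocated for the stencil */
  DMDAGetLocalInfo(da, &info);   /* index ranges owned by this process */

  for (k = info.zs; k < info.zs + info.zm; k++) {
    for (j = info.ys; j < info.ys + info.ym; j++) {
      for (i = info.xs; i < info.xs + info.xm; i++) {
        MatStencil  row = {.i = i, .j = j, .k = k}, col[7];
        PetscScalar v[7];
        PetscInt    n = 0;
        col[n] = row; v[n++] = 6.0;                              /* diagonal */
        if (i > 0)           { col[n] = row; col[n].i--; v[n++] = -1.0; }
        if (i < info.mx - 1) { col[n] = row; col[n].i++; v[n++] = -1.0; }
        if (j > 0)           { col[n] = row; col[n].j--; v[n++] = -1.0; }
        if (j < info.my - 1) { col[n] = row; col[n].j++; v[n++] = -1.0; }
        if (k > 0)           { col[n] = row; col[n].k--; v[n++] = -1.0; }
        if (k < info.mz - 1) { col[n] = row; col[n].k++; v[n++] = -1.0; }
        MatSetValuesStencil(A, 1, &row, n, col, v, INSERT_VALUES);
      }
    }
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* ... KSPCreate, KSPSetOperators, KSPSetFromOptions, KSPSolve ... */

  MatDestroy(&A);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

(In current PETSc, attaching the DM to the solver with KSPSetDM() then makes geometric multigrid available directly from the options database with -pc_type mg.)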
>
> Ben
>
> Op 10 sep 2010, om 10:38 heeft Jed Brown het volgende geschreven:
>
>> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
>>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but at a later stage I would like to issue the PETSc execution command from Matlab again. I tried the following in Matlab:
>>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>>
>>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>>
>>> Neither command works, while issuing the same command in a separate shell works fine. Any ideas?
>>
>> Please provide the output, "doesn't work" is not much information.
>>
>>> - I am using CG to solve a symmetric positive definite system. As a preconditioner I normally use ICC (incomplete Cholesky), but apparently this is not implemented in parallel in PETSc. Suggestions on what to take as a preconditioner?
>>
>> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or
>> ASM + ICC (-pc_type asm -sub_pc_type icc).
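>>
>> For example, combined with the run line from your message, that would be
>> something like:
>>
>>   petscmpiexec -n 2 ./petsc_poisson_par -ksp_type cg -pc_type bjacobi -sub_pc_type icc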
>>
>> Jed
>