[petsc-users] coupling with Matlab and parallel solution

Barry Smith bsmith at mcs.anl.gov
Fri Sep 10 21:18:48 CDT 2010


  Why don't you see how it goes, check where the time is spent for large runs, and then, if need be, move more of the code over to PETSc.

   Barry

On Sep 10, 2010, at 8:13 PM, Benjamin Sanderse wrote:

> Hi Barry,
> 
> Thanks for your comments. I have thought of some of these options as well. There is one (big) thing:
> I do not build the Laplacian simply by programming its entries directly; I generate it as the product of a divergence matrix and a gradient matrix. These divergence and gradient matrices are also used in other parts of my code. In the end, there are a lot of matrix generation routines which very efficiently and elegantly build my entire discretization. I am not sure I can easily transfer that to C or Fortran, since it relies heavily on Matlab's sparse-matrix features (like spdiags).
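> 
> For illustration, here is a minimal 1-D sketch of this construction (a toy example only: uniform spacing, simplified boundary treatment; the actual code is more general):
> 
>     n = 8; h = 1/n;                         % n pressure cells, n+1 velocity faces
>     e = ones(n,1);
>     D = spdiags([-e e], [0 1], n, n+1)/h;   % divergence: face values -> cell values
>     G = -D';                                % gradient as minus the transpose of D
>     L = D*G;                                % Laplacian as a product; -L is SPD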
> 
> Maybe you have suggestions for this? For now I stick to Matlab since I love it for prototyping, but I want to solve the pressure matrix a bit faster and that's why I am looking at this 'quick and dirty' solution.
> 
> Ben
> 
> 
> 
> On 10 Sep 2010, at 18:25, Barry Smith wrote:
> 
>> 
>> On Sep 10, 2010, at 5:00 PM, Benjamin Sanderse wrote:
>> 
>>> Hi Jed,
>>> 
>>> I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The system I want to solve results from discretizing the Laplacian, so in 3D the matrix basically consists of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners?
>>> Furthermore, I get these messages:
>>> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines.
>>> 
>>> Does PETSc give a hint here to use other routines, or does it indicate what it is doing?
>> 
>>    No; it just indicates what it is doing. You can ignore this message.
>> 
>>    I recommend you use BoomerAMG as your preconditioner: -pc_type hypre -pc_hypre_type boomeramg (you must first have configured PETSc with --download-hypre).
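>> 
>>    For example, reusing the executable from your runs (a sketch; adjust to your setup):
>> 
>>       petscmpiexec -n 2 ./petsc_poisson_par -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg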
>> 
>>    BTW: your matrix is so simple that it doesn't seem to make sense to generate it in Matlab and ship it over to PETSc. You should generate the matrix in PETSc (and likely the vectors also) and use Matlab just for visualization and the like. If you use the PETSc DA to parallelize the code and generate the matrix in PETSc, you can use geometric multigrid to solve the system, and it will scream in parallel.
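>> 
>>    For instance (only a sketch; the details depend on your code): create the grid with DACreate3d(), let DAGetMatrix() give you the properly preallocated matrix, and then select multigrid at run time with options along the lines of
>> 
>>       -ksp_type cg -pc_type mg -pc_mg_levels 4
>> 
>>    (see the PETSc users manual for the DA and PCMG details).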
>> 
>>> 
>>> Ben
>>> 
>>> On 10 Sep 2010, at 10:38, Jed Brown wrote:
>>> 
>>>> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
>>>>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but at a later stage I would like to include the PETSc execution command in Matlab again. I tried the following in Matlab:
>>>>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>>>> 
>>>>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &');
>>>>> 
>>>>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas?
>>>> 
>>>> Please provide the output; "doesn't work" is not much information.
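>>>> For example, while debugging it helps to drop the trailing & and
>>>> capture the status and output (a minimal sketch):
>>>> 
>>>>   [status, result] = system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600');
>>>>   disp(status); disp(result)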
>>>> 
>>>>> - I am using CG to solve a symmetric positive definite system. As preconditioner I normally use ICC (incomplete Cholesky), but apparently this is not implemented in parallel in PETSc. Suggestions on what to take as a preconditioner?
>>>> 
>>>> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM
>>>> + ICC (-pc_type asm -sub_pc_type icc).
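>>>> 
>>>> For example, with the executable from your message (just a sketch):
>>>> 
>>>>   petscmpiexec -n 2 ./petsc_poisson_par -ksp_type cg -pc_type bjacobi -sub_pc_type icc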
>>>> 
>>>> Jed
>>> 
>> 
> 
