[petsc-dev] Matlab implementation of PC

Dave May dave.mayhem23 at gmail.com
Fri Dec 10 06:47:33 CST 2010

Hey Barry,
  Looking through the docs associated with loadlibrary, calllib, etc.,
it wasn't obvious to me how to extract a global variable
(PETSC_COMM_SELF) from a C library.

One possible solution may be to write a function
  PetscErrorCode PetscGetCommSelf(MPI_Comm *comm)
which simply sets *comm = PETSC_COMM_SELF. If this function were part
of libpetsc, it could be accessed using calllib. Since it's a
trivial getter which arguably doesn't belong in libpetsc, it could
instead be wrapped up in a standalone MEX function. Does anyone know a
better way?

Am I missing something obvious related to how to pull out COMM_SELF from petsc?


On 9 December 2010 23:04, Barry Smith <bsmith at mcs.anl.gov> wrote:
> On Dec 9, 2010, at 3:52 PM, Dave May wrote:
>> Ok, after digging a little bit more, I realise I can configure the PC
>> in a command-line-like manner using the syntax '-pc_type','lu'.
>> What I was hoping to use was ML and hypre, however it appears neither
>> of these libraries can be built with --with-mpi=0, which seems to be a
>> requirement when calling petsc from matlab.
>> Is there a straight forward way to enable hypre and ml to be used when
>> --with-mpi=0?
> No, and I wouldn't recommend monkeying with that.
> A much better way to do it would be to "improve" the Matlab interface to support running under MPI; first step of course just sequential (one process MPI).
> This would be done by putting a PETSC_COMM_SELF field into PetscObject as a nonconstant property and then having PetscInitialize.m pull the PETSC_COMM_SELF up from C and put it into PetscObject. Then calls to, for example, KSPCreate() from Matlab would automatically pass that property in (currently they just pass 0 in, which works for MPIUNI).
>   Barry
>> Cheers,
>>  Dave
>> On 9 December 2010 22:26, Dave May <dave.mayhem23 at gmail.com> wrote:
>>> Hi Barry,
>>>  Is it possible that you could possible add PetscPC.m?
>>> I work with a bunch of matlab users solving Stokes on staggered grid who
>>> rely on umfpack. I'd love to be able to get them to try ML or hypre via PETSc.
>>> Cheers,
>>>  Dave
