[petsc-users] PETSc and threads
Jed Brown
jedbrown at mcs.anl.gov
Fri Jan 4 13:38:27 CST 2013
Yes, you can do anything independently on different communicators. For
example, each process can create objects on PETSC_COMM_SELF and solve them
independently. You can also do each solve in parallel on distinct
subcommunicators.
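For the PETSC_COMM_SELF case, something along the following lines would do it (a minimal, untested sketch: error checking is omitted, the KSPSetOperators call uses the current 3.x signature, and the diagonal matrix and right-hand side are just placeholders so the example is self-contained):

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    Vec         b, x;
    KSP         ksp;
    PetscInt    i, n = 100;
    PetscMPIInt rank;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* Serial matrix, owned entirely by this rank */
    MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 1, NULL, &A);
    for (i = 0; i < n; i++) {
      PetscScalar v = (PetscScalar)(i + 1 + rank);   /* placeholder entries */
      MatSetValue(A, i, i, v, INSERT_VALUES);
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    /* Serial right-hand side b_i for this rank, and the solution vector */
    VecCreateSeq(PETSC_COMM_SELF, n, &b);
    VecDuplicate(b, &x);
    VecSet(b, 1.0);

    /* Each rank runs its own independent KSP solve; since every object
       lives on PETSC_COMM_SELF, no communication happens between ranks. */
    KSPCreate(PETSC_COMM_SELF, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

    KSPDestroy(&ksp);
    VecDestroy(&x);
    VecDestroy(&b);
    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }

If you instead want several processes to cooperate on each solve, create a subcommunicator with MPI_Comm_split() and pass it to MatCreateAIJ()/VecCreateMPI()/KSPCreate() in place of PETSC_COMM_SELF.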
On Fri, Jan 4, 2013 at 1:34 PM, Petar Petrovic <ppetrovic573 at gmail.com> wrote:
> MPI processes are fine; I didn't express myself clearly. Let me try with
> another example.
> Let's say I need to solve n linear systems A x = b_i, i = 1..n, and I want
> to do this in parallel. I would like to employ n processes, so that the
> i-th process computes A\b_i, and they all run in parallel. Something like:
>
> for(i=0; i<n; i++)
> solve(Ax=b_i)
>
> where I would like to run this for loop in parallel. Can I do this?
>
> Let me just note that it doesn't necessarily need to be an A\b operation.
> There are parts of my program that are embarrassingly parallel, which I
> would like to exploit.
>
>
> On Fri, Jan 4, 2013 at 8:05 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> This is not supported due to logging/profiling/debugging features. Can
>> you explain more about your problem and why you don't want to use MPI
>> processes?
>>
>>
>> On Fri, Jan 4, 2013 at 12:01 PM, Petar Petrovic <ppetrovic573 at gmail.com> wrote:
>>
>>> Hello,
>>> I have read that PETSc isn't thread safe, but I am not sure I understand
>>> in what way. What I would like to do is execute parts of my code that
>>> make calls to PETSc routines in parallel. For example, if I have matrices
>>> A, B, C, and D, is it possible to call MatMatMult(A,B) on one thread and
>>> MatMatMult(C,D) on another thread, in parallel? Is there an example that
>>> shows how to do something like this, e.g. how to combine PETSc with my
>>> own MPI calls or something similar?
>>> Thank you very much for your help.
>>>
>>
>>
>