[petsc-users] Can I use TAO for embarrassingly parallel problems with multiple threads?
Krzysztof Kamieniecki
krys at kamieniecki.com
Thu Dec 20 09:46:37 CST 2018
Hi Matt,
What does "COMPLETELY independent" mean? Should I call PetscInitialize in
each thread?
We have an "approximate" model where we treat different regions as
independent. It's a "realtime" system where measurements from the different
regions arrives a different times and we want to process regions ASAP.
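
For concreteness, here is roughly the pattern I have in mind. This is only a
minimal sketch: it assumes PETSc built with --with-threadsafety and
--with-mpi=0, PetscInitialize() called exactly once from the main thread
before any worker threads start, and every Vec/Tao object created, used, and
destroyed by a single thread. The objective, solver type, and sizes are
placeholders, not our real model.

#include <pthread.h>
#include <petsctao.h>

/* Placeholder objective/gradient for one region: f(x) = 0.5 * ||x||^2 */
static PetscErrorCode FormObjGrad(Tao tao, Vec x, PetscReal *f, Vec g, void *ctx)
{
  PetscErrorCode ierr;
  ierr = VecCopy(x, g);CHKERRQ(ierr);          /* grad f = x */
  ierr = VecNorm(x, NORM_2, f);CHKERRQ(ierr);
  *f = 0.5 * (*f) * (*f);
  return 0;
}

/* Each thread creates, solves with, and destroys only its own objects. */
static void *SolveRegion(void *arg)
{
  Tao            tao;
  Vec            x;
  PetscErrorCode ierr;

  ierr = VecCreateSeq(PETSC_COMM_SELF, 100, &x);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = VecSet(x, 1.0);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoCreate(PETSC_COMM_SELF, &tao);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoSetType(tao, TAOLMVM);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoSetInitialVector(tao, x);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoSetObjectiveAndGradientRoutine(tao, FormObjGrad, NULL);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoSolve(tao);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = TaoDestroy(&tao);CHKERRABORT(PETSC_COMM_SELF, ierr);
  ierr = VecDestroy(&x);CHKERRABORT(PETSC_COMM_SELF, ierr);
  return NULL;
}

int main(int argc, char **argv)
{
  pthread_t      t[2];
  PetscErrorCode ierr;

  /* PetscInitialize() once, from the main thread, before spawning workers */
  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  for (int i = 0; i < 2; i++) pthread_create(&t[i], NULL, SolveRegion, NULL);
  for (int i = 0; i < 2; i++) pthread_join(t[i], NULL);
  ierr = PetscFinalize();
  return ierr;
}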
Best Regards,
Krys
On Thu, Dec 20, 2018 at 10:35 AM Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Dec 20, 2018 at 10:30 AM Krzysztof Kamieniecki via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Hello All,
>>
>> I have an embarrassingly parallel problem that I would like to use TAO
>> on. Is there some way to do this with threads as opposed to multiple
>> processes?
>>
>
> You should be able to run COMPLETELY independent objects on different
> threads. However, notice that
> there is absolutely no advantage to doing this. Why do you want to run
> independent instances on multiple
> threads?
>
> Thanks,
>
> Matt
>
>
>> I compiled PETSc with the following flags
>> ./configure \
>> --prefix=${DEP_INSTALL_DIR} \
>> --with-threadsafety --with-log=0 --download-concurrencykit \
>> --with-openblas=1 \
>> --with-openblas-dir=${DEP_INSTALL_DIR} \
>> --with-mpi=0 \
>> --with-shared=0 \
>> --with-debugging=0 COPTFLAGS='-O3' CXXOPTFLAGS='-O3' FOPTFLAGS='-O3'
>>
>> When I run TAO in multiple threads I get the error "Called VecxxxEnd() in
>> a different order or with a different vector than VecxxxBegin()"
>>
>> Thanks,
>> Krys
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
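
For reference, the split-phase calls that the quoted error message comes from
look like this when paired correctly on a single thread: the End calls must
come in the same order, and on the same vectors, as the matching Begin calls.
My working assumption is that two threads interleaving these Begin/End calls
through shared bookkeeping is what trips that check, but that is only a guess
on my part.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x, y;
  PetscScalar    dot;
  PetscReal      nrm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = VecCreateSeq(PETSC_COMM_SELF, 100, &x);CHKERRQ(ierr);
  ierr = VecCreateSeq(PETSC_COMM_SELF, 100, &y);CHKERRQ(ierr);
  ierr = VecSet(x, 1.0);CHKERRQ(ierr);
  ierr = VecSet(y, 2.0);CHKERRQ(ierr);

  ierr = VecDotBegin(x, y, &dot);CHKERRQ(ierr);       /* phase 1: post the reductions */
  ierr = VecNormBegin(x, NORM_2, &nrm);CHKERRQ(ierr);
  ierr = VecDotEnd(x, y, &dot);CHKERRQ(ierr);         /* phase 2: collect results, same order, same vectors */
  ierr = VecNormEnd(x, NORM_2, &nrm);CHKERRQ(ierr);

  ierr = PetscPrintf(PETSC_COMM_SELF, "dot = %g, norm = %g\n",
                     (double)PetscRealPart(dot), (double)nrm);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}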