[petsc-users] Regarding changing vector/matrix type at runtime

Mohammad Mirzadeh mirzadeh at gmail.com
Thu Aug 23 13:51:57 CDT 2012


> The CUSP case wouldn't help because chances are you are not holding
> memory in CUSP format living on the device.

That makes sense.

Thanks for the help, Jed.
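
(For reference, a minimal sketch of the no-copy wrapping discussed below.
The local size is an illustrative assumption, error checking (CHKERRQ) is
omitted for brevity, and the block-size argument assumes the PETSc 3.3
signature of VecCreateMPIWithArray.)

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec          x;
  PetscInt     nlocal = 100;   /* locally owned length (assumed) */
  PetscScalar *array;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* stands in for the application's own storage */
  PetscMalloc(nlocal * sizeof(PetscScalar), &array);

  /* x references 'array' directly -- no copy is made, so 'array'
     must outlive x; PETSC_DECIDE lets PETSc sum the global size */
  VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, nlocal, PETSC_DECIDE,
                        array, &x);

  /* ... fill array, use x in a solve ... */

  VecDestroy(&x);
  PetscFree(array);
  PetscFinalize();
  return 0;
}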

On Thu, Aug 23, 2012 at 11:40 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> On Thu, Aug 23, 2012 at 1:25 PM, Mohammad Mirzadeh <mirzadeh at gmail.com>wrote:
>
>> Hi guys,
>>
>> I've added a small function to solve only the linear system in parallel,
>> for problems that are not big enough to justify parallelizing the whole
>> thing but would still take some time to solve in serial -- so far it's
>> been handy. To avoid an extra copy from my own format to PETSc's, I'm
>> using the VecCreateMPIWithArray and MatCreateMPIWithSplitArray functions,
>> and they work with MPI on a small number of processes (4 or so). I have
>> a couple of questions:
>>
>> 1) Is it worth considering the pthread and/or CUSP versions of the
>> matrices for such problems?
>>
>
> The old pthread types are being removed in favor of threading support for
> normal formats.
>
>
>> 2) If so, which would be the better approach: hardwiring vector types
>> in the code, or using the generic VecCreate functions so that the type
>> can be changed at run-time?
>>
>
> Just use VecSetType()/MatSetType() to switch. You should really profile
> just using MatSetValues() to assemble. It's very likely that you are
> needlessly complicating your code by trying to skip a copy.
>
>
>> 3) Are there equivalent versions of the XXXWithArray functions for the
>> CUSP and pthread types? I can't seem to find them in the manual pages.
>>
>
> The CUSP case wouldn't help because chances are you are not holding memory
> in CUSP format living on the device.
>
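
(For reference, a minimal sketch of the approach Jed recommends above:
create generic objects, choose the type at run-time, and assemble with
MatSetValues(). The global size and the diagonal-only assembly loop are
illustrative stand-ins, error checking is omitted for brevity, and the
four-argument KSPSetOperators() matches the PETSc 3.3-era API.)

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      b, x;
  KSP      ksp;
  PetscInt i, rstart, rend, N = 1000;   /* illustrative global size */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Generic creation: the concrete type is chosen at run-time via
     -mat_type (or programmatically with MatSetType()). */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetFromOptions(A);
  MatSetUp(A);   /* default preallocation; preallocate explicitly for speed */

  /* Assemble with MatSetValues() instead of wrapping raw arrays;
     a diagonal stub stands in for the real stencil here. */
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    PetscScalar v = 2.0;
    MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* Same pattern for vectors: -vec_type (or VecSetType()) picks the type */
  VecCreate(PETSC_COMM_WORLD, &b);
  VecSetSizes(b, PETSC_DECIDE, N);
  VecSetFromOptions(b);
  VecDuplicate(b, &x);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  MatDestroy(&A);
  VecDestroy(&b);
  VecDestroy(&x);
  PetscFinalize();
  return 0;
}

The same binary can then be switched at launch time, e.g. something like
"mpiexec -n 4 ./solve -mat_type aijcusp -vec_type cusp" on a CUSP-enabled
build; those option values assume a PETSc 3.3-era configuration.)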
>