[petsc-users] How to efficiently fill in, in parallel, a PETSc matrix from a COO sparse matrix?

Matthew Knepley knepley at gmail.com
Tue Jun 20 13:07:49 CDT 2023


On Tue, Jun 20, 2023 at 2:02 PM Diego Magela Lemos <diegomagela at usp.br>
wrote:

> So... what do I need to do, please?
> Why am I getting wrong results when solving the linear system if the
> matrix is filled in with MatSetPreallocationCOO and MatSetValuesCOO?
>

It appears that you have _all_ processes submit _all_ triples (i, j, v).
Duplicate submissions are summed, which is why your entries come out
multiplied by the number of processes. Each triple should be submitted by
only one process. You can fix this in many ways. For example, an easy but
suboptimal way is to have process 0 submit them all and every other process
submit nothing.
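
For reference, a minimal sketch of that rank-0-only variant, assuming (as in
your code) every rank already holds the full coo_i/coo_j/coo_v arrays of
length nnz, and that A has already been created, sized, and had its type set
(the array and size names here are placeholders, not your actual variables):

  PetscMPIInt rank;
  PetscCount  nloc;

  PetscCallMPI(MPI_Comm_rank(PetscObjectComm((PetscObject)A), &rank));
  nloc = (rank == 0) ? (PetscCount)nnz : 0; /* all other ranks contribute nothing */

  /* Preallocation sees nnz triples on rank 0 and zero triples elsewhere */
  PetscCall(MatSetPreallocationCOO(A, nloc, coo_i, coo_j));
  /* Values are matched one-to-one with the triples above; ignored where nloc == 0 */
  PetscCall(MatSetValuesCOO(A, coo_v, INSERT_VALUES));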

  Thanks,

  Matt


> On Tue, Jun 20, 2023 at 2:56 PM Jed Brown <jed at jedbrown.org>
> wrote:
>
>> Matthew Knepley <knepley at gmail.com> writes:
>>
>> >> The matrix entries are multiplied by 2, that is, the number of
>> processes
>> >> used to execute the code.
>> >>
>> >
>> > No. This was mostly intended for GPUs, where there is 1 process. If you
>> > want to use multiple MPI processes, then each process can only introduce
>> > some disjoint subset of the values. This is also how MatSetValues()
>> works,
>> > but it might not be as obvious.
>>
>> They need not be disjoint, just sum to the expected values. This
>> interface is very convenient for FE and FV methods. MatSetValues with
>> ADD_VALUES has similar semantics without the intermediate storage, but it
>> forces you to submit one element matrix at a time. It is the classic
>> parallelism-granularity versus memory-use tradeoff, with MatSetValuesCOO
>> being a clear win on GPUs and more nuanced for CPUs.
>>
>
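
For comparison, a minimal sketch of the MatSetValues/ADD_VALUES element loop
Jed describes; the element range, NODES_PER_ELEM, and ComputeElement routine
below are hypothetical placeholders for your own discretization:

  /* Each rank loops over only the elements it owns; entries shared between
     ranks are simply summed by ADD_VALUES. */
  for (PetscInt e = estart; e < eend; ++e) {
    PetscInt    idx[NODES_PER_ELEM];                 /* global dof indices of this element */
    PetscScalar Ke[NODES_PER_ELEM * NODES_PER_ELEM]; /* dense element matrix, row-major */

    ComputeElement(e, idx, Ke); /* hypothetical user routine */
    PetscCall(MatSetValues(A, NODES_PER_ELEM, idx, NODES_PER_ELEM, idx, Ke, ADD_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));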

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/