[petsc-users] segfault in MatAssemblyEnd() when using large matrices on multi-core MAC OS-X
Ronald M. Caplan
caplanr at predsci.com
Mon Jul 30 17:12:13 CDT 2012
Attached is the code. The original code, which segfaults with more than one
core, is the one I sent last week.
- Ron C
On Mon, Jul 30, 2012 at 3:09 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> On Mon, Jul 30, 2012 at 3:04 PM, Ronald M. Caplan <caplanr at predsci.com>wrote:
>
>> I seem to have solved the problem.
>>
>> I was storing my entire matrix on node 0 and then calling MatAssembly
>> (begin and end) on all nodes (which should have worked...).
>>
>> Apparently I was using too much space for the buffering or the like,
>> because when I changed the code so that each node sets its own matrix
>> values, MatAssemblyEnd no longer segfaults.
>>
>
> Can you send the test case? It shouldn't segfault unless the machine runs
> out of memory (and most desktop systems have overcommit, so the system will
> kill arbitrary processes, not necessarily the job that did the latest
> malloc).
>
> In practice, you should call MatAssemblyBegin(...,MAT_FLUSH_ASSEMBLY)
> periodically.
>
>
>>
>> Why should this be the case? How many elements of a vector or matrix
>> can a single node "set" before assembly distributes them over all nodes?
>>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: petsctest.F
Type: application/octet-stream
Size: 19329 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20120730/81f16926/attachment-0001.obj>