[petsc-dev] Segmentation faults in MatMatMult & MatTransposeMatMult

Jed Brown jed at jedbrown.org
Mon Jan 14 23:26:52 CST 2019


We should repair the MPI matrix implementations so that this works on communicators of size 1, but why can't you use MatXAIJSetPreallocation()?

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatXAIJSetPreallocation.html
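
For reference, a minimal standalone sketch of the type-agnostic preallocation being suggested here; the sizes and per-row nonzero counts are placeholders, not taken from Pierre's program. The single MatXAIJSetPreallocation() call covers SeqAIJ and MPIAIJ alike, so no comm_size branching is needed:

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    PetscInt       N = 100, nlocal = PETSC_DECIDE, i;
    PetscInt       *dnnz, *onnz;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* Split the rows first so the per-row counts can be sized locally */
    ierr = PetscSplitOwnership(PETSC_COMM_WORLD, &nlocal, &N);CHKERRQ(ierr);
    ierr = PetscMalloc2(nlocal, &dnnz, nlocal, &onnz);CHKERRQ(ierr);
    for (i = 0; i < nlocal; i++) { dnnz[i] = 3; onnz[i] = 2; } /* placeholder estimates */

    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, nlocal, nlocal, N, N);CHKERRQ(ierr);
    ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr); /* SeqAIJ on one process, MPIAIJ on more */
    /* One call handles both cases; bs = 1 for plain AIJ, and the trailing NULLs are the
       upper-triangular counts only needed for SBAIJ */
    ierr = MatXAIJSetPreallocation(A, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);

    ierr = PetscFree2(dnnz, onnz);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }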

Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> writes:

> Cf. the end of my sentence: "(I know, I could switch to SeqAIJ_SeqDense, but that is not an option I have right now)”
> All my Mat are of type MATMPIX. Switching to MATX here as you suggested would mean that I need to add a bunch of if(comm_size == 1) MatSeqXSetPreallocation else MatMPIXSetPreallocation in the rest of my code, which is something I would rather avoid.
>
> Thanks,
> Pierre
>
>> On 14 Jan 2019, at 10:30 PM, Zhang, Hong <hzhang at mcs.anl.gov> wrote:
>> 
>> Replace
>> ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
>> with
>> ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
>> 
>> Replace
>> ierr = MatSetType(B, MATMPIDENSE);CHKERRQ(ierr);
>> with
>> ierr = MatSetType(B, MATDENSE);CHKERRQ(ierr);
>> 
>> Then add
>> MatSeqAIJSetPreallocation()
>> MatSeqDenseSetPreallocation()
>> 
>> Hong
>> 
>> On Mon, Jan 14, 2019 at 2:51 PM Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
>> Hello,
>> Is there any chance to get MatMatMult_MPIAIJ_MPIDense and MatTransposeMatMult_MPIAIJ_MPIDense fixed so that the attached program could run _with a single_ process? (I know, I could switch to SeqAIJ_SeqDense, but that is not an option I have right now)
>> 
>> Thanks in advance,
>> Pierre
>> 
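
For reference, a standalone sketch of what Hong's suggestion could look like; Pierre's attachment is not reproduced in the archive, so the sizes and values below are placeholders rather than his actual test case. As far as I know, preallocation routines that do not match the actual matrix type are simply ignored, so both the Seq and MPI variants can be called unconditionally, which sidesteps the comm_size branching Pierre mentions:

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat            A, B, C, D;
    PetscInt       n = 100, k = 10, i, rstart, rend;
    PetscScalar    one = 1.0;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);                          /* SeqAIJ on one process, MPIAIJ on more */
    ierr = MatSeqAIJSetPreallocation(A, 3, NULL);CHKERRQ(ierr);          /* ignored unless A is SeqAIJ */
    ierr = MatMPIAIJSetPreallocation(A, 3, NULL, 2, NULL);CHKERRQ(ierr); /* ignored unless A is MPIAIJ */

    ierr = MatCreate(PETSC_COMM_WORLD, &B);CHKERRQ(ierr);
    ierr = MatSetSizes(B, PETSC_DECIDE, PETSC_DECIDE, n, k);CHKERRQ(ierr);
    ierr = MatSetType(B, MATDENSE);CHKERRQ(ierr);                        /* SeqDense on one process, MPIDense on more */
    ierr = MatSeqDenseSetPreallocation(B, NULL);CHKERRQ(ierr);           /* ignored unless B is SeqDense */
    ierr = MatMPIDenseSetPreallocation(B, NULL);CHKERRQ(ierr);           /* ignored unless B is MPIDense */

    /* Placeholder entries: identity in A, zeros in B */
    ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {
      ierr = MatSetValues(A, 1, &i, 1, &i, &one, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    /* The products from the original report: C = A*B and D = A^T*B */
    ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);
    ierr = MatTransposeMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &D);CHKERRQ(ierr);

    ierr = MatDestroy(&D);CHKERRQ(ierr);
    ierr = MatDestroy(&C);CHKERRQ(ierr);
    ierr = MatDestroy(&B);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }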

