[petsc-users] VecAssembly gives segmentation fault with MPI
Francesco Migliorini
francescomigliorini93 at gmail.com
Wed Apr 19 05:26:05 CDT 2017
Hello!
I have an MPI code in which a linear system is created and solved with
PETSc. It works in a sequential run, but when I use multiple cores,
VecAssemblyBegin/End give a segmentation fault. Here is a sample of my code:
call PetscInitialize(PETSC_NULL_CHARACTER,perr)
ind(1) = 3*nnod_loc*max_time_deg
call VecCreate(PETSC_COMM_WORLD,feP,perr)
call VecSetSizes(feP,PETSC_DECIDE,ind,perr)
call VecSetFromOptions(feP,perr)
! loop over the local nodes and time degrees, one VecSetValues call per component
do in = 1,nnod_loc
do jt = 1,mm
ind(1) = 3*((in -1)*max_time_deg + (jt-1))
fval(1) = fe(3*((in -1)*max_time_deg + (jt-1)) +1)
call VecSetValues(feP,1,ind,fval(1),INSERT_VALUES,perr)
ind(1) = 3*((in -1)*max_time_deg + (jt-1)) +1
fval(1) = fe(3*((in -1)*max_time_deg + (jt-1)) +2)
call VecSetValues(feP,1,ind,fval(1),INSERT_VALUES,perr)
ind(1) = 3*((in -1)*max_time_deg + (jt-1)) +2
fval(1) = fe(3*((in -1)*max_time_deg + (jt-1)) +3)
call VecSetValues(feP,1,ind,fval(1),INSERT_VALUES,perr)
enddo
enddo
call VecAssemblyBegin(feP,perr)
call VecAssemblyEnd(feP,perr)
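For reference, this is a small diagnostic fragment I could wrap around the
insertion loop (a sketch only, assuming the same includes and declarations as
above; istart, iend, nglob, rank, mpierr and ic are extra local variables I
introduce here). It prints the ownership range and global size that each rank
actually received, and skips any computed index that falls outside the global
size instead of passing it to VecSetValues:
! print the parallel layout each rank actually received
call VecGetOwnershipRange(feP,istart,iend,perr)
call VecGetSize(feP,nglob,perr)
call MPI_Comm_rank(PETSC_COMM_WORLD,rank,mpierr)
print *, 'rank', rank, 'owns rows', istart, 'to', iend-1, 'of', nglob
do in = 1,nnod_loc
  do jt = 1,mm
    do ic = 1,3
      ! same index arithmetic as above, one component at a time
      ind(1) = 3*((in-1)*max_time_deg + (jt-1)) + (ic-1)
      if (ind(1) < 0 .or. ind(1) >= nglob) then
        print *, 'rank', rank, ': index', ind(1), 'outside valid range'
        cycle
      endif
      fval(1) = fe(3*((in-1)*max_time_deg + (jt-1)) + ic)
      call VecSetValues(feP,1,ind,fval(1),INSERT_VALUES,perr)
    enddo
  enddo
enddo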
The vector has roughly 640,000 elements, and I am running on a
high-performance computer, so there should not be any memory issues. Does
anyone know where the problem is and how I can fix it?
Thank you,
Francesco Migliorini