[petsc-users] Parallel TS for ODE

Francesco Brarda brardafrancesco at gmail.com
Wed Mar 31 09:57:51 CDT 2021


Right now this is only a toy example. I do not expect to see any actual improvement as the number of processors increases, or should I?

> On 31 Mar 2021, at 16:43, Stefano Zampini <stefano.zampini at gmail.com> wrote:
> 
> Are you trying to parallelize a 3-equation system? Or are you just using your SIR code to experiment with TS?
> 
> 
>> On Mar 31, 2021, at 5:18 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> 
>> On Wed, Mar 31, 2021 at 10:15 AM Francesco Brarda <brardafrancesco at gmail.com> wrote:
>> Thank you for your advice.
>> I wrote what seems to me a very basic code, but I get this error when I run it with more than 1 processor:
>> Clearly the result 299. is wrong, but I do not understand what I am doing wrong. With 1 processor it works fine.
>> 
>> My guess is that you do VecGetArray() and index the array using global indices rather than local indices, because
>> there is memory corruption in a Vec array.
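
(A minimal sketch of the local-indexing pattern Matt describes; it is not taken from par_sir_model.c, and it assumes a Vec X created with VecCreate/VecSetSizes on PETSC_COMM_WORLD and #include <petscvec.h>. VecGetArray() exposes only the locally owned block, so valid indices run from 0 to the local size minus one.)

   static PetscErrorCode FillVecLocally(Vec X)
   {
     PetscErrorCode ierr;
     PetscInt       rstart, rend, nlocal, i;
     PetscScalar    *x;

     PetscFunctionBeginUser;
     ierr = VecGetOwnershipRange(X, &rstart, &rend);CHKERRQ(ierr);
     ierr = VecGetLocalSize(X, &nlocal);CHKERRQ(ierr);   /* nlocal == rend - rstart */
     ierr = VecGetArray(X, &x);CHKERRQ(ierr);
     for (i = 0; i < nlocal; i++) {
       /* x[i] is the entry with global index rstart + i; writing past index
          nlocal - 1 runs off the owned block and corrupts memory, which is
          what the trace below reports */
       x[i] = (PetscScalar)(rstart + i);
     }
     ierr = VecRestoreArray(X, &x);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }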
>> 
>>   Thanks,
>> 
>>      Matt
>>  
>> steps 150, ftime 15.
>> Vec Object: 2 MPI processes
>>   type: mpi
>> Process [0]
>> 16.5613
>> 2.91405
>> Process [1]
>> 299.
>> [0]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_MPI() line 21 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pdvec.c
>> [0]PETSC ERROR: Block [id=0(16)] at address 0x15812a0 is corrupted (probably write past end of array)
>> [0]PETSC ERROR: Block allocated in VecCreate_MPI_Private() line 514 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pbvec.c
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Memory corruption: https://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
>> [0]PETSC ERROR: Corrupted memory
>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.14.4, unknown 
>> [0]PETSC ERROR: ./par_sir_model on a arch-debug named srvulx13 by fbrarda Wed Mar 31 16:05:22 2021
>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-openblas-dir=/opt/packages/openblas/0.2.13-gcc --download-mpich PETSC_ARCH=arch-debug
>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 310 in /home/fbrarda/petsc/src/sys/memory/mtr.c
>> [0]PETSC ERROR: #2 VecDestroy_MPI() line 21 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pdvec.c
>> [0]PETSC ERROR: #3 VecDestroy() line 396 in /home/fbrarda/petsc/src/vec/vec/interface/vector.c
>> [0]PETSC ERROR: #4 SNESLineSearchReset() line 284 in /home/fbrarda/petsc/src/snes/linesearch/interface/linesearch.c
>> [0]PETSC ERROR: #5 SNESReset() line 3229 in /home/fbrarda/petsc/src/snes/interface/snes.c
>> [0]PETSC ERROR: #6 TSReset() line 2800 in /home/fbrarda/petsc/src/ts/interface/ts.c
>> [0]PETSC ERROR: #7 TSDestroy() line 2856 in /home/fbrarda/petsc/src/ts/interface/ts.c
>> [0]PETSC ERROR: #8 main() line 256 in par_sir_model.c
>> [0]PETSC ERROR: No PETSc Option Table entries
>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> application called MPI_Abort(MPI_COMM_SELF, 256001) - process 0
>> [1]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_MPI() line 21 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pdvec.c
>> [1]PETSC ERROR: Block [id=0(16)] at address 0xbd9520 is corrupted (probably write past end of array)
>> [1]PETSC ERROR: Block allocated in VecCreate_MPI_Private() line 514 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pbvec.c
>> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [1]PETSC ERROR: Memory corruption: https://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
>> [1]PETSC ERROR: Corrupted memory
>> [1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [1]PETSC ERROR: Petsc Release Version 3.14.4, unknown 
>> [1]PETSC ERROR: ./par_sir_model on a arch-debug named srvulx13 by fbrarda Wed Mar 31 16:05:22 2021
>> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-openblas-dir=/opt/packages/openblas/0.2.13-gcc --download-mpich PETSC_ARCH=arch-debug
>> [1]PETSC ERROR: #1 PetscTrFreeDefault() line 310 in /home/fbrarda/petsc/src/sys/memory/mtr.c
>> [1]PETSC ERROR: #2 VecDestroy_MPI() line 21 in /home/fbrarda/petsc/src/vec/vec/impls/mpi/pdvec.c
>> [1]PETSC ERROR: #3 VecDestroy() line 396 in /home/fbrarda/petsc/src/vec/vec/interface/vector.c
>> [1]PETSC ERROR: #4 TSReset() line 2806 in /home/fbrarda/petsc/src/ts/interface/ts.c
>> [1]PETSC ERROR: #5 TSDestroy() line 2856 in /home/fbrarda/petsc/src/ts/interface/ts.c
>> [1]PETSC ERROR: #6 main() line 256 in par_sir_model.c
>> [1]PETSC ERROR: No PETSc Option Table entries
>> [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> application called MPI_Abort(MPI_COMM_SELF, 256001) - process 0
>> 
>>> On 31 Mar 2021, at 12:06, Matthew Knepley <knepley at gmail.com> wrote:
>>> 
>>> On Wed, Mar 31, 2021 at 3:54 AM Francesco Brarda <brardafrancesco at gmail.com> wrote:
>>> Hi everyone!
>>> 
>>> I am trying to solve a system of 3 ODEs (a basic SIR model) with TS. The sequential version works pretty well, but I need to turn it into a parallel version.
>>> I started working with TS not very long ago. There are a few questions I’d like to share with you, and if you have any advice I’d be happy to hear it.
>>> First of all, do I need to use a DM object even if the model is only time dependent? All the examples I found were using that object for the spatial variables when solving PDEs.
>>> 
>>> You do not need one. We use it in examples because it makes it easy to create the data.
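
(For what it’s worth, a minimal DM-free sketch of such a setup; it is not the poster’s code, and the parameter values beta and gamma, the initial conditions, and the use of PETSC_COMM_SELF for a 3-component state are assumptions. With an MPI Vec each rank would own only part of the three entries.)

   #include <petscts.h>

   /* SIR right-hand side: x = (S, I, R) */
   static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec X, Vec F, void *ctx)
   {
     const PetscScalar *x;
     PetscScalar       *f;
     const PetscReal   beta = 0.3, gamma = 0.1;   /* assumed model parameters */
     PetscErrorCode    ierr;

     PetscFunctionBeginUser;
     ierr = VecGetArrayRead(X, &x);CHKERRQ(ierr);
     ierr = VecGetArray(F, &f);CHKERRQ(ierr);
     f[0] = -beta * x[0] * x[1];                  /* dS/dt */
     f[1] =  beta * x[0] * x[1] - gamma * x[1];   /* dI/dt */
     f[2] =  gamma * x[1];                        /* dR/dt */
     ierr = VecRestoreArrayRead(X, &x);CHKERRQ(ierr);
     ierr = VecRestoreArray(F, &f);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }

   /* driver fragment: a Vec of size 3 and a TS, no DM anywhere */
   Vec X;
   TS  ts;
   ierr = VecCreateSeq(PETSC_COMM_SELF, 3, &X);CHKERRQ(ierr);
   /* set the initial S, I, R values in X here */
   ierr = TSCreate(PETSC_COMM_SELF, &ts);CHKERRQ(ierr);
   ierr = TSSetRHSFunction(ts, NULL, RHSFunction, NULL);CHKERRQ(ierr);
   ierr = TSSetFromOptions(ts);CHKERRQ(ierr);
   ierr = TSSolve(ts, X);CHKERRQ(ierr);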
>>>  
>>> When I preallocate the space for the Jacobian matrix, is it better to specify the local or the global sizes?
>>> 
>>> Since you are producing all the Jacobian values yourself, it is whatever is easier in your code, I think.
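
(A sketch of both options for a 3x3 global Jacobian, under the assumption that the default preallocation from MatSetUp() is sufficient for a system this small; nlocal and RHSJacobian are hypothetical names for the local row count and the user’s Jacobian routine.)

   Mat J;
   ierr = MatCreate(PETSC_COMM_WORLD, &J);CHKERRQ(ierr);
   /* option A: give the global sizes and let PETSc pick the local split */
   ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 3, 3);CHKERRQ(ierr);
   /* option B, instead of the call above: give the local sizes and let PETSc
      add them up:
      ierr = MatSetSizes(J, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr); */
   ierr = MatSetFromOptions(J);CHKERRQ(ierr);
   ierr = MatSetUp(J);CHKERRQ(ierr);   /* default preallocation; fine for a 3x3 Jacobian */
   ierr = TSSetRHSJacobian(ts, J, J, RHSJacobian, NULL);CHKERRQ(ierr);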
>>> 
>>>   Thanks,
>>> 
>>>     Matt
>>>  
>>> Best,
>>> Francesco
>>> 
>>> 
>>> -- 
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>> 
>>> https://www.cse.buffalo.edu/~knepley/
>> 
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>> 
>> https://www.cse.buffalo.edu/~knepley/
