[petsc-users] Collect Trajectories components
Francesco Brarda
brardafrancesco at gmail.com
Mon May 31 03:25:12 CDT 2021
Thank you very much for the insights. I don’t believe I was clear enough in my previous message; I am sorry.
The final vector I am trying to build should contain only one component of the trajectory (the second one, U[1]) at different times. For instance:
Rank 0:
Second component(t=0)
Second component(t=1)
Second component(t=2)
Rank 1:
Second component(t=3)
Second component(t=4)
Second component(t=5)
Rank 2:
Second component(t=6)
Second component(t=7)
And so on.
Do you think this is possible? Does the vector U need to meet specific requirements in terms of dofs or stencil width?
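To make the pattern concrete, here is an untested sketch of what I am imagining, reusing U and appctx from my earlier code. It assumes rank 0 owns U[1] in the DMDA layout; since VecSetValue can target off-process entries, the assembly at the end moves each value to whichever rank owns that entry of partial:

Vec         partial;
PetscScalar *Ui;
PetscReal   ttime;
PetscMPIInt rank;
PetscInt    i;
ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,40,&partial);CHKERRQ(ierr);
for (i=0; i<40; i++) {
  ttime = i+1;
  /* collective: every rank requests the same time */
  ierr = TSTrajectoryGetVecs(appctx.tj,appctx.ts,PETSC_DECIDE,&ttime,U,NULL);CHKERRQ(ierr);
  if (rank == 0) { /* rank 0 reads the second component and queues it for insertion */
    ierr = DMDAVecGetArray(appctx.da,U,&Ui);CHKERRQ(ierr);
    ierr = VecSetValue(partial,i,Ui[1],INSERT_VALUES);CHKERRQ(ierr);
    ierr = DMDAVecRestoreArray(appctx.da,U,&Ui);CHKERRQ(ierr);
  }
}
/* assembly distributes the queued values to their owning ranks */
ierr = VecAssemblyBegin(partial);CHKERRQ(ierr);
ierr = VecAssemblyEnd(partial);CHKERRQ(ierr);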
Francesco
> On 28 May 2021, at 18:04, Barry Smith <bsmith at petsc.dev> wrote:
>
>
> What does "not working as I would like" mean? It should be retrieving the trajectory at the times 1.0, 2.0, 3.0 ... 40.0 and setting into the vector partial the values of the second component of Uloc (which, given a DMDA with a stencil width of 1 and a dof (w) of 1, is the first component of U).
>
> You can move the VecGetArray/VecRestoreArray calls on partial outside of the loop.
>
> If you want the first component of U on process 0 you don't need Uloc or the DMGlobalToLocalBegin/End; just use DMDAVecGetArray(appctx.da,U,&Ui);CHKERRQ(ierr);
>
>
> You only provide 14 locations in partial, distributed over the MPI ranks, but you likely want 40 on the first rank and none on the others.
>
> You are assigning partlocal[i] on all ranks, but you said you only want it on rank 0, so here is code that may work:
>
> if (rank == 0) {
>   ierr = VecCreateMPI(PETSC_COMM_WORLD,40,PETSC_DETERMINE,&partial);CHKERRQ(ierr); /* 40 local values */
>   ierr = VecGetArray(partial,&partlocal);CHKERRQ(ierr);
> } else {
>   ierr = VecCreateMPI(PETSC_COMM_WORLD,0,PETSC_DETERMINE,&partial);CHKERRQ(ierr); /* 0 local values */
> }
> for (i=0; i<40; i++) {
>   PetscReal ttime = i+1;
>   ierr = TSTrajectoryGetVecs(appctx.tj,appctx.ts,PETSC_DECIDE,&ttime,U,NULL);CHKERRQ(ierr);
>   if (rank == 0) {
>     ierr = DMDAVecGetArray(appctx.da,U,&Ui);CHKERRQ(ierr);
>     partlocal[i] = Ui[0];
>     ierr = DMDAVecRestoreArray(appctx.da,U,&Ui);CHKERRQ(ierr);
>   }
> }
> if (rank == 0) {
>   ierr = VecRestoreArray(partial,&partlocal);CHKERRQ(ierr);
> }
>
> Note that this entire block of code needs to run on all MPI ranks, but the actual selection of the wanted value only occurs on rank 0.
>
> When the loop is done, rank 0 will have a parallel vector whose components are what you want, and all the other ranks will have a parallel vector with no components on those ranks. Note that you don't need to make partial a parallel vector; you can make it live only on rank 0, because that is the only place you access it. Then the code would be simpler:
>
> if (rank == 0) {
>   ierr = VecCreateSeq(PETSC_COMM_SELF,40,&partial);CHKERRQ(ierr);
>   ierr = VecGetArray(partial,&partlocal);CHKERRQ(ierr);
> }
>
> for (i=0; i<40; i++) {
>   PetscReal ttime = i+1;
>   ierr = TSTrajectoryGetVecs(appctx.tj,appctx.ts,PETSC_DECIDE,&ttime,U,NULL);CHKERRQ(ierr);
>   if (rank == 0) {
>     ierr = DMDAVecGetArray(appctx.da,U,&Ui);CHKERRQ(ierr);
>     partlocal[i] = Ui[0];
>     ierr = DMDAVecRestoreArray(appctx.da,U,&Ui);CHKERRQ(ierr);
>   }
> }
> if (rank == 0) {
>   ierr = VecRestoreArray(partial,&partlocal);CHKERRQ(ierr);
> }
>
>
> Barry
>
>
>> On May 27, 2021, at 2:42 AM, Francesco Brarda <brardafrancesco at gmail.com> wrote:
>>
>> I created a for loop where I call TSTrajectoryGetVecs, but only rank 0 seems to enter it and I do not know why.
>> I thought the following might be a solution, but it is not working as I would like: the final vector's local parts are all identical, each a copy of the values obtained on rank 0. How should I change this, please?
>>
>> Vec U, partial, Uloc;
>> PetscScalar *Ui, *partlocal;
>> PetscInt i;
>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,14,&partial);CHKERRQ(ierr);
>> for (i=0; i<40; i++) {
>>   PetscReal ttime = i+1;
>>   ierr = TSTrajectoryGetVecs(appctx.tj,appctx.ts,PETSC_DECIDE,&ttime,U,NULL);CHKERRQ(ierr);
>>   ierr = DMGetLocalVector(appctx.da,&Uloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalBegin(appctx.da,U,INSERT_VALUES,Uloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalEnd(appctx.da,U,INSERT_VALUES,Uloc);CHKERRQ(ierr);
>>   ierr = DMDAVecGetArray(appctx.da,Uloc,&Ui);CHKERRQ(ierr);
>>   ierr = VecGetArray(partial,&partlocal);CHKERRQ(ierr);
>>   partlocal[i] = Ui[1];
>>   ierr = DMDAVecRestoreArray(appctx.da,Uloc,&Ui);CHKERRQ(ierr);
>>   ierr = VecRestoreArray(partial,&partlocal);CHKERRQ(ierr);
>>   ierr = DMRestoreLocalVector(appctx.da,&Uloc);CHKERRQ(ierr);
>> }
>>
>>
>>> On 27 May 2021, at 01:15, Barry Smith <bsmith at petsc.dev> wrote:
>>>
>>>
>>>
>>>> On May 26, 2021, at 10:39 AM, Francesco Brarda <brardafrancesco at gmail.com> wrote:
>>>>
>>>> Thank you very much.
>>>>> Based on the error message it appears that your code is requesting different times on different MPI ranks. Is that what you intend to do?
>>>> Yes. I want to save different times across a vector built with multiple MPI ranks (PETSC_DECIDE for the local length).
>>>> The function is called only by the first process (rank 0) and not by the others. Is there a way to force the other ranks to call that routine as well?
>>>
>>> Yes, just have all ranks call it and ignore the result on the other ranks.
>>>
>>>> Should I move everything into a separate function outside main?
>>>
>>> It can be called in main; it does not need to be in a separate function.
>>>
>>>>
>>>> Francesco
>>>>
>>>>
>>>>> On 26 May 2021, at 16:20, Barry Smith <bsmith at petsc.dev> wrote:
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> TSTrajectoryGetVecs() is listed as Collective on TS. This means that all ranks sharing the TS must call it with the same times, in the same order.
>>>>>
>>>>> You do not need to use VecScatter. Each process must call TSTrajectoryGetVecs with the same time, but you can have only the rank you care about select the entries it needs from the resulting vector; the other ranks simply ignore the result for that time, since they do not need any values from it.
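>>>>>
>>>>> In code the pattern would look something like this (just a sketch; tj, ts, da, and rank stand in for whatever your program already has, and it assumes rank 0 owns entry 1 of U):
>>>>>
>>>>> PetscReal   t = 2.0;
>>>>> PetscScalar *u;
>>>>> /* collective: all ranks request the same time, in the same order */
>>>>> ierr = TSTrajectoryGetVecs(tj,ts,PETSC_DECIDE,&t,U,NULL);CHKERRQ(ierr);
>>>>> if (rank == 0) {
>>>>>   /* only the rank that wants the values reads its local part of U */
>>>>>   ierr = DMDAVecGetArray(da,U,&u);CHKERRQ(ierr);
>>>>>   ierr = PetscPrintf(PETSC_COMM_SELF,"u[1] at t=%g is %g\n",(double)t,(double)PetscRealPart(u[1]));CHKERRQ(ierr);
>>>>>   ierr = DMDAVecRestoreArray(da,U,&u);CHKERRQ(ierr);
>>>>> }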
>>>>>
>>>>> Barry
>>>>>
>>>>>
>>>>>
>>>>>> On May 26, 2021, at 5:20 AM, Francesco Brarda <brardafrancesco at gmail.com> wrote:
>>>>>>
>>>>>> Hi!
>>>>>>
>>>>>> I solved an ODE system with TS. Now I would like to save one of the trajectories at specific times. To do so, I used TSTrajectoryGetVecs.
>>>>>> The values of the variable I am interested in are on one processor. I want to collect these values in a parallel vector, but I got the error:
>>>>>>
>>>>>> [0]PETSC ERROR: Invalid argument
>>>>>> [0]PETSC ERROR: Real value must be same on all processes, argument # 2
>>>>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>>> [0]PETSC ERROR: Petsc Release Version 3.14.4, unknown
>>>>>> [0]PETSC ERROR: ./petsc_sir on a arch-debug named srvulx13 by fbrarda Wed May 26 12:00:42 2021
>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-openblas-dir=/opt/packages/openblas/0.2.13-gcc --download-mpich PETSC_ARCH=arch-debug
>>>>>> [0]PETSC ERROR: #1 TSHistoryGetLocFromTime() line 134 in /home/fbrarda/petsc/src/ts/interface/tshistory.c
>>>>>> [0]PETSC ERROR: #2 TSTrajectoryReconstruct_Private() line 55 in /home/fbrarda/petsc/src/ts/trajectory/utils/reconstruct.c
>>>>>> [0]PETSC ERROR: #3 TSTrajectoryGetVecs() line 239 in /home/fbrarda/petsc/src/ts/trajectory/interface/traj.c
>>>>>>
>>>>>> Is there any specific routine I can use to overcome this issue? Should I use VecScatter?
>>>>>>
>>>>>> I hope I made myself clear.
>>>>>> Best,
>>>>>> Francesco
>>>>>
>>>>
>>>
>>
>