[petsc-users] Union of sequential vecs

Karthikeyan Chockalingam - STFC UKRI karthikeyan.chockalingam at stfc.ac.uk
Mon Dec 12 09:00:43 CST 2022


Thank you Matt and Blaise.

I will try out IS (though I have not used it before).

(i)  Can an IS be of a different size on each processor, and can I still call ISAllGather?

(ii)  Can an IS be passed as the row indices to MatZeroRowsColumns? (A rough sketch of what I mean is below.)
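
To make these concrete, something like the following is what I have in mind (just a rough, untested sketch: nloc, localNodes and A are placeholders for my own data, and I am assuming MatZeroRowsColumns is happy with every rank listing the full set of rows):

IS isLocal, isAll;

//each rank builds an IS from its own boundary-node list; nloc can differ per rank
ISCreateGeneral(PETSC_COMM_WORLD, nloc, localNodes, PETSC_COPY_VALUES, &isLocal);

//(i) concatenate the differently sized local ISs; every rank gets the same result
ISAllGather(isLocal, &isAll);

//sort the gathered indices and drop the duplicates
ISSortRemoveDups(isAll);

//(ii) pass the IS directly as the rows/columns to zero
MatZeroRowsColumnsIS(A, isAll, 1.0, NULL, NULL);

ISDestroy(&isLocal);
ISDestroy(&isAll);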

I will look into DMLabel and start a different thread if needed.

Best,
Karthik.



From: Matthew Knepley <knepley at gmail.com>
Date: Saturday, 10 December 2022 at 02:20
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: Mark Adams <mfadams at lbl.gov>, petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Union of sequential vecs
On Fri, Dec 9, 2022 at 6:50 PM Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:
Thank you Mark and Barry.

@Mark Adams I follow you for the most part. Shouldn't R be an MPI Vector?

Here it goes:

Add 1.0 to Q[Vec[i]] for all "i" in my local (sequential) "Vec".
Compute the number of local nonzeros in Q, call it n.

//Create an MPI Vector R with local size n

VecCreate(PETSC_COMM_WORLD, &R);
VecSetSizes(R, n, PETSC_DECIDE); //set the sizes before the type, following the usual PETSc order
VecSetType(R, VECMPI);

//Populate the MPI Vector R with the local (sequential) "Vec"

VecGetOwnershipRange(R, &istart, &iend);
local_size = iend - istart; //The local_size should be 'n', right?
VecGetArray(R, &values);
for (i = 0; i < local_size; i++) {
  values[i] = Vec[i]; //Vec[i] stands for the i-th entry of my local (sequential) vector
}
VecRestoreArray(R, &values);


//Scatter R to all processors


Vec            V_SEQ;
VecScatter     ctx;

VecScatterCreateToAll(R,&ctx,&V_SEQ);

//Remove duplicates in V_SEQ
How can I use PetscSortRemoveDupsInt to remove duplicates in V_SEQ?
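
Roughly, I was imagining something like this after the scatter (an untested sketch; I am assuming the node indices survive the round trip through PetscScalar and can be cast back with PetscRealPart):

PetscInt           N, i, *idx;
const PetscScalar *a;

//actually move the data into V_SEQ
VecScatterBegin(ctx, R, V_SEQ, INSERT_VALUES, SCATTER_FORWARD);
VecScatterEnd(ctx, R, V_SEQ, INSERT_VALUES, SCATTER_FORWARD);

//copy the gathered values into an integer array
VecGetSize(V_SEQ, &N);
VecGetArrayRead(V_SEQ, &a);
PetscMalloc1(N, &idx);
for (i = 0; i < N; i++) idx[i] = (PetscInt)PetscRealPart(a[i]);
VecRestoreArrayRead(V_SEQ, &a);

//sort and remove duplicates; N is updated to the number of unique indices
PetscSortRemoveDupsInt(&N, idx);
//(and PetscFree(idx) once the unique indices have been used)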


Physics behind:
I am reading a parallel mesh and want to mark all the boundary nodes. I use a local (sequential) Vec to store the boundary nodes of each parallel partition. Hence, the local Vecs can end up with duplicate node indices among them, which I would like to get rid of when I combine them all together.

1) Blaise is right: you should use an IS, not a Vec, to hold node indices. His solution is only a few lines, so I would use it.

2) I would not recommend doing things this way in the first place. PETSc can manage parallel meshes scalably, marking boundaries using DMLabel objects.
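
For a DMPlex mesh, that might look something like this (a rough sketch, assuming the mesh is a DMPlex called dm):

DMLabel label;
IS      bdPoints;

DMCreateLabel(dm, "boundary");            //attach a label to the mesh
DMGetLabel(dm, "boundary", &label);
DMPlexMarkBoundaryFaces(dm, 1, label);    //mark every exterior face with value 1
DMPlexLabelComplete(dm, label);           //propagate the marking down to edges and vertices
DMLabelGetStratumIS(label, 1, &bdPoints); //the marked boundary points as an IS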

  Thanks,

     Matt

Best,
Karthik.





From: Mark Adams <mfadams at lbl.gov>
Date: Friday, 9 December 2022 at 21:08
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: Barry Smith <bsmith at petsc.dev>, petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Union of sequential vecs
If your space is pretty compact, e.g. (0,12), you could create an MPI vector Q of size 13, say, and each processor can add 1.0 to Q[Vec[i]] for all "i" in its local "Vec".
Then each processor can count the number of local nonzeros in Q, call it n, create a new vector R with local size n, and set R[i] = the global index of the i-th local nonzero of Q, for i = 0:n.
Do some sort of vec-scatter-to-all with R to get what you want.
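
Roughly like this (untested, and nlocal/myNodes are placeholders for your local boundary-node list):

Vec         Q;
PetscInt    i;
PetscScalar one = 1.0;

//tally occurrences over the compact index range 0..12
VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 13, &Q);
VecSet(Q, 0.0);
for (i = 0; i < nlocal; i++) VecSetValues(Q, 1, &myNodes[i], &one, ADD_VALUES);
VecAssemblyBegin(Q);
VecAssemblyEnd(Q);
//each rank then scans its owned part of Q, counts the nonzeros (n), and the
//global indices of those nonzeros become the local entries of R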

Does that work?

Mark


On Fri, Dec 9, 2022 at 3:25 PM Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:
That is where I am stuck: I don't know how to combine them to get Vec = {2,5,7,8,10,11,12}.
I just want them in an MPI vector.

I finally plan to call VecScatterCreateToAll so that every processor gets a copy.

Thank you.

Kind regards,
Karthik.

From: Barry Smith <bsmith at petsc.dev>
Date: Friday, 9 December 2022 at 20:04
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Union of sequential vecs

  How are you combining them to get Vec = {2,5,7,8,10,11,12}?

  Do you want the values to remain on the same MPI rank as before, just in an MPI vector?



On Dec 9, 2022, at 2:28 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:

Hi,

I want to take the union of a set of sequential vectors, each living on a different processor.

Say,
Vec_Seq1 = {2,5,7}
Vec_Seq2 = {5,8,10,11}
Vec_Seq3 = {5,2,12}.

Finally, I want the union of them all: Vec = {2,5,7,8,10,11,12}.

I initially wanted to create a parallel vector and insert the (sequential vector) values, but I do not know at which indices to insert the values. I do, however, know the total size of Vec (which in this case is 7).

Any help is much appreciated.

Kind regards,
Karthik.







--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

