[petsc-users] Union of sequential vecs

Mark Adams mfadams at lbl.gov
Fri Dec 9 18:38:00 CST 2022


On Fri, Dec 9, 2022 at 6:50 PM Karthikeyan Chockalingam - STFC UKRI <
karthikeyan.chockalingam at stfc.ac.uk> wrote:

> Thank you Mark and Barry.
>
>
>
> @Mark Adams <mfadams at lbl.gov> I follow you for the most part. Shouldn’t R
> be an MPI Vector?
>

yes, "R, with local size n" implies an MPI vector.


>
>
> Here it goes:
>
>
>
> Q[Vec[i]] for all "i" in my local (sequential) "Vec".
>

OK, so you are calling VecSetValue(Q, ..., ADD_VALUES
<https://petsc.org/main/docs/manualpages/Sys/ADD_VALUES/>).
VecAssemblyBegin/End needs to be called afterwards, and that will do the communication.
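
For example, a minimal sketch of that marking step (here "bnd" and "nbnd" are
just placeholder names for your local array of boundary node indices and its
length):

  for (PetscInt i = 0; i < nbnd; i++) {
    VecSetValue(Q, bnd[i], 1.0, ADD_VALUES); /* mark global index bnd[i] in Q */
  }
  VecAssemblyBegin(Q); /* this pair does the off-process communication */
  VecAssemblyEnd(Q);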


> Compute number of local nonzeros in Q, call it n.
>

You would use VecGetArray, as in the loop below, to do this count.

You could cache the indices of the nonzero entries of Q, or simply redo the loop below.
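
For concreteness, a short sketch of that first (counting) pass, assuming the
result goes into a PetscInt "n":

  PetscScalar *values_Q;
  PetscInt     local_size_Q, n = 0;

  VecGetLocalSize(Q, &local_size_Q);
  VecGetArray(Q, &values_Q);
  for (PetscInt i = 0; i < local_size_Q; i++) {
    if (values_Q[i] != 0.0) n++; /* count the local nonzeros of Q */
  }
  VecRestoreArray(Q, &values_Q);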


>
> //Create a MPI Vector R, with local size n
>
>
>
> VecCreate(PETSC_COMM_WORLD, &R);
>
> VecSetType(R, VECMPI);
>
> VecSetSizes(R, n, PETSC_DECIDE);
>
>
>
> //Populate MPI Vector R with local (sequential) "Vec".
>
>
>
> VecGetOwnershipRange(R, &istart, &iend);
>
> local_size = iend - istart; //The local_size should be ‘*n*’ right?
>

Well yes, but you are going to need to get the "istart" from Q, not R, to
get the global index in a loop.


> VecGetArray(R, &values);
>

No, you want to get the values from Q and use VecSetValue(R, ...):

VecGetArray(Q, &values_Q);

VecGetOwnershipRange(R, &istart_R, NULL);
idx = istart_R;

>   for (i = 0; i < local_size_Q; i++) {
>

No, redo the loop above that you used to count. You left the body blank, so I
will put it here.
(It is simpler to just run this loop twice: first to count, then to set.)

if (values_Q[i] != 0) {
   VecSetValue(R, idx++, (PetscScalar)(istart_Q + i), INSERT_VALUES
<https://petsc.org/main/docs/manualpages/Sys/INSERT_VALUES/>);
}

>     values[i] = Vec[i];
>


>   }
>
> VecRestoreArray(R, &values);
>
VecRestoreArray(Q, &values_Q);

You don't really need a VecAssemblyBegin/End(R) here, but you can add it
to be clear.

I'm not sure this is correct, so you will need to debug it; the scatter can
be figured out later.
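
Pulling the fragments above together, an untested sketch of the create-and-fill
step (after "n" has been counted as above) might look like:

  Vec          R;
  PetscScalar *values_Q;
  PetscInt     local_size_Q, istart_Q, istart_R, idx;

  /* VecCreateMPI is shorthand for VecCreate + VecSetSizes + VecSetType(VECMPI) */
  VecCreateMPI(PETSC_COMM_WORLD, n, PETSC_DECIDE, &R);

  VecGetOwnershipRange(Q, &istart_Q, NULL);
  VecGetLocalSize(Q, &local_size_Q);
  VecGetOwnershipRange(R, &istart_R, NULL);
  idx = istart_R;

  VecGetArray(Q, &values_Q);
  for (PetscInt i = 0; i < local_size_Q; i++) {
    /* second pass: store the global index of each nonzero of Q in R */
    if (values_Q[i] != 0.0) VecSetValue(R, idx++, (PetscScalar)(istart_Q + i), INSERT_VALUES);
  }
  VecRestoreArray(Q, &values_Q);

  VecAssemblyBegin(R); /* all inserts are local here, so this is mostly for clarity */
  VecAssemblyEnd(R);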


>
> //Scatter R to all processors
>
>
>
> Vec            V_SEQ;
>
> VecScatter     ctx;
>
>
>
> VecScatterCreateToAll(R,&ctx,&V_SEQ);
>
>
>

I'm not sure how best to do this. Look in the docs or ask in a separate
thread; this thread is busy figuring out the first part.
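
That said, the usual pattern with VecScatterCreateToAll is, I believe, roughly:

  Vec        V_SEQ;
  VecScatter ctx;

  VecScatterCreateToAll(R, &ctx, &V_SEQ);
  VecScatterBegin(ctx, R, V_SEQ, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(ctx, R, V_SEQ, INSERT_VALUES, SCATTER_FORWARD);
  /* V_SEQ now holds a full copy of R on every rank */
  VecScatterDestroy(&ctx);
  VecDestroy(&V_SEQ);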



> //Remove duplicates in V_SEQ
>
> How can I use PetscSortRemoveDupsInt to remove duplicates in V_SEQ?
>
>
>
>
>
> Physics behind:
>
> I am reading a parallel mesh, and want to mark all the boundary nodes. I
> use a local (sequential) Vec to store the boundary nodes for each parallel
> partition. Hence, local Vecs can end up with duplicate node indices among
> them, which I would like to get rid of when I combine all of them together.
>

Hmm, OK, I'm not sure I follow this exactly, but yes, this is intended to give
each process a global list of (boundary) vertices.
Not super scalable, but if it gets you started then that's great.
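
On the PetscSortRemoveDupsInt question: it works on PetscInt arrays, so one
possible (untested) pattern is to copy the entries of V_SEQ into an integer
array first:

  const PetscScalar *a;
  PetscInt          *idx, len;

  VecGetSize(V_SEQ, &len);
  PetscMalloc1(len, &idx);
  VecGetArrayRead(V_SEQ, &a);
  for (PetscInt i = 0; i < len; i++) idx[i] = (PetscInt)PetscRealPart(a[i]);
  VecRestoreArrayRead(V_SEQ, &a);

  PetscSortRemoveDupsInt(&len, idx); /* sorts idx and shrinks len to the unique count */
  /* use idx[0..len-1], then PetscFree(idx) */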

Good luck,
Mark


>
>
> Best,
>
> Karthik.
>
>
>
>
>
>
>
>
>
> *From: *Mark Adams <mfadams at lbl.gov>
> *Date: *Friday, 9 December 2022 at 21:08
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *Barry Smith <bsmith at petsc.dev>, petsc-users at mcs.anl.gov <
> petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Union of sequential vecs
>
> If your space is pretty compact, eg, (0,12), you could create an MPI
> vector Q of size 13, say, and each processor can add 1.0 to Q[Vec[i]], for
> all "i" in my local "Vec".
>
> Then each processor can count the number of local nonzeros in Q, call it
> n, create a new vector, R, with local size n, then set R[i] = global index
> of the nonzero for each nonzero in Q, i=0:n.
>
> Do some sort of vec-scatter-to-all with R to get what you want.
>
>
>
> Does that work?
>
>
>
> Mark
>
>
>
>
>
> On Fri, Dec 9, 2022 at 3:25 PM Karthikeyan Chockalingam - STFC UKRI via
> petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> That is where I am stuck, *I don’t know* how to combine them to get Vec =
> {2,5,7,8,10,11,12}.
>
> I just want them in an MPI vector.
>
>
>
> I finally plan to call VecScatterCreateToAll so that every processor gets a
> copy.
>
>
>
> Thank you.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Friday, 9 December 2022 at 20:04
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Union of sequential vecs
>
>
>
>   How are you combining them to get Vec = {2,5,7,8,10,11,12}?
>
>
>
>   Do you want the values to remain on the same MPI rank as before, just in
> an MPI vector?
>
>
>
>
>
>
>
> On Dec 9, 2022, at 2:28 PM, Karthikeyan Chockalingam - STFC UKRI via
> petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>
>
> Hi,
>
>
>
> I want to take the union of a set of sequential vectors, each living in a
> different processor.
>
>
>
> Say,
>
> Vec_Seq1 = {2,5,7}
>
> Vec_Seq2 = {5,8,10,11}
>
> Vec_Seq3 = {5,2,12}.
>
>
>
> Finally, get the union of all them Vec = {2,5,7,8,10,11,12}.
>
>
>
> I initially wanted to create a parallel vector and insert the (sequential
> vector) values but I do not know, to which index to insert the values to.
> But I do know the total size of Vec (which in this case is 7).
>
>
>
> Any help is much appreciated.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
>
>
>
>
>
>