[petsc-users] non-contiguous parallel block of the coefficient matrix and AO functions
M. Scot Breitenfeld
brtnfld at uiuc.edu
Mon Oct 4 16:52:03 CDT 2010
I'm a little unclear on how to use the AO functions when the global nodal
numbering results in a non-contiguous parallel block of the coefficient
matrix.
For example, suppose I have a 1D problem (1 dof per node, 2 processors),
where in this case the node number equals the row position in the vector:
0 3 4 7 2 5 6 1 (Apps)
o o o o || o o o o
0 1 2 3 4 5 6 7 (Petsc)
First I call,
CALL AOCreateBasic(PETSC_COMM_WORLD, n, mappings, PETSC_NULL_INTEGER,
ao, ierr)
where
n = 4, mappings = P0:{0,3,4,7}, P1:{2,5,6,1}
The PETSc ordering will be contiguous, P0:{0,1,2,3}, P1:{4,5,6,7}, so I
passed PETSC_NULL_INTEGER for it.
CALL AOApplicationToPetsc(ao,n,mappings, ierr)
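As I understand it, AOApplicationToPetsc overwrites the index array in
place with the corresponding PETSc indices. A minimal plain-Python sketch
of that permutation (a stand-in for the AO object, not PETSc itself; the
orderings are taken from the diagram above):

```python
# Application ordering from the example: position p in the contiguous
# PETSc ordering holds application index app[p].
app = [0, 3, 4, 7, 2, 5, 6, 1]  # P0 owns the first 4, P1 the last 4

# Build the application -> PETSc map that the AO would hold.
app_to_petsc = {a: p for p, a in enumerate(app)}

def ao_application_to_petsc(indices):
    """Mimic AOApplicationToPetsc: translate the index array in place."""
    for i, idx in enumerate(indices):
        indices[i] = app_to_petsc[idx]
    return indices

# Processor 0's local application indices:
local = [0, 3, 4, 7]
print(ao_application_to_petsc(local))  # -> [0, 1, 2, 3]
```

So after the call, the `mappings` array on P0 would hold {0,1,2,3} rather
than {0,3,4,7}.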
Now if I want to add a value at global row 7 (application numbering) on
processor 0, do I use the application number 7 or the PETSc number 3?
If it's the global application id:
CALL VecSetValues(b, 1, 7, value, ADD_VALUES, ierr)
Does PETSc know that, if I specify row 7, it should go into row 3 of the
PETSc numbering? Or is the numbering always PETSc's numbering?
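If Vec/Mat routines do always expect PETSc indices, the pattern I would
expect is to translate the application row through the AO first and then
call VecSetValues with the result. A plain-Python stand-in for that
sequence (the list `b` is a hypothetical stand-in for the Vec):

```python
# Orderings from the example above.
app = [0, 3, 4, 7, 2, 5, 6, 1]
app_to_petsc = {a: p for p, a in enumerate(app)}

b = [0.0] * 8                      # stand-in for the PETSc Vec b
app_row, value = 7, 2.5            # value is an arbitrary example number

petsc_row = app_to_petsc[app_row]  # AOApplicationToPetsc on one index
b[petsc_row] += value              # VecSetValues(..., ADD_VALUES, ...)
print(petsc_row, b[petsc_row])     # -> 3 2.5
```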
If I then solve the system
CALL KSPSolve(ksp,b,b,ierr)
and I want to get the values back
CALL VecGetValues(b,1,row, value, ierr)
is 'row' the application row (7) or the PETSc row (3)?
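Assuming the same rule holds on the way out, reading application row 7
back would mean asking VecGetValues for PETSc row 3 after translating
through the AO again. A plain-Python stand-in (the list `x` and its
values are made up for illustration):

```python
# Same orderings as in the example above.
app = [0, 3, 4, 7, 2, 5, 6, 1]
app_to_petsc = {a: p for p, a in enumerate(app)}

x = [10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0]  # pretend solution Vec

app_row = 7
petsc_row = app_to_petsc[app_row]  # AOApplicationToPetsc: 7 -> 3
value = x[petsc_row]               # VecGetValues on PETSc row 3
print(value)                       # -> 13.0
```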
Thanks,
Scot