[petsc-users] MatSetValues vs MatSetValuesBlocked

Nidish nb25 at rice.edu
Mon Aug 10 17:10:31 CDT 2020


It's a 1D model with displacements and rotations as DoFs at each node.

I couldn't find much in the manual on MatSetBlockSize - could you 
provide some more information on its use?
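
From the man page my best guess at the usage (assuming block size 2 for my
2 DoFs per node - this is just my guess, not something I've verified) is:

   MatSetBlockSize(jac, 2);  /* 2 DoFs (displacement, rotation) per node */

called before preallocation/assembly, if I understand correctly.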

I thought that since I've set up the system using DMDACreate1d (with 2 
DoFs per node and a stencil width of 1), the matrix object should already 
have its nonzero elements preallocated. Here's the call to DMDACreate1d:

   DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, N, 2, 1, NULL, &mshdm);
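
If it helps, I figure I can also query what block size the matrix created
from this DM actually reports - a minimal sketch of the check I have in
mind (my guess at the relevant calls):

   Mat A;
   PetscInt bs;
   DMCreateMatrix(mshdm, &A);   /* matrix preallocated from the DMDA */
   MatGetBlockSize(A, &bs);     /* does dof = 2 give bs == 2 here? */
   PetscPrintf(PETSC_COMM_WORLD, "bs = %D\n", bs);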

Thank you,
Nidish

On 8/10/20 4:55 PM, Jed Brown wrote:
> It looks like you haven't called MatSetBlockSize.  Is this a 1D model with scalar displacement, thus block size of 1?
>
> Nidish <nb25 at rice.edu> writes:
>
>> Hello,
>>
>> I've written a basic FE code for an Euler-Bernoulli beam (4th-order
>> spatial derivative, interpolated using Hermite elements).
>>
>> While assembling matrices I noticed something peculiar - assembly using
>> MatSetValues works, while assembly using MatSetValuesBlocked throws an
>> argument-out-of-range error. You can observe this in the attached file
>> (lines 94-95), where I have one version commented out and the other
>> active.
>>
>> I.O.W., using
>>
>>       MatSetValues(jac, 4, (const PetscInt*)idx, 4, (const PetscInt*)idx,
>>                    (const PetscScalar*)elstiff, ADD_VALUES);
>>
>> works, while
>>
>>       MatSetValuesBlocked(jac, 4, (const PetscInt*)idx, 4, (const PetscInt*)idx,
>>                           (const PetscScalar*)elstiff, ADD_VALUES);
>>
>> does NOT work.
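>>
>> (My naive guess was that, with a block size of 2 set on the matrix, the
>> blocked variant would want node/block indices rather than scalar ones,
>> i.e. something like
>>
>>       PetscInt bidx[2] = {e, e+1};
>>       MatSetValuesBlocked(jac, 2, bidx, 2, bidx,
>>                           (const PetscScalar*)elstiff, ADD_VALUES);
>>
>> but I may well be misreading the docs.)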
>>
>> Also, I could not really work out from the documentation what the
>> difference between these two is.
>>
>> Thank you,
>> Nidish
>> static char help[] = "1D Euler-Bernoulli Beam.\n";
>>
>> #include <petscdm.h>
>> #include <petscdmda.h>
>> #include <petscvec.h>
>> #include <petscksp.h>
>> #include <petscmat.h>
>>
>> PetscErrorCode RHSFun(KSP, Vec, void*);
>> PetscErrorCode StiffMat(KSP, Mat, Mat, void*);
>>
>> typedef struct {
>>    /* area, Young's modulus, second moment of area, density, length, end load */
>>    PetscScalar A, E, I2, rho, L, F;
>> } Mater;
>>
>> int main(int nargs, char *sargs[])
>> {
>>    PetscErrorCode ierr;
>>    PetscInt N=10;
>>    PetscMPIInt rank, size;
>>    ierr = PetscInitialize(&nargs, &sargs, (char*)0, help); if (ierr) return ierr;
>>    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>>    MPI_Comm_size(PETSC_COMM_WORLD, &size);
>>
>>    PetscOptionsGetInt(NULL, NULL, "-N", &N, NULL);
>>
>>    DM mshdm;
>>    Mater steelbm = {1.0, 2e11, 1.0/12.0, 7800, 3.0, 1e6};
>>    KSP ksp;
>>    
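>>    /* 1D DA: N nodes, dof = 2 (displacement, rotation) per node, stencil width 1 */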
>>    DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, N, 2, 1, NULL, &mshdm);
>>    DMSetFromOptions(mshdm);
>>    DMSetUp(mshdm);
>>    DMSetApplicationContext(mshdm, &steelbm);  /* User Context */
>>
>>    KSPCreate(PETSC_COMM_WORLD, &ksp);
>>    KSPSetDM(ksp, mshdm);
>>    KSPSetComputeRHS(ksp, RHSFun, &steelbm);
>>    KSPSetComputeOperators(ksp, StiffMat, &steelbm);
>>    KSPSetFromOptions(ksp);
>>    
>>    KSPSolve(ksp, NULL, NULL);
>>
>>    DMDestroy(&mshdm);
>>    KSPDestroy(&ksp);
>>    PetscFinalize();
>>    return 0;
>> }
>>
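>> /* RHS: zero load everywhere except a point force F on the displacement
>>    DoF of the last node (N here is the global vector size, 2 x #nodes) */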
>> PetscErrorCode RHSFun(KSP ksp, Vec b, void* ctx)
>> {
>>    Mater* steelbm = (Mater*)ctx;
>>    PetscInt N;
>>    PetscErrorCode ierr;
>>
>>    PetscFunctionBeginUser;
>>    VecGetSize(b, &N);
>>    for (int i=0; i<N; ++i)
>>      VecSetValue(b, i, 0.0, INSERT_VALUES);
>>    VecSetValue(b, N-2, steelbm->F, INSERT_VALUES);
>>    VecAssemblyBegin(b);
>>    ierr = VecAssemblyEnd(b);CHKERRQ(ierr);
>>    
>>    PetscFunctionReturn(0);
>> }
>>
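>> /* Assemble the global stiffness from 4x4 Hermite beam elements and
>>    apply clamped boundary conditions at the first node */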
>> PetscErrorCode StiffMat(KSP ksp, Mat J, Mat jac, void* ctx)
>> {
>>    Mater* steelbm = (Mater*)ctx;
>>    PetscInt N, ix, n;
>>    PetscScalar Le, EIbLec;
>>    PetscErrorCode ierr;
>>    DM mshdm;
>>
>>    PetscFunctionBeginUser;
>>    ierr = KSPGetDM(ksp, &mshdm);CHKERRQ(ierr);
>>    ierr = DMDAGetInfo(mshdm, NULL, &N, NULL, NULL, &n, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
>>    DMDAGetCorners(mshdm, &ix, NULL, NULL, &n, NULL, NULL);
>>
>>    Le = steelbm->L/(N-1);
>>    EIbLec = steelbm->E*steelbm->I2/pow(Le, 3);
>>    PetscInt idx[4];
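>>    /* standard 4x4 Hermite (cubic) beam element stiffness, scaled by E*I/Le^3 */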
>>    PetscScalar elstiff[4][4] = {{ EIbLec*12.0,  EIbLec*6*Le,    -EIbLec*12.0,  EIbLec*6*Le  },
>>                                 { EIbLec*6*Le,  EIbLec*4*Le*Le, -EIbLec*6*Le,  EIbLec*2*Le*Le},
>>                                 {-EIbLec*12.0, -EIbLec*6*Le,     EIbLec*12.0, -EIbLec*6*Le  },
>>                                 { EIbLec*6*Le,  EIbLec*2*Le*Le, -EIbLec*6*Le,  EIbLec*4*Le*Le}};
>>
>>    PetscInt is, ie;
>>    is = ix;  ie = ix+n;
>>    PetscMPIInt rank;
>>    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>>    
>>    for (int e=is; e<((ie<N-1)?ie:(N-1)); ++e) {
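>>      /* scalar (unblocked) DoF indices: (w, theta) at node e, then node e+1 */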
>>      idx[0] = 2*e; idx[1] = 2*e+1; idx[2] = 2*e+2; idx[3] = 2*e+3;
>>
>>      /* MatSetValuesBlocked(jac, 4, (const PetscInt*)idx, 4, (const PetscInt*)idx, */
>>      /* 			(const PetscScalar*)elstiff, ADD_VALUES); */
>>      MatSetValues(jac, 4, (const PetscInt*)idx, 4, (const PetscInt*)idx,
>>      		 (const PetscScalar*)elstiff, ADD_VALUES);
>>    }
>>    MatAssemblyBegin(jac, MAT_FINAL_ASSEMBLY);
>>    ierr = MatAssemblyEnd(jac, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>>
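>>    /* clamp node 0 (w and theta): zero rows/columns 0 and 1, keeping a
>>       nonzero diagonal entry so the system remains nonsingular */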
>>    PetscInt rows[] = {0, 1};
>>    MatZeroRowsColumns(jac, 2, rows, EIbLec*12.0, NULL, NULL);
>>
>>    PetscFunctionReturn(0);
>> }
-- 
Nidish

