MatMatMult_MPIDense_MPIDense() works currently?

David Fuentes fuentesdt at gmail.com
Sun Apr 5 16:21:45 CDT 2009



I've been preallocating the full matrix:

   /* preallocate every row as fully dense: n columns in the diagonal block, N-n off-diagonal */
   const PetscReal zerotol = 1.e-6;
   PetscInt M,N,m,n;
   ierr = MatGetSize(MatDense,&M,&N);CHKERRQ(ierr);
   ierr = MatGetLocalSize(MatDense,&m,&n);CHKERRQ(ierr);
   ierr = MatCreateMPIAIJ(PETSC_COMM_WORLD,m,n,M,N,n,PETSC_NULL,
                          N-n,PETSC_NULL,&SparseMat);CHKERRQ(ierr);

and when I convert, I try to insert only the nonzero entries:

   const PetscScalar  *vwork;
   const PetscInt     *cwork;
   PetscInt Istart,Iend,nz;

   ierr = PetscPrintf(PETSC_COMM_WORLD,
                      "  Converting...\n");CHKERRQ(ierr);
   ierr = MatGetOwnershipRange(MatDense,&Istart,&Iend);CHKERRQ(ierr);
   for (PetscInt Ii=Istart; Ii<Iend; Ii++) {
     ierr = MatGetRow(MatDense,Ii,&nz,&cwork,&vwork);CHKERRQ(ierr);
     for (PetscInt Jj=0; Jj<nz; Jj++) {
       // alternative: insert the whole row at once
       //ierr = MatSetValues(SparseMat,1,&Ii,nz,cwork,vwork,INSERT_VALUES);CHKERRQ(ierr);
       if (std::abs(vwork[Jj]) > zerotol) {
         ierr = MatSetValue(SparseMat,Ii,cwork[Jj],vwork[Jj],INSERT_VALUES);CHKERRQ(ierr);
       }
     }
     ierr = MatRestoreRow(MatDense,Ii,&nz,&cwork,&vwork);CHKERRQ(ierr);
   }

   ierr = PetscPrintf(PETSC_COMM_WORLD,
                      "  Assembling...\n");CHKERRQ(ierr);
   ierr = MatAssemblyBegin(SparseMat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
   ierr = MatAssemblyEnd(SparseMat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
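
Something I haven't tried yet, but will sketch here anyway (untested, and it reuses the same
variables as the loop above), is a first pass that counts the entries above zerotol and then
preallocates exactly, instead of preallocating every row as fully dense. The local column range
of the diagonal block is recovered with an MPI_Scan over n rather than any extra helper:

   PetscInt *d_nnz,*o_nnz,cstart,cend;
   ierr = PetscMalloc(m*sizeof(PetscInt),&d_nnz);CHKERRQ(ierr);
   ierr = PetscMalloc(m*sizeof(PetscInt),&o_nnz);CHKERRQ(ierr);
   /* columns [cstart,cend) form the diagonal block on this process */
   ierr = MPI_Scan(&n,&cend,1,MPIU_INT,MPI_SUM,PETSC_COMM_WORLD);CHKERRQ(ierr);
   cstart = cend - n;
   ierr = MatGetOwnershipRange(MatDense,&Istart,&Iend);CHKERRQ(ierr);
   for (PetscInt Ii=Istart; Ii<Iend; Ii++) {
     d_nnz[Ii-Istart] = 0; o_nnz[Ii-Istart] = 0;
     ierr = MatGetRow(MatDense,Ii,&nz,&cwork,&vwork);CHKERRQ(ierr);
     for (PetscInt Jj=0; Jj<nz; Jj++) {
       if (std::abs(vwork[Jj]) > zerotol) {
         if (cwork[Jj] >= cstart && cwork[Jj] < cend) d_nnz[Ii-Istart]++;
         else                                         o_nnz[Ii-Istart]++;
       }
     }
     ierr = MatRestoreRow(MatDense,Ii,&nz,&cwork,&vwork);CHKERRQ(ierr);
   }
   /* the scalar d_nz/o_nz arguments are ignored once the arrays are supplied */
   ierr = MatCreateMPIAIJ(PETSC_COMM_WORLD,m,n,M,N,0,d_nnz,0,o_nnz,&SparseMat);CHKERRQ(ierr);
   ierr = PetscFree(d_nnz);CHKERRQ(ierr);
   ierr = PetscFree(o_nnz);CHKERRQ(ierr);

followed by the same insert/assemble loop as above.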


I haven't had a chance to look into it further, but I'm not sure whether the

       if (std::abs(vwork[Jj]) > zerotol) {
         ierr = MatSetValue(SparseMat,Ii,cwork[Jj],vwork[Jj],INSERT_VALUES);CHKERRQ(ierr);
       }

drop tolerance actually helps performance in later calls to MatMatMult_MPIAIJ_MPIAIJ() as much as
I would suspect...
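
One quick way to see whether the drop tolerance is actually thinning the matrix out (a small
sketch, just calling MatGetInfo() on the assembled SparseMat) would be:

   MatInfo info;
   ierr = MatGetInfo(SparseMat,MAT_GLOBAL_SUM,&info);CHKERRQ(ierr);
   ierr = PetscPrintf(PETSC_COMM_WORLD,"  kept %g of %g preallocated nonzeros\n",
                      info.nz_used,info.nz_allocated);CHKERRQ(ierr);

Comparing nz_used against M*N, together with -log_summary timings of the MatMatMult, should show
whether the threshold is paying off.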

df

On Sun, 5 Apr 2009, Yujie wrote:

> 
> Dear David:
> 
> Thank you very much for your help. I tried to do this. However, there is no MatConvert_MPIDense() to convert a dense
> matrix to AIJ format. Maybe a possible approach is to create a new AIJ matrix and copy the data into it.
> Thanks.
> 
> Regards,
> 
> Yujie
> 
> 
> On Sun, Apr 5, 2009 at 1:48 PM, David Fuentes <fuentesdt at gmail.com> wrote:
>       Hi Yujie,
> 
>
>       as a workaround, have you tried converting your dense
>       matrices to AIJ format and using MatMatMult_MPIAIJ_MPIAIJ()?
> 
> 
> 
>
>       df
> 
> 
> 
> 
> 
> 
>
>       On Fri, 3 Apr 2009, Yujie wrote:
> 
>
>             Dear Barry:
>
>             I am trying to debug the code you have written with ex123.c, after commenting out the error
>             output
>             (SETERRQ(PETSC_ERR_LIB,"Due to aparent bugs in PLAPACK,this is not currently supported");)
>             in
>             MatMatMultSymbolic_MPIDense_MPIDense().
>
>             The error I got is "Caught signal number 11 SEGV: Segmentation Violation". It takes place
>             in
>              "PLA_Obj_set_to_zero(lu->A);" in 
>
>             MatMPIDenseCopyToPlapack(). 
>
>             To my understanding, if you want to set "lu->A" to zero, you first have to allocate memory for "lu->A".
>             However, I can't find
>             the function in which you do this. Could you give me some advice? Thanks a lot.
>
>             Regards,
>
>             Yujie
> 
>
>             On Thu, Apr 2, 2009 at 8:02 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>                  On Apr 2, 2009, at 9:19 PM, Yujie wrote:
>
>                        Hi, PETSc Developers
>
>                        I am wondering whether MatMatMult_MPIDense_MPIDense() works currently based on
>             PLAPACK?
>                        Thanks a lot.
> 
>
>              No, if you run it you will see it print an error message.
>
>              I tried to debug PLAPACK to determine the problem, but it was awfully complicated and I had to
>             give up.
>             Certainly someone else
>             could try to debug PLAPACK to determine the problem. PLAPACK is not supported, so
>             unfortunately there is no one
>             to complain to about it and you'd have to fix it yourself.
>
>              Barry
> 
> 
>
>                  Regards,
>
>                  Yujie
> 
> 
> 
> 
> 
> 
>

