[petsc-users] question about MatCreateRedundantMatrix

Povolotskyi, Mykhailo mpovolot at purdue.edu
Thu Sep 19 13:03:15 CDT 2019


Hello Jose,

I have put together a test case that reproduces my error.

1. You will need to download the file "matrix.bin" from the following link:

https://www.dropbox.com/s/6y7ro99ou4qr8uy/matrix.bin?dl=0
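
(In case the link is unavailable: the file is nothing special, just the raw
bytes of a square matrix of std::complex<double> values stored contiguously.
Below is a minimal generator sketch that writes a compatible "matrix.bin";
the size n = 200 and the random entries are my own choice, any square
complex matrix in this layout will do.)

// generate_matrix.cpp -- writes n*n std::complex<double> values to
// "matrix.bin" as one contiguous block, the layout read_matrix() expects.
#include <complex>
#include <cstdlib>
#include <fstream>
#include <vector>

int main()
{
   const int n = 200;  // arbitrary; read_matrix() infers the size from the file
   std::vector<std::complex<double> > m(n * n);
   for (int k = 0; k < n * n; k++)
     m[k] = std::complex<double>(std::rand() / (double) RAND_MAX,
                                 std::rand() / (double) RAND_MAX);
   std::ofstream f("matrix.bin", std::ios::out | std::ios::binary);
   f.write((const char*) m.data(), n * n * sizeof(std::complex<double>));
   return 0;
}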

2. Here is the C++ code I use:

#include <iostream>
#include <cstring>
#include <cmath>      // sqrt
#include <stdexcept>  // runtime_error
#include <vector>
#include <complex>
#include <fstream>
#include <sys/stat.h>

#include "mpi.h"
#include "petscmat.h"
#include "slepcsys.h"
#include "slepceps.h"
#include "slepcrg.h"
using namespace std;

// Reads a square complex matrix stored as raw std::complex<double> values;
// the dimension is inferred from the file size.
void read_matrix(const std::string& filename,
                 int& matrix_size,
                 std::vector<std::complex<double> >& data)
{
   struct stat results;
   if (stat(filename.c_str(), &results) != 0)
   {
     throw runtime_error("Wrong file\n");
   }
   int file_size = results.st_size;

   int data_size = file_size / sizeof(std::complex<double>);

   // Round to the nearest integer to guard against floating-point error
   // when the file holds a perfect square number of entries.
   int n1 = (int) (sqrt((double) data_size) + 0.5);

   if (n1 * n1 == data_size)
   {
     matrix_size = n1;
   }
   else
   {
     throw runtime_error("Wrong file size\n");
   }

   data.resize(matrix_size * matrix_size);

   ifstream myFile(filename.c_str(), ios::in | ios::binary);
   myFile.read((char*) data.data(), file_size);
}


int main(int argc, char* argv[])
{
   MPI_Init(NULL, NULL);
   PetscInitialize(&argc, &argv,(char*)0, (char* )0);
   SlepcInitialize(&argc, &argv,(char*)0, (char* )0);

   int rank;

   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
   string filename("matrix.bin");


   int matrix_size;

   std::vector<std::complex<double> > data;

   read_matrix(filename, matrix_size,  data);
   if (rank == 0)
   {
     cout << "matrix size " << matrix_size << "\n";
   }
   Mat mat;
   MatCreateDense(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                  matrix_size, matrix_size, NULL, &mat);

   int local_row_begin;
   int local_row_end;
   MatGetOwnershipRange(mat, &local_row_begin, &local_row_end);

   // Fill the locally owned rows; the file stores the matrix column-major,
   // so entry (i,j) sits at data[j*matrix_size + i].
   for (int i = local_row_begin; i < local_row_end; i++)
   {
     vector<complex<double> > v(matrix_size);
     vector<int> index(matrix_size);
     for (int j = 0; j < matrix_size; j++)
     {
       v[j] = data[j*matrix_size + i];
       index[j] = j;
     }
     MatSetValues(mat, 1, &i, matrix_size, index.data(), v.data(),
                  INSERT_VALUES);
   }

   MatAssemblyBegin(mat, MAT_FINAL_ASSEMBLY);
   MatAssemblyEnd(mat, MAT_FINAL_ASSEMBLY);

   complex<double> center(0, 0);
   double radius(100);
   double vscale(1.0);

   EPS eps;
   RG  rg;

   EPSCreate(MPI_COMM_WORLD, &eps);
   EPSSetOperators(eps, mat, NULL);
   EPSSetType(eps, EPSCISS);
   EPSSetProblemType(eps, EPS_NHEP);
   EPSSetFromOptions(eps);

   // Contour for CISS: a circle of radius 100 centered at the origin.
   EPSGetRG(eps, &rg);
   RGSetType(rg, RGELLIPSE);
   RGEllipseSetParameters(rg, center, radius, vscale);
   EPSSolve(eps);

   if (rank == 0)
   {
     int nconv;
     EPSGetConverged(eps, &nconv);
     for (int i = 0; i < nconv; i++)
     {
       complex<double> a, b;
       EPSGetEigenvalue(eps, i, &a, &b);
       cout << a << "\n";
     }
   }

   EPSDestroy(&eps);
   MatDestroy(&mat);

   SlepcFinalize();   // finalize in reverse order of initialization
   PetscFinalize();
   MPI_Finalize();
}
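
For completeness, I compile it with something along these lines (the exact
flags depend on the local PETSc/SLEPc installation, so treat this as a
sketch rather than the actual build line):

mpicxx test.cpp -o a.out -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include -I$SLEPC_DIR/include -I$SLEPC_DIR/$PETSC_ARCH/include -L$SLEPC_DIR/$PETSC_ARCH/lib -lslepc -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc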

3. If I run it as "mpiexec -n 1 a.out -eps_ciss_partitions 1" it works well.
If I run it as "mpiexec -n 2 a.out -eps_ciss_partitions 2"
I get the following error message:
[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Mat type seqdense
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018
[0]PETSC ERROR: a.out on a linux-complex named 
brown-fe03.rcac.purdue.edu by mpovolot Thu Sep 19 14:02:06 2019
[0]PETSC ERROR: Configure options --with-scalar-type=complex --with-x=0 
--with-hdf5 --download-hdf5=1 --with-single-library=1 --with-pic=1 
--with-shared-libraries=0 --with-log=0 --with-clanguage=C++ 
--CXXFLAGS="-fopenmp -fPIC" --CFLAGS="-fopenmp -fPIC" --with-fortran=0 
--FFLAGS="-fopenmp -fPIC" --with-debugging=0 --with-cc=mpicc 
--with-fc=mpif90 --with-cxx=mpicxx COPTFLAGS= CXXOPTFLAGS= FOPTFLAGS= 
--download-metis=1 --download-parmetis=1 
--with-valgrind-dir=/apps/brown/valgrind/3.13.0_gcc-4.8.5 
--download-mumps=1 --with-fortran-kernels=0 --download-superlu_dist=1 
--with-blaslapack-lib="-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
-lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core " 
--with-blacs-lib=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so 
--with-blacs-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include 
--with-scalapack-lib="-Wl,-rpath,/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
-lmkl_gf_lp64 -lmkl_gnu_thread -lmkl_core  -lpthread 
-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" 
--with-scalapack-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include
[0]PETSC ERROR: #1 MatCreateMPIMatConcatenateSeqMat() line 10547 in 
/depot/kildisha/apps/brown/nemo5/libs/petsc/build-cplx/src/mat/interface/matrix.c
[0]PETSC ERROR: #2 MatCreateRedundantMatrix() line 10080 in 
/depot/kildisha/apps/brown/nemo5/libs/petsc/build-cplx/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 CISSRedundantMat() line 105 in 
/depot/kildisha/apps/brown/nemo5/libs/slepc/build-cplx/src/eps/impls/ciss/ciss.c
[0]PETSC ERROR: #4 EPSSetUp_CISS() line 862 in 
/depot/kildisha/apps/brown/nemo5/libs/slepc/build-cplx/src/eps/impls/ciss/ciss.c
[0]PETSC ERROR: #5 EPSSetUp() line 165 in 
/depot/kildisha/apps/brown/nemo5/libs/slepc/build-cplx/src/eps/interface/epssetup.c
[0]PETSC ERROR: #6 EPSSolve() line 135 in 
/depot/kildisha/apps/brown/nemo5/libs/slepc/build-cplx/src/eps/interface/epssolve.c
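
(Incidentally, I believe SLEPc is not strictly needed to trigger this: a
direct call sketch like the one below on the same MPIDENSE matrix, with
nsubcomm = 2 on two ranks, should go down the same
MatCreateMPIMatConcatenateSeqMat path. The variable names are mine.)

   Mat redundant;
   // Duplicate "mat" onto 2 subcommunicators, as CISSRedundantMat() does.
   MatCreateRedundantMatrix(mat, 2, MPI_COMM_NULL, MAT_INITIAL_MATRIX,
                            &redundant);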

Thank you,
Michael.

On 09/19/2019 04:55 AM, Jose E. Roman wrote:
> Michael,
>
> In my previous email I should have checked it better. The CISS solver does indeed work with dense matrices:
>
> $ mpiexec -n 2 ./ex2 -n 30 -eps_type ciss -terse -rg_type ellipse -rg_ellipse_center 1.175 -rg_ellipse_radius 0.075 -eps_ciss_partitions 2 -mat_type dense
>
> 2-D Laplacian Eigenproblem, N=900 (30x30 grid)
>
>   Solution method: ciss
>
>   Number of requested eigenvalues: 1
>   Found 15 eigenvalues, all of them computed up to the required tolerance:
>       1.10416, 1.10416, 1.10455, 1.10455, 1.12947, 1.12947, 1.13426, 1.13426,
>       1.16015, 1.16015, 1.19338, 1.19338, 1.21093, 1.21093, 1.24413
>
>
> There might be something different in the way matrices are initialized in your code. Send me a simple example that reproduces the problem and I will track it down.
>
> Sorry for the confusion.
> Jose
>
>
>
>> On 19 Sep 2019, at 06:20, hong--- via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>
>> Michael,
>> We have support of MatCreateRedundantMatrix for dense matrices. For example, petsc/src/mat/examples/tests/ex9.c:
>> mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
>>
>> Hong
>>
>> On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users <petsc-users at mcs.anl.gov> wrote:
>> Dear Petsc developers,
>>
>> I found that MatCreateRedundantMatrix does not support dense matrices.
>>
>> This causes the following problem: I cannot use CISS eigensolver from
>> SLEPC with dense matrices with parallelization over quadrature points.
>>
>> Is it possible for you to add this support?
>>
>> Thank you,
>>
>> Michael.
>>
>>
>> p.s. I apologize if you received this e-mail twice; I sent it first from
>> a different address.
>>


