[petsc-users] Error when Using KSP with Matrix-free

Barry Smith bsmith at mcs.anl.gov
Tue Mar 22 20:49:32 CDT 2016


Run on one process with the option -start_in_debugger noxterm, put a breakpoint in my_mult, and then step through to see what it is doing.

  Barry

> On Mar 22, 2016, at 9:52 AM, s1humahd <s1humahd at stmail.uni-bayreuth.de> wrote:
> 
> Thank you very much for your reply,
> now it is working, but the solution is completely wrong. I noticed that the vector "x" still keeps its initial values, whereas it should be changed (inside my_mult()) to x = Ab. It seems to me that my_mult is still not working. Do you have any
> idea about that, or did I misunderstand something?
> 
> 
> Best Regards,
> Humam
> 
> On 2016-03-21 18:58, Barry Smith wrote:
>>> On Mar 21, 2016, at 9:51 AM, s1humahd <s1humahd at stmail.uni-bayreuth.de> wrote:
>>> Hello All,
>>> I'm trying to run a very simple program to get familiar with solving a linear system using a matrix-free structure before I use it in my implementation.
>>> I already created a sparse matrix A and a vector b and set their values. Then I created a matrix-free matrix using MatCreateShell() and set the function that provides the operation, which currently merely multiplies matrix A by the vector b (it is just an attempt to get familiar, similar to the example ex14f.F in KSP).
>>> However, I'm getting the following error when I call the KSPSolve() routine. I would be grateful if you could help me recognize the reason for this error. The relevant parts of my code are here.
>>> MatCreate(PETSC_COMM_WORLD,&A);
>>> MatSetSizes(A,nlocal,nlocal,8,8);
>>>            .
>>>            .
>>>            .
>>> VecCreate(PETSC_COMM_WORLD,&b);
>>> VecSetSizes(x,PETSC_DECIDE,8);
>>>            .
>>>            .
>>> MatCreateShell(PETSC_COMM_WORLD,nlocal,nlocal,8,8, (void *)&A,&j_free);
>>> MatShellSetOperation(j_free,MATOP_MULT, (void(*) (void)) (my_mult)(j_free,b,x) );
>>    (void(*)(void)) (my_mult)(j_free,b,x) should be just
>> (void(*)(void)) my_mult. As written, you are actually calling the
>> function my_mult here, which returns 0 (since it ran correctly), and
>> so you are setting 0 as the operation for MATOP_MULT.
>>>            .
>>>            .
>>>            .
>>> KSPCreate(PETSC_COMM_WORLD,&ksp);
>>> KSPSetOperators(ksp,j_free,A);
>>> KSPSetFromOptions(ksp);
>>> KSPSolve(ksp,x,sol);
>>>            .
>>>            .
>>>            .
>>> PetscErrorCode my_mult(Mat j_free, Vec b, Vec x)
>>> {
>>> void *ptr;
>>> Mat *ptr2;
>>> MatShellGetContext(j_free, &ptr);  // the context is matrix A
>>> ptr2 = (Mat*) ptr;
>>> MatMult(*ptr2,b,x);
>>> return 0;
>>> }
>>> The error is as follows:
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: This matrix type does not have a multiply defined
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, unknown
>>> [0]PETSC ERROR: ./jac_free on a arch-linux2-c-debug named humam-VirtualBox by humam Mon Mar 21 23:37:14 2016
>>> [0]PETSC ERROR: Configure options --download-hdf5=1 --with-blas-lapack-dir=/usr/lib --with-mpi-dir=/usr
>>> [0]PETSC ERROR: #1 MatMult() line 2223 in /home/humam/petsc/src/mat/interface/matrix.c
>>> [0]PETSC ERROR: #2 PCApplyBAorAB() line 727 in /home/humam/petsc/src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: #3 KSP_PCApplyBAorAB() line 272 in /home/humam/petsc/include/petsc/private/kspimpl.h
>>> [0]PETSC ERROR: #4 KSPGMRESCycle() line 155 in /home/humam/petsc/src/ksp/ksp/impls/gmres/gmres.c
>>> [0]PETSC ERROR: #5 KSPSolve_GMRES() line 236 in /home/humam/petsc/src/ksp/ksp/impls/gmres/gmres.c
>>> [0]PETSC ERROR: #6 KSPSolve() line 604 in /home/humam/petsc/src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: #7 main() line 293 in /home/humam/jac_free.c
>>> [0]PETSC ERROR: No PETSc Option Table entries
>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
>>> Thanks,
>>> Humam
