[petsc-users] PETSc-MUMPS interface, numeric and symbolic factorisation

Y. Shidi ys453 at cam.ac.uk
Fri May 11 17:58:45 CDT 2018


Thank you for your reply.
> For now, if you give the same Mat object back, it will do what you
> expect.
Sorry, I am confused here.
By "the same Mat object", do you mean that I should not destroy the
Mat at the end of each iteration?
Also, which function should I actually call to give the same Mat
object back?
Sorry for the basic question.

Kind Regards,
Shidi



On 2018-05-11 18:10, Matthew Knepley wrote:
> On Fri, May 11, 2018 at 1:02 PM, Y. Shidi <ys453 at cam.ac.uk> wrote:
> 
>> Thank you very much for your reply, Barry.
>> 
>>> This is a bug in PETSc. Since you are providing a new matrix
>>> with the same "state" value as the previous matrix, the
>>> following code in the PC kicks in.
>> So do you mean that every time I change the values in the matrix,
>> PETSc only checks whether the nonzero pattern has changed, not the
>> values, and if the pattern is unchanged neither the symbolic nor
>> the numeric factorisation happens?
> 
> No, that is not what Barry is saying.
> 
> PETSc looks at the matrix.
> If the structure has changed, it does symbolic and numeric
> factorization.
> If only the values have changed, it does only the numeric factorization.
> 
> HOWEVER, you gave it a new matrix that accidentally has the same state
> marker, so it thought nothing had changed. We will fix this by also
> checking the pointer.
> For now, if you give the same Mat object back, it will do what you
> expect.
> 
>   Matt
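For concreteness, one way of "giving the same Mat object back" might look
like the sketch below, assuming the matrix is created once and then filled
with MatSetValues() each iteration instead of being recreated (n_local,
d_nnz, o_nnz, ksp, b and x are placeholders):

   /* create the Mat once, before the time loop */
   MatCreateAIJ(PETSC_COMM_WORLD, n_local, n_local,
                PETSC_DETERMINE, PETSC_DETERMINE,
                0, d_nnz, 0, o_nnz, &A);
   KSPSetOperators(ksp, A, A);

   for (step = 0; step < n_steps; ++step) {
     MatZeroEntries(A);    /* keeps the nonzero structure, resets the values */
     /* ... MatSetValues(A, ...) with the new coefficients ... */
     MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
     MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
     KSPSolve(ksp, b, x);  /* first call: symbolic + numeric factorisation;
                              later calls: numeric factorisation only */
   }

Because the Mat object is never destroyed, its nonzero pattern is unchanged
and only its state changes when the values change, so only the numeric
factorisation should be redone on the later solves.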
> 
>> I found the following code:
>> 
>> if (!pc->setupcalled) {
>>   ierr     = PetscInfo(pc,"Setting up PC for first time\n");CHKERRQ(ierr);
>>   pc->flag = DIFFERENT_NONZERO_PATTERN;
>> } else if (matstate == pc->matstate) {
>>   ierr = PetscInfo(pc,"Leaving PC with identical preconditioner since operator is unchanged\n");CHKERRQ(ierr);
>>   PetscFunctionReturn(0);
>> } else {
>>   if (matnonzerostate > pc->matnonzerostate) {
>>     ierr = PetscInfo(pc,"Setting up PC with different nonzero pattern\n");CHKERRQ(ierr);
>>     pc->flag = DIFFERENT_NONZERO_PATTERN;
>>   } else {
>>     ierr = PetscInfo(pc,"Setting up PC with same nonzero pattern\n");CHKERRQ(ierr);
>>     pc->flag = SAME_NONZERO_PATTERN;
>>   }
>> }
>> 
>> and I commented out the "else if (matstate == pc->matstate) {...}"
>> branch, so it falls through to "Setting up PC with same nonzero
>> pattern". It seems to work in my case: only MatFactorNumeric_MUMPS()
>> is called in the subsequent iterations. But I am not quite sure yet
>> and need some more tests.
>> 
>> Thank you very much for your help indeed.
>> 
>> Kind Regards,
>> Shidi
>> 
>> On 2018-05-11 16:13, Smith, Barry F. wrote:
>> On May 11, 2018, at 8:14 AM, Y. Shidi <ys453 at cam.ac.uk> wrote:
>> 
>> Thank you for your reply.
>> 
>> How are you changing the matrix? Do you remember to assemble?
>> I use MatCreateMPIAIJWithArrays() to create the matrix,
>> and after that I call MatAssemblyBegin() and MatAssemblyEnd().
> 
>    If you use MatCreateMPIAIJWithArrays() you don't need to call
>  MatAssemblyBegin() and MatAssemblyEnd().
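A minimal sketch of that call, assuming ia/ja/val are the local CSR row
pointers, column indices and values (all the variable names here are
placeholders):

   Mat A;
   MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, n_local, n_local,
                             PETSC_DETERMINE, PETSC_DETERMINE,
                             ia, ja, val, &A);
   /* the returned Mat is already assembled, so no MatAssemblyBegin()
      or MatAssemblyEnd() is needed before using it */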
> 
>> But I actually destroy the matrix at the end of each iteration
>> and create the matrix at the beginning of each iteration.
> 
>     This is a bug in PETSc. Since you are providing a new matrix with
>  the same "state" value as the previous matrix, the following code in
>  the PC kicks in:
> 
>    ierr = PetscObjectStateGet((PetscObject)pc->pmat,&matstate);CHKERRQ(ierr);
>    ierr = MatGetNonzeroState(pc->pmat,&matnonzerostate);CHKERRQ(ierr);
>    if (!pc->setupcalled) {
>      ierr     = PetscInfo(pc,"Setting up PC for first time\n");CHKERRQ(ierr);
>      pc->flag = DIFFERENT_NONZERO_PATTERN;
>    } else if (matstate == pc->matstate) {
>      ierr = PetscInfo(pc,"Leaving PC with identical preconditioner since operator is unchanged\n");CHKERRQ(ierr);
>      PetscFunctionReturn(0);
> 
>  and it returns without refactoring.
> 
>     We need an additional check that the matrix also remains the same.
> 
>      We will also need a test example that reproduces the problem to
>  confirm that we have fixed it.
> 
>    Barry
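Purely for illustration, the kind of additional check Barry describes
might look something like this (a hypothetical sketch only, not the
actual PETSc fix; matid and pc->matobjectid are invented names):

   /* hypothetical: remember which Mat object the PC was last set up
      with, e.g. via PetscObjectGetId(), and treat a different object
      as a changed operator even if the state values happen to match */
   } else if (matstate == pc->matstate && matid == pc->matobjectid) {
     ierr = PetscInfo(pc,"Leaving PC with identical preconditioner since operator is unchanged\n");CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }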
> 
>> Cheers,
>> Shidi
>> 
>> On 2018-05-11 12:59, Matthew Knepley wrote:
>> On Fri, May 11, 2018 at 7:14 AM, Y. Shidi <ys453 at cam.ac.uk> wrote:
>> Dear Matt,
>> Thank you for your help last time.
>> I want to get more detail about the PETSc-MUMPS factorisation,
>> so I went to look at the code "/src/mat/impls/aij/mpi/mumps/mumps.c"
>> and found that the following functions are quite important for
>> this question:
>> PetscErrorCode MatCholeskyFactorSymbolic_MUMPS(Mat F,Mat A,IS
>> r,const MatFactorInfo *info);
>> PetscErrorCode MatFactorNumeric_MUMPS(Mat F,Mat A,const
>> MatFactorInfo *info);
>> PetscErrorCode MatSolve_MUMPS(Mat A,Vec b,Vec x);
>> I added some print statements to trace when these functions are called.
>> Then I tested my code; the values in the matrix are changing but the
>> structure stays the same. Below is the output.
>> We can see that at the 0th step the symbolic, numeric and solve stages
>> are all called; in the subsequent steps only the solve stage is called
>> and the numeric step is not.
>> How are you changing the matrix? Do you remember to assemble?
>> Matt
>> Iteration 0 Step 0.0005 Time 0.0005
>> [INFO]: Direct Solver setup
>> MatCholeskyFactorSymbolic_MUMPS
>> finish MatCholeskyFactorSymbolic_MUMPS
>> MatFactorNumeric_MUMPS
>> finish MatFactorNumeric_MUMPS
>> MatSolve_MUMPS
>> Iteration 1 Step 0.0005 Time 0.0005
>> MatSolve_MUMPS
>> Iteration 2 Step 0.0005 Time 0.001
>> MatSolve_MUMPS
>> [INFO]: End of program!!!
>> I am wondering whether it is possible to split the numeric and
>> solve stages (using KSPSolve, as you mentioned).
>> Thank you very much indeed.
>> Kind Regards,
>> Shidi
>> On 2018-05-04 21:10, Y. Shidi wrote:
>> Thank you very much for your reply.
>> That is really clear.
>> Kind Regards,
>> Shidi
>> On 2018-05-04 21:05, Matthew Knepley wrote:
>> On Fri, May 4, 2018 at 3:54 PM, Y. Shidi <ys453 at cam.ac.uk> wrote:
>> Dear Matt,
>> Thank you very much for your reply!
>> So do you mean that I can just call KSPSolve() every iteration
>> once MUMPS is set up?
>> Yes.
>> That means the numerical factorization is performed inside KSPSolve().
>> If that is the case, it seems that the KSP object does not change when
>> the values in the matrix are changed.
>> Yes.
>> Or do I need to call both KSPSetOperators() and KSPSolve()?
>> If you do SetOperators, it will redo the factorization. If you do not,
>> it will look at the Mat object, determine that the structure has not
>> changed, and just redo the numerical factorization.
>> Thanks,
>> Matt
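In other words, a sketch of the intended usage, assuming the same Mat A
is kept and only its values change between iterations:

   KSPSetOperators(ksp, A, A);   /* once, at setup */
   for (it = 0; it < n_it; ++it) {
     /* update the values of A here (same nonzero structure), then: */
     KSPSolve(ksp, b, x);        /* reuses the symbolic factorization and
                                    redoes only the numeric factorization */
   }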
>> On 2018-05-04 14:44, Matthew Knepley wrote:
>> On Fri, May 4, 2018 at 9:40 AM, Y. Shidi <ys453 at cam.ac.uk> wrote:
>> Dear PETSc users,
>> I am currently using MUMPS to solve linear systems directly.
>> Generally, we use ICNTL(7) or ICNTL(29) to do the preprocessing
>> step and then solve the system.
>> In my code, the values in the matrix are changed in each iteration,
>> but the structure of the matrix stays the same, which means the
>> performance can be improved if symbolic factorisation is only
>> performed once. Hence, it is necessary to split the symbolic
>> and numeric factorisations. However, I cannot find a specific step
>> (control parameter) that performs only the numeric factorisation.
>> I have used ICNTL(3) and ICNTL(4) to print the MUMPS information,
>> and it seems that the symbolic and numeric factorisations are always
>> performed together.
>> If you use KSPSolve instead, it will automatically preserve the
>> symbolic factorization.
>> Thanks,
>> Matt
>> So I am wondering if anyone has an idea about it.
>> Below is how I set up MUMPS solver:
>> PC pc;
>> PetscBool flg_mumps, flg_mumps_ch;
>> flg_mumps    = PETSC_FALSE;
>> flg_mumps_ch = PETSC_FALSE;
>> PetscOptionsGetBool(NULL, NULL, "-use_mumps_lu", &flg_mumps, NULL);
>> PetscOptionsGetBool(NULL, NULL, "-use_mumps_ch", &flg_mumps_ch, NULL);
>> if (flg_mumps || flg_mumps_ch)
>> {
>>   KSPSetType(_ksp, KSPPREONLY);
>>   PetscInt ival, icntl;
>>   PetscReal val;
>>   KSPGetPC(_ksp, &pc);
>>   /// Set preconditioner type
>>   if (flg_mumps)
>>   {
>>     PCSetType(pc, PCLU);
>>   }
>>   else if (flg_mumps_ch)
>>   {
>>     MatSetOption(A, MAT_SPD, PETSC_TRUE);
>>     PCSetType(pc, PCCHOLESKY);
>>   }
>>   PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);
>>   PCFactorSetUpMatSolverPackage(pc);
>>   PCFactorGetMatrix(pc, &_F);
>>   icntl = 7; ival = 0;
>>   MatMumpsSetIcntl(_F, icntl, ival);
>>   MatMumpsSetIcntl(_F, 3, 6);
>>   MatMumpsSetIcntl(_F, 4, 2);
>> }
>> KSPSetUp(_ksp);
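With this setup, the MUMPS path is selected at run time through the
options defined above, for example (./app is just a placeholder for the
executable name):

   ./app -use_mumps_lu    (LU factorisation via MUMPS)
   ./app -use_mumps_ch    (Cholesky factorisation via MUMPS, matrix marked SPD)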
>> Kind Regards,
>> Shidi
> 
> --
> 
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/ [2]
> 
> 
> Links:
> ------
> [1] https://www.cse.buffalo.edu/~knepley/
> [2] http://www.caam.rice.edu/~mk51/

