[petsc-users] Questions on matrix-free GMRES implementation
feng wang
snailsoar at hotmail.com
Fri Mar 12 08:55:00 CST 2021
Hi Matt,
Thanks for your reply. I will try the parallel implementation.
I've got a serial matrix-free GMRES working, but I would like to know why my initial matrix-free implementation did not work, as there is still something I don't understand. I did some debugging and found that the callback function that computes the RHS for the matrix-free matrix is called twice by PETSc when it computes the finite difference Jacobian, but I expected it to be called only once. I don't know why; could you please give some advice?
Thanks,
Feng
________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: 12 March 2021 12:05
To: feng wang <snailsoar at hotmail.com>
Cc: Barry Smith <bsmith at petsc.dev>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Questions on matrix-free GMRES implementation
On Fri, Mar 12, 2021 at 6:02 AM feng wang <snailsoar at hotmail.com> wrote:
Hi Barry,
Thanks for your advice.
You are right on this. Somehow there was some inconsistency in how I computed the right hand side (the true RHS plus the time-stepping contribution to the diagonal matrix) when computing the finite difference Jacobian. If I simply use the callback function to recompute my RHS before I call MatMFFDSetBase, then it works like a charm. But now I end up computing my RHS three times: the first is to compute the true RHS, and the other two are for computing the finite difference Jacobian.
In my previous buggy version, I only computed the RHS twice. If possible, could you elaborate on your comment "Also be careful about petsc_baserhs", so I can understand what was going on in my buggy version.
Our FD implementation is simple. It approximates the action of the Jacobian as
J(b) v = (F(b + h v) - F(b)) / h ||v||
where h is some small parameter and b is the base vector, namely the one that you are linearizing around. In a Newton step, b is the previous solution
and v is the proposed solution update.
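Spelled out with generic PETSc Vec operations, that action looks roughly like the sketch below (b, v, Fb, Jv, ctx and the user function F are placeholder names; h is a fixed parameter here, whereas MatMFFD picks the differencing parameter adaptively, which is where the ||v|| scaling comes in):

/* Sketch only: approximate Jv = J(b) v by a forward difference.
   Assumes Fb already holds F(b), e.g. the vector given to MatMFFDSetBase(). */
Vec       work, Fplus;
PetscReal h = 1.0e-7;                                   /* fixed here; MatMFFD chooses h adaptively */
ierr = VecDuplicate(b, &work); CHKERRQ(ierr);
ierr = VecDuplicate(b, &Fplus); CHKERRQ(ierr);
ierr = VecWAXPY(work, h, v, b); CHKERRQ(ierr);          /* work  = b + h v         */
ierr = F(ctx, work, Fplus); CHKERRQ(ierr);              /* Fplus = F(b + h v)      */
ierr = VecWAXPY(Jv, -1.0, Fb, Fplus); CHKERRQ(ierr);    /* Jv = F(b + h v) - F(b)  */
ierr = VecScale(Jv, 1.0/h); CHKERRQ(ierr);              /* divide by h             */
ierr = VecDestroy(&work); CHKERRQ(ierr);
ierr = VecDestroy(&Fplus); CHKERRQ(ierr);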
Besides, for a parallel implementation, my code already has its own partitioning method. Is it possible to have PETSc read in a user-defined partition? If not, what is a better way to do this?
Sure
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecSetSizes.html
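A minimal sketch, assuming nlocal is the number of unknowns your partitioner assigns to this rank (iqe*blocksize in your notation). Setting only the local size and leaving the global size as PETSC_DETERMINE makes PETSc follow your partition:

Vec x;
Mat A;
ierr = VecCreate(PETSC_COMM_WORLD, &x); CHKERRQ(ierr);
ierr = VecSetSizes(x, nlocal, PETSC_DETERMINE); CHKERRQ(ierr);   /* local size comes from your partitioner */
ierr = VecSetFromOptions(x); CHKERRQ(ierr);
/* the same pattern works for matrices */
ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
ierr = MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE); CHKERRQ(ierr);
ierr = MatSetFromOptions(A); CHKERRQ(ierr);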
Thanks,
Matt
Many thanks,
Feng
________________________________
From: Barry Smith <bsmith at petsc.dev>
Sent: 11 March 2021 22:15
To: feng wang <snailsoar at hotmail.com>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Questions on matrix-free GMRES implementation
Feng,
The first thing to check is that, for each linear solve that involves a new operator (new values in the base vector), the MFFD matrix knows it is using a new operator.
The easiest way is to call MatMFFDSetBase() before each solve that involves a new operator (new values in the base vector). Also be careful about petsc_baserhs: when you change the base vector's values, you also need to change the petsc_baserhs values to the function evaluation at that point.
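A minimal sketch of that sequence, using the names from your code (petsc_A_mf, petsc_csv, petsc_baserhs, petsc_ksp) plus placeholder petsc_rhs and petsc_dx vectors for the linear solve:

/* Recompute the base residual with the same callback MatMFFD uses, so that
   petsc_baserhs is exactly the function evaluation at petsc_csv, then
   re-register the base point before the solve. */
ierr = FormFunction_mf(this, petsc_csv, petsc_baserhs); CHKERRQ(ierr);
ierr = MatMFFDSetBase(petsc_A_mf, petsc_csv, petsc_baserhs); CHKERRQ(ierr);
ierr = KSPSolve(petsc_ksp, petsc_rhs, petsc_dx); CHKERRQ(ierr);  /* petsc_rhs, petsc_dx are placeholders */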
If that is correct, I would check with a trivial function evaluator to make sure the infrastructure is all set up correctly. For example, use a 1-4-1 operator applied matrix-free.
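For instance, a trivial serial evaluator could look like the sketch below (FormFunction_141 is just a placeholder name). Since F_i = u_{i-1} + 4 u_i + u_{i+1} is linear, the matrix-free finite difference action should match the assembled 1-4-1 tridiagonal matrix up to differencing error.

static PetscErrorCode FormFunction_141(void *ctx, Vec x, Vec f)
{
  PetscErrorCode     ierr;
  PetscInt           i, n;
  const PetscScalar *xa;
  PetscScalar       *fa;

  ierr = VecGetLocalSize(x, &n); CHKERRQ(ierr);
  ierr = VecGetArrayRead(x, &xa); CHKERRQ(ierr);
  ierr = VecGetArray(f, &fa); CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    fa[i] = 4.0*xa[i];                  /* diagonal entry  */
    if (i > 0)   fa[i] += xa[i-1];      /* left neighbour  */
    if (i < n-1) fa[i] += xa[i+1];      /* right neighbour */
  }
  ierr = VecRestoreArrayRead(x, &xa); CHKERRQ(ierr);
  ierr = VecRestoreArray(f, &fa); CHKERRQ(ierr);
  return 0;
}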
Barry
On Mar 11, 2021, at 7:35 AM, feng wang <snailsoar at hotmail.com> wrote:
Dear All,
I am new to petsc and trying to implement a matrix-free GMRES. I have assembled an approximate Jacobian matrix just for preconditioning. After reading some previous questions on this topic, my approach is:
The matrix-free matrix is created as:
ierr = MatCreateMFFD(*A_COMM_WORLD, iqe*blocksize, iqe*blocksize, PETSC_DETERMINE, PETSC_DETERMINE, &petsc_A_mf); CHKERRQ(ierr);
ierr = MatMFFDSetFunction(petsc_A_mf, FormFunction_mf, this); CHKERRQ(ierr);
The KSP linear operator is set up as:
ierr = KSPSetOperators(petsc_ksp, petsc_A_mf, petsc_A_pre); CHKERRQ(ierr); //petsc_A_pre is my assembled pre-conditioning matrix
Before calling KSPSolve, I do:
ierr = MatMFFDSetBase(petsc_A_mf, petsc_csv, petsc_baserhs); CHKERRQ(ierr); //petsc_csv is the flow states, petsc_baserhs is the pre-computed right hand side
The callback function is defined as:
PetscErrorCode cFdDomain::FormFunction_mf(void *ctx, Vec in_vec, Vec out_vec)
{
  PetscErrorCode ierr;
  cFdDomain *user_ctx;

  cout << "FormFunction_mf called\n";

  //in_vec: flow states
  //out_vec: right hand side + diagonal contributions from CFL number
  user_ctx = (cFdDomain*)ctx;

  //get perturbed conservative variables from petsc
  user_ctx->petsc_getcsv(in_vec);

  //get new right hand side
  user_ctx->petsc_fd_rhs();

  //set new right hand side to the output vector
  user_ctx->petsc_setrhs(out_vec);

  ierr = 0;
  return ierr;
}
The linear system I am solving is (J+D)x = RHS. J is the Jacobian matrix. D is a diagonal matrix that is used to stabilise the solution at the start and is reduced gradually as the solution advances, so as to recover Newton's method. I add D*x to the true right hand side when the non-linear function is computed to work out the finite difference Jacobian, so when finite differencing is used it actually computes (J+D)*dx.
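For illustration, this is roughly how I add that diagonal contribution at the end of the residual evaluation (a sketch only; petsc_D, a Vec holding the diagonal of D, and the work vector are just placeholder names):

/* out_vec already holds the true residual F(x); add the diagonal
   time-stepping term so that the finite difference sees F(x) + D x
   and the Jacobian action becomes (J + D) dx. */
ierr = VecPointwiseMult(work, petsc_D, in_vec); CHKERRQ(ierr);  /* work = D x      */
ierr = VecAXPY(out_vec, 1.0, work); CHKERRQ(ierr);              /* out_vec += D x  */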
The code runs but diverges in the end. If I don't use matrix-free and use my approximate Jacobian matrix instead, GMRES works. So something is wrong with my matrix-free implementation. Have I missed something in my implementation? Also, is there a way to check whether the finite difference Jacobian matrix is computed correctly in a matrix-free implementation?
Thanks for your help in advance.
Feng
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/