[petsc-users] More stable solver

paul zhang paulhuaizhang at gmail.com
Thu Oct 2 12:52:29 CDT 2014


Hello,

I am modelling a hypersonic computational fluid dynamics problem, where the
matrix A is quite sparse and stiff. The solution is x=[rho_1, rho_2, ...,
rho_11, u, v, w, T], where rho_s is the density of each species; some of
these can reach 1.E-5, while others can be below 1.E-20.
As I ran my case, I found that the number of iterations oscillated after
being steady for a long time. For instance, the iteration count stayed
around 10 for 2000 steps, then jumped to 50 for no apparent reason, and
then the code blew up. My code has been debugged for months, so it should
be fine. I was wondering whether there is some method that is more stable
than FGMRES, or at least comparable to it in stability, that I could try.
Some code pieces are attached for checking.
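
For concreteness, the kinds of alternatives I have in mind look like this
(just a sketch against the same PETSc C API as the code below; I have not
verified that either one actually helps on this problem):

        // Alternative 1: BiCGStab instead of (F)GMRES
        KSPSetType(ksp,KSPBCGS);

        // Alternative 2: keep FGMRES but use the more robust (if slower)
        // modified Gram-Schmidt orthogonalization
        KSPSetType(ksp,KSPFGMRES);
        KSPGMRESSetOrthogonalization(ksp,KSPGMRESModifiedGramSchmidtOrthogonalization);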

Many thanks,
Paul

void petsc_init(void) {

        vector<Cell>::iterator cit;
        vector<int>::iterator it;

        // Create linear solver context
        KSPCreate(PETSC_COMM_WORLD,&ksp);


        // Global RHS vector: local size cellCount*nVars, global size globalCellCount*nVars
        VecCreateMPI(PETSC_COMM_WORLD,grid[gid].cellCount*nVars,grid[gid].globalCellCount*nVars,&rhs);
        VecSetFromOptions(rhs);
        VecDuplicate(rhs,&deltaU);
        if (ps_step_max>1) { // extra work vectors used when pseudo-time stepping is enabled
                VecDuplicate(rhs,&soln_n);
                VecDuplicate(rhs,&pseudo_delta);
                VecDuplicate(rhs,&pseudo_right);
        }
        VecSet(rhs,0.);
        VecSet(deltaU,0.);

        vector<int> diagonal_nonzeros, off_diagonal_nonzeros;
        int nextCellCount;

        // Calculate space necessary for matrix memory allocation:
        // each cell couples to itself plus its internal-face neighbors
        // (diagonal block) and to its ghost cells (off-diagonal block)
        for (cit=grid[gid].cell.begin();cit!=grid[gid].cell.end();cit++) {
                nextCellCount=0;
                for (it=(*cit).faces.begin();it!=(*cit).faces.end();it++) {
                        if (grid[gid].face[*it].bc==INTERNAL_FACE) {
                                nextCellCount++;
                        }
                }
                for (int i=0;i<nVars;++i) {
                        diagonal_nonzeros.push_back((nextCellCount+1)*nVars);
                        off_diagonal_nonzeros.push_back(((*cit).ghosts.size())*nVars);
                }
        }

        // Preallocate the parallel sparse (MPIAIJ) implicit operator
        // using the per-row nonzero counts computed above
        MatCreateMPIAIJ(
                        PETSC_COMM_WORLD,
                        grid[gid].cellCount*nVars,
                        grid[gid].cellCount*nVars,
                        grid[gid].globalCellCount*nVars,
                        grid[gid].globalCellCount*nVars,
                        0,&diagonal_nonzeros[0],
                        0,&off_diagonal_nonzeros[0],
                        &impOP);

        KSPSetOperators(ksp,impOP,impOP,SAME_NONZERO_PATTERN);
        KSPSetTolerances(ksp,rtol,abstol,1.e15,maxits); // dtol=1.e15 effectively disables the divergence test
        KSPSetInitialGuessKnoll(ksp,PETSC_TRUE);
        //KSPSetType(ksp,KSPGMRES);
        KSPSetType(ksp,KSPFGMRES);
        //KSPGMRESSetOrthogonalization(ksp,KSPGMRESModifiedGramSchmidtOrthogonalization);
        KSPGMRESSetRestart(ksp,30);
        // Allow everything above to be overridden from the command line
        KSPSetFromOptions(ksp);

        return;
}



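Since KSPSetFromOptions() is called last, the solver can also be switched at
run time without recompiling; for example (assuming the standard PETSc option
names):

        mpirun -np 4 ./solver -ksp_type bcgs -ksp_monitor_true_residual -ksp_converged_reason
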
<http://gsil.engineering.uky.edu>