<div dir="ltr"><div dir="ltr"><div>Hi,</div><div><br></div><div>I can comment on 2):</div><div>Petsc does not directly allow to set the minimum iterations, but you can define your own Convergence Routine (See the manual).</div><div>You can define the routine similar to this (Fortran) code:</div><div><br></div><div>SUBROUTINE MyConvergenceTest(ksp, it, rnorm, reason, ctx, ierr)                 <br>  use PetscInterfaces                                                           <br>  use Cool_petsc                                                                <br>  use Cool_data                                                                 <br>  implicit none                                                                 <br>  type(tKSP) :: ksp                                                             <br>  PetscInt it                                                                   <br>  PetscFortranAddr ctx                                                          <br>  PetscErrorCode ierr                                                           <br>  PetscReal rnorm                                                               <br>  KSPConvergedReason :: reason <br></div><div>                                                 <br>  call KSPConvergedDefault(ksp,it,rnorm,reason,ctx,ierr)                        <br>  IF(it.lt.cool_minits) THEN                                                    <br>    ierr = 0                                                                    <br>    reason = 0                                                                  <br>  END IF                                                                        <br>END SUBROUTINE MyConvergenceTest                                                <br></div><div><br></div><div>You might need to translate that to C in your case and don't forget tell to tell petsc about your custom Convergence test.</div><div>Cheers, Manuel<br></div></div></div><br><div class="gmail_quote"><div dir="ltr">Am Fr., 16. Nov. 2018 um 10:23 Uhr schrieb Sal Am via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>>:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Hi,</div><div><br></div><div>I have a few questions:<br></div><div><br></div><div>1. The following issue/misunderstanding:</div><div style="margin-left:40px">My code reads in two files one PETSc vector and one PETSc matrix (b and A from Ax=b, size ~65000x65000).</div><div style="margin-left:40px">and then calls KSP solver to solve it by running the following in the terminal:</div><div style="margin-left:40px"><br></div><div style="margin-left:40px">mpiexec - n 2 ./SolveSys -ksp_type preonly -pc_type lu -pc_factor_mat_solver mumps</div><div style="margin-left:40px"><br></div><div style="margin-left:40px">Now mumps is supposed to work in parallel and complex, but the code is not solved in parallel it seems. It just prints the result twice. Adding -log_view gives me <br></div><div style="margin-left:40px"><br></div><div style="margin-left:40px">"./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice.</div><br></div><div dir="ltr">2. Using iterative solvers, I am having difficulty getting convergence. I found that there is a way to set the maximum number of iterations, but is there a minimum I can increase? <br></div><div dir="ltr"><br></div><div>3. The residual is not computed when using direct external solvers. 
Cheers,
Manuel

On Fri, Nov 16, 2018 at 10:23, Sal Am via petsc-users <petsc-users@mcs.anl.gov> wrote:

Hi,

I have a few questions:

1. The following issue/misunderstanding:
My code reads in two files, one PETSc vector and one PETSc matrix (b and A from Ax=b, size ~65000x65000), and then calls the KSP solver by running the following in the terminal:

mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu -pc_factor_mat_solver mumps

Now MUMPS is supposed to work in parallel and complex, but the code does not seem to be solved in parallel; it just prints the result twice. Adding -log_view gives me

"./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice.

2. Using iterative solvers, I am having difficulty getting convergence. I found that there is a way to set the maximum number of iterations, but is there a minimum I can increase?

3. The residual is not computed when using direct external solvers. What is the proper PETSc way of doing this?

----------------------------------------------------- The code ---------------------------------------

#include <petscksp.h>
#include <iostream>

static char help[] = "Reads A and b in PETSc binary format and solves Ax = b.\n";

int main(int argc,char **args)
{
  Vec            x,b;          /* approx solution, RHS */
  Mat            A;            /* linear system matrix */
  KSP            ksp;          /* linear solver context */
  PetscReal      norm;         /* norm of solution error */
  PC             pc;
  PetscMPIInt    rank, size;
  PetscViewer    viewer;
  PetscInt       its, i;
  PetscErrorCode ierr;
  PetscScalar    *xa;
  PetscBool      flg = PETSC_FALSE;

  ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
  MPI_Comm_size(PETSC_COMM_WORLD,&size);

  #if !defined(PETSC_USE_COMPLEX)
    SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
  #endif
  /*
   - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
         Compute the matrix and right-hand-side vector that define
         the linear system, Ax = b.
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecLoad(b,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);  /* destroy this viewer before reusing the variable below */

  ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatLoad(A,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDuplicate(b,&x);CHKERRQ(ierr);

  /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                Create the linear solver and set various options
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  */
  PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);

  PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);

  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                      Solve the linear system
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);
  ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  PetscPrintf(PETSC_COMM_WORLD, "Solved\n");

  /*
     Free work space.  All PETSc objects should be destroyed when they
     are no longer needed.
  */
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Kind regards,
Sal
--
Manuel Jung, Dr. rer. nat.
Hamburger Sternwarte
Universität Hamburg
Gojenbergsweg 112
21029 Hamburg