<div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Wed, Nov 28, 2018 at 10:26 AM Sal Am <<a href="mailto:tempohoper@gmail.com">tempohoper@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div>Thank you indeed --download-mpich and using PETSC_ARCH/bin/mpiexec seems to work.</div><div><br></div><div>Now I am wondering about the other problem namely getting the residual, is the residual only computed when using iterative solvers? Cause using richardson or gmres with 1 iteration while using MUMPS prints me nothing.</div></div></blockquote><div><br></div><div>We had an optimization which did not print the residual with richardson, since it is used often inside of other solvers where</div><div>you do not want this. Are you sure that the residual does not print with GMRES?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="gmail_quote"><div dir="ltr">On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Tue, Nov 27, 2018 at 4:29 AM Sal Am <<a href="mailto:tempohoper@gmail.com" target="_blank">tempohoper@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr">
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>This can happen if you use an 'mpiexec' which is from a different MPI than the one you compiled PETSc with.</div></blockquote><div><br></div><div>that is odd, I tried removing the --download-mpich from the config and tried --with-mpi=1 (which should be default anyways) and retried it with --with-mpich=1.</div></div></div></div></div></div></div></div></blockquote><div><br></div><div>Your MPI is still broken. Some default MPIs, like those installed with Apple, are broken. If you are on Apple</div><div>I would recommend using --download-mpich, but make sure you use $PETSC_DIR/$PETSC_ARCH/bin/mpiexec.</div><div><br></div><div>If you reconfigure, you either have to specify a different --PETSC_ARCH, or delete the $PETSC_DIR/$PETSC_ARCH completely. Otherwise, you can get bad interaction with the libraries already sitting there.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Current reconfig file: <br></div><div> '--download-mpich',<br> '--download-mumps',<br> '--download-scalapack',<br> '--download-superlu_dist',<br> '--with-cc=gcc',<br> '--with-clanguage=cxx',<br> '--with-cxx=g++',<br> '--with-debugging=no',<br> '--with-fc=gfortran',<br> '--with-mpi=1',<br> '--with-mpich=1',<br> '--with-scalar-type=complex',<br> 'PETSC_ARCH=linux-opt'<br></div><div>Still does not work though, I tried executing ex11 in ksp/ksp/example/tutorial/ex11 which should solve a linear system in parallel by running mpiexec -n 2 but it prints out <br></div><div>Mat Object: 1 MPI processes <br></div><div>...</div><div>..</div><div>twice.</div><div>What am I missing? <br></div><div><br></div><div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>Why would you want a minimum iterations? Why not set a tolerance and a max? What would you achieve with a minimum?</div></blockquote><div><br></div><div>I thought PETSc might not be iterating far enough, but after having had a look at KSPSetTolerances it makes more sense.</div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>
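<div>For example, a clean rebuild could look like this (a sketch, assuming PETSC_ARCH=linux-opt as in your configure options below; the '...' stands for your remaining options):</div><div style="margin-left:40px">rm -rf $PETSC_DIR/linux-opt<br>cd $PETSC_DIR<br>./configure PETSC_ARCH=linux-opt --download-mpich --with-scalar-type=complex ...<br>make PETSC_DIR=$PETSC_DIR PETSC_ARCH=linux-opt all<br>$PETSC_DIR/linux-opt/bin/mpiexec -n 2 ./ex11<br></div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Current reconfig file: <br></div><div> '--download-mpich',<br> '--download-mumps',<br> '--download-scalapack',<br> '--download-superlu_dist',<br> '--with-cc=gcc',<br> '--with-clanguage=cxx',<br> '--with-cxx=g++',<br> '--with-debugging=no',<br> '--with-fc=gfortran',<br> '--with-mpi=1',<br> '--with-mpich=1',<br> '--with-scalar-type=complex',<br> 'PETSC_ARCH=linux-opt'<br></div><div>It still does not work, though. I tried executing ex11 (ksp/ksp/examples/tutorials/ex11.c), which should solve a linear system in parallel, with mpiexec -n 2, but it prints out <br></div><div>Mat Object: 1 MPI processes <br></div><div>...</div><div>..</div><div>twice.</div><div>What am I missing? <br></div><div><br></div><div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>Why would you want a minimum number of iterations? Why not set a tolerance and a max? What would you achieve with a minimum?</div></blockquote><div><br></div><div>I thought PETSc might not be iterating far enough, but after having a look at KSPSetTolerances it makes more sense (e.g. KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,500), or -ksp_rtol 1e-8 -ksp_max_it 500 on the command line).</div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>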
Instead of 'preonly', where you do not want a residual, use 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
</div></blockquote><div><br></div><div>So I tried that by executing</div><div style="margin-left:40px">mpiexec -n 4 ./test -ksp_type richardson -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1<br></div><div>on the command line. However, nothing prints out on the terminal (aside from my own PetscPrintf output, when I have it enabled). <br></div><div><br></div><div>Thank you.<br></div>
</div>
</div></div></div></div></div></div></div><br><div class="gmail_quote"><div dir="ltr">On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Hi,</div><div><br></div><div>I have a few questions:<br></div><div><br></div><div>1. The following issue/misunderstanding:</div><div style="margin-left:40px">My code reads in two files, one PETSc vector and one PETSc matrix (b and A from Ax=b, size ~65000x65000),</div><div style="margin-left:40px">and then calls the KSP solver to solve it by running the following in the terminal:</div><div style="margin-left:40px"><br></div><div style="margin-left:40px">mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps</div><div style="margin-left:40px"><br></div><div style="margin-left:40px">Now MUMPS is supposed to work in parallel and with complex numbers, but the code does not seem to be solved in parallel; it just prints the result twice. Adding -log_view gives me <br></div><div style="margin-left:40px"><br></div><div style="margin-left:40px">"./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice.</div></div></div></div></blockquote><div><br></div><div>This can happen if you use an 'mpiexec' which is from a different MPI than the one you compiled PETSc with.</div><div> </div>
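<div>A quick way to check for such a mismatch (a sketch; the second path exists when MPICH was installed via --download-mpich):</div><div style="margin-left:40px">which mpiexec<br>ls $PETSC_DIR/$PETSC_ARCH/bin/mpiexec<br></div><div>If the two differ, launch your program with the full path $PETSC_DIR/$PETSC_ARCH/bin/mpiexec instead.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr">2. Using iterative solvers, I am having difficulty getting convergence. I found that there is a way to set the maximum number of iterations, but is there a minimum I can increase? </div></div></div></blockquote><div><br></div><div>Why would you want a minimum number of iterations? Why not set a tolerance and a max? What would you achieve with a minimum?</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div>3. The residual is not computed when using direct external solvers.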
What is the proper PETSc way of doing this?</div></div></div></blockquote><div><br></div><div>Instead of 'preonly', where you do not want a residual, use 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1)</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div>
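<div>(If you want the residual of the direct solve itself, a sketch, using the A, b, and x from your code below, is to form the true residual r = b - Ax yourself after KSPSolve() returns:</div><div style="margin-left:40px">Vec r;<br>PetscReal rnorm;<br>ierr = VecDuplicate(b,&r);CHKERRQ(ierr);<br>ierr = MatMult(A,x,r);CHKERRQ(ierr); /* r = A*x */<br>ierr = VecAYPX(r,-1.0,b);CHKERRQ(ierr); /* r = b - A*x */<br>ierr = VecNorm(r,NORM_2,&rnorm);CHKERRQ(ierr);<br>ierr = PetscPrintf(PETSC_COMM_WORLD,"True residual norm = %g\n",(double)rnorm);CHKERRQ(ierr);<br>ierr = VecDestroy(&r);CHKERRQ(ierr);<br></div><div>This works with any solver, including MUMPS through PCLU.)</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div>-----------------------------------------------------The code--------------------------------------- <br></div><div>static char help[] = "Reads a matrix and right-hand side in PETSc binary format and solves Ax = b.\n"; /* PetscInitialize() below requires a help string */<br>#include <petscksp.h><br>int main(int argc,char **args)<br>{<br> Vec x,b; /* approx solution, RHS */<br> Mat A; /* linear system matrix */<br> KSP ksp; /* linear solver context */<br> PC pc; /* preconditioner context */<br> PetscMPIInt rank, size;<br> PetscViewer viewer;<br> PetscErrorCode ierr;<br><br> ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;<br> MPI_Comm_rank(PETSC_COMM_WORLD,&rank);<br> MPI_Comm_size(PETSC_COMM_WORLD,&size);<br> <br> #if !defined(PETSC_USE_COMPLEX)<br> SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");<br> #endif<br> /* <br> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -<br> Compute the matrix and right-hand-side vector that define<br> the linear system, Ax = b.<br> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - <br> */<br> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);<br> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);<br> ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);<br> ierr = VecLoad(b,viewer);CHKERRQ(ierr);<br> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); /* destroy the viewer before reusing it for the matrix */<br><br> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);<br> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);<br> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);<br> ierr = MatLoad(A,viewer);CHKERRQ(ierr);<br> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);<br><br> ierr = VecDuplicate(b,&x);CHKERRQ(ierr);<br><br> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -<br> Create the linear solver and set various options<br> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - <br> */<br> PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");<br> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);<br><br> PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");<br> ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);<br></div><div><br> ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);<br> ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);<br> ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);<br><br> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -<br> Solve the linear system<br> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */<br> ierr = KSPSetUp(ksp);CHKERRQ(ierr);<br> ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);<br> ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);<br> PetscPrintf(PETSC_COMM_WORLD, "Solved\n");<br><br> /*<br> Free work space.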
All PETSc objects should be destroyed when they<br> are no longer needed.<br> */<br> ierr = KSPDestroy(&ksp);CHKERRQ(ierr);<br> ierr = VecDestroy(&x);CHKERRQ(ierr);<br> ierr = VecDestroy(&b);CHKERRQ(ierr); <br> ierr = MatDestroy(&A);CHKERRQ(ierr);<br> ierr = PetscFinalize();<br> return ierr;<br>}</div><div><br></div><div><br></div><div>Kind regards,</div><div>Sal<br></div></div></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="m_-2919753369969553758m_7480728179759957142m_-1573101523455784037m_-6902888949031528634m_-4563345472827773355gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="m_-2919753369969553758m_7480728179759957142gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>