<div dir="auto">Thanks!<div dir="auto">I will try that.</div><div dir="auto">Sincerely,</div><div dir="auto">Huq</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Jan 30, 2019, 8:33 PM Smith, Barry F. <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
<br>
> On Jan 30, 2019, at 9:24 AM, Matthew Knepley via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank" rel="noreferrer">petsc-users@mcs.anl.gov</a>> wrote:<br>
> <br>
> On Wed, Jan 30, 2019 at 8:57 AM Fazlul Huq <<a href="mailto:huq2090@gmail.com" target="_blank" rel="noreferrer">huq2090@gmail.com</a>> wrote:<br>
> Thanks Matt.<br>
> <br>
> Is there any way to work around this problem?<br>
> I need to run the program with parallel Cholesky and ILU.<br>
> <br>
> From the link I sent you, you can try MUMPS and PaStiX for Cholesky.<br>
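> For example, to use MUMPS as the parallel Cholesky back end, PETSc would first have to be reconfigured with MUMPS support (the configure line in the error log below does not include it); a sketch of what that might look like, assuming the usual --download-mumps route (MUMPS also needs ScaLAPACK), is:<br>
>
> ```shell
> # Reconfigure PETSc with MUMPS (and its ScaLAPACK dependency) -- sketch only,
> # keep whatever other options your build already uses:
> ./configure --with-debugging=no --download-mpich --download-hypre \
>     --download-mumps --download-scalapack
>
> # Then select MUMPS as the factorization package at run time
> # (in PETSc 3.9+ the option is -pc_factor_mat_solver_type):
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./poisson_m -n $x \
>     -pc_type cholesky -pc_factor_mat_solver_type mumps -log_view
> ```
>
> -ksp_view can be added to confirm which solver package was actually used.<br>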
<br>
And look at the -pc_hypre_type option; one of them is for ILU.<br>
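For instance, hypre provides parallel ILU variants through the same -pc_type hypre interface used for BoomerAMG above; a sketch (euclid and pilut are the hypre types that provide ILU-style factorizations):<br>

```shell
# hypre's Euclid (parallel ILU(k)) as the preconditioner:
$PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./poisson_m -n $x \
    -pc_type hypre -pc_hypre_type euclid -log_view

# Alternatively, hypre's PILUT (parallel ILU with thresholding):
$PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./poisson_m -n $x \
    -pc_type hypre -pc_hypre_type pilut -log_view
```

Since hypre is already in the configure line shown below (--download-hypre), these should work without rebuilding PETSc.<br>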
<br>
<br>
> <br>
> Matt<br>
> <br>
> Thanks.<br>
> Sincerely,<br>
> Huq<br>
> <br>
> On Wed, Jan 30, 2019 at 7:32 AM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank" rel="noreferrer">knepley@gmail.com</a>> wrote:<br>
> On Wed, Jan 30, 2019 at 8:25 AM Fazlul Huq via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank" rel="noreferrer">petsc-users@mcs.anl.gov</a>> wrote:<br>
> Hello PETSc Developers,<br>
> <br>
> I am trying to run my code with the following commands:<br>
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 6 ./poisson_m -n $x -pc_type hypre -pc_hypre_type boomeramg<br>
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./poisson_m -n $x -pc_type cholesky -log_view<br>
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./poisson_m -n $x -pc_type ilu -log_view<br>
> <br>
> We do not have parallel Cholesky or ILU by default. Here is the table of solvers: <a href="https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html" rel="noreferrer noreferrer" target="_blank">https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html</a><br>
> <br>
> Matt<br>
> <br>
> It works fine with the first case (-pc_type hypre -pc_hypre_type boomeramg), but for the 2nd and 3rd cases I got the following error message: <br>
> <br>
> Solving the problem...[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html" rel="noreferrer noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html</a> for possible LU and Cholesky solvers<br>
> [0]PETSC ERROR: Could not locate a solver package. Perhaps you must ./configure with --download-<package><br>
> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018 <br>
> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Wed Jan 30 07:17:40 2019<br>
> [0]PETSC ERROR: Configure options --with-debugging=no --with-64-bit-indices --download-hypre --download-mpich<br>
> [0]PETSC ERROR: #1 MatGetFactor() line 4415 in /home/huq2090/petsc-3.10.2/src/mat/interface/matrix.c<br>
> [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/factor/ilu/ilu.c<br>
> [0]PETSC ERROR: #3 PCSetUp() line 932 in /home/huq2090/petsc-3.10.2/src/ksp/pc/interface/precon.c<br>
> [0]PETSC ERROR: #4 KSPSetUp() line 391 in /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c<br>
> [0]PETSC ERROR: #5 KSPSolve() line 723 in /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c<br>
> [0]PETSC ERROR: #6 main() line 199 in /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c<br>
> [0]PETSC ERROR: PETSc Option Table entries:<br>
> [0]PETSC ERROR: -log_view<br>
> [0]PETSC ERROR: -n 9999999<br>
> [0]PETSC ERROR: -pc_type ilu<br>
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------<br>
> application called MPI_Abort(MPI_COMM_WORLD, 92) - process 0<br>
> <br>
> <br>
> Any suggestion?<br>
> Thanks.<br>
> Sincerely,<br>
> Huq<br>
> <br>
> -- <br>
> <br>
> Fazlul Huq<br>
> Graduate Research Assistant<br>
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)<br>
> University of Illinois at Urbana-Champaign (UIUC)<br>
> E-mail: <a href="mailto:huq2090@gmail.com" target="_blank" rel="noreferrer">huq2090@gmail.com</a><br>
> <br>
> <br>
> -- <br>
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
> -- Norbert Wiener<br>
> <br>
> <a href="https://www.cse.buffalo.edu/~knepley/" rel="noreferrer noreferrer" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br>
> <br>
> <br>
> -- <br>
> <br>
> Fazlul Huq<br>
> Graduate Research Assistant<br>
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)<br>
> University of Illinois at Urbana-Champaign (UIUC)<br>
> E-mail: <a href="mailto:huq2090@gmail.com" target="_blank" rel="noreferrer">huq2090@gmail.com</a><br>
> <br>
> <br>
> -- <br>
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
> -- Norbert Wiener<br>
> <br>
> <a href="https://www.cse.buffalo.edu/~knepley/" rel="noreferrer noreferrer" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br>
<br>
</blockquote></div>