<div dir="ltr"><div dir="ltr">On Mon, Feb 18, 2019 at 8:43 PM Randall Mackie via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">We are investigating the use of porting our PETSc code to work on GPUS.<br>
> We have made some good progress, but we encounter an error when specifying ASM as the preconditioner.
>
> Specifically, we set:
>
> -ksp_type bcgs
> -pc_type asm
> -sub_pc_type ilu
>
> That bombs out with the message below.

It certainly looks like BJACOBI propagates the matrix type correctly to its submatrices, and ASM does not. Patrick fixed this for BJacobi, I believe. Patrick, do you remember which changeset fixed it?

  Thanks,

     Matt
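In the meantime, you can check what the subdomain solvers actually received from your own code. Here is a quick sketch I put together for this thread (the helper name and the call-it-after-KSPSetUp flow are my own, nothing from the PETSc sources); it prints the matrix type each ASM subdomain solver got. If those come back seqaij while the outer operator is a CUDA type, that matches the VecScatter failure in the trace below.

/* Sketch only: call after KSPSetUp() on a KSP configured with -pc_type asm. */
#include <petscksp.h>

static PetscErrorCode CheckASMSubMatTypes(KSP ksp)
{
  PC             pc;
  KSP           *subksp;
  Mat            subpmat;
  MatType        mtype;
  PetscInt       i, n_local, first_local;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCASMGetSubKSP(pc, &n_local, &first_local, &subksp);CHKERRQ(ierr);
  for (i = 0; i < n_local; i++) {
    /* the second operator is the preconditioning matrix handed to the sub-PC */
    ierr = KSPGetOperators(subksp[i], NULL, &subpmat);CHKERRQ(ierr);
    ierr = MatGetType(subpmat, &mtype);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_SELF, "subdomain %D: mat type %s\n", first_local + i, mtype);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}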
>
> Residual norms for em_ solve.
> 0 KSP preconditioned resid norm 9.464272188534e+06 true resid norm 2.587899098807e+02 ||r(i)||/||b|| 1.000000000000e+00
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Object (seq) is not seqcuda or mpicuda
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
> [0]PETSC ERROR: /home/everderio/DEV/bin/mtd3fwd_real_debug on a linux-intel-intelmpi-real-debug-cuda_2018 named GPU by root Mon Feb 18 16:46:14 2019
> [0]PETSC ERROR: Configure options --with-clean=1 --with-debugging=1 --with-fortran=1 --with-blaslapack-dir=/opt/intel/mkl --with-cuda=1 --with-cudac=/usr/local/cuda-10.0/bin/nvcc
> [0]PETSC ERROR: #1 VecCUDAGetArrayReadWrite() line 1210 in /home/everderio/DEV/petsc-3.10.3/src/vec/vec/impls/seq/seqcuda/veccuda2.cu
> [0]PETSC ERROR: #2 VecScatterCUDA_StoS() line 272 in /home/everderio/DEV/petsc-3.10.3/src/vec/vec/impls/seq/seqcuda/vecscattercuda.cu
> [0]PETSC ERROR: #3 VecScatterBegin_SGToSS_Stride1() line 384 in /home/everderio/DEV/petsc-3.10.3/src/vec/vscat/impls/vscat.c
> [0]PETSC ERROR: #4 VecScatterBegin() line 110 in /home/everderio/DEV/petsc-3.10.3/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: #5 PCApply_ASM() line 486 in /home/everderio/DEV/petsc-3.10.3/src/ksp/pc/impls/asm/asm.c
> [0]PETSC ERROR: #6 PCApply() line 462 in /home/everderio/DEV/petsc-3.10.3/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #7 PCApplyBAorAB() line 691 in /home/everderio/DEV/petsc-3.10.3/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #8 KSP_PCApplyBAorAB() line 309 in /home/everderio/DEV/petsc-3.10.3/include/petsc/private/kspimpl.h
> [0]PETSC ERROR: #9 KSPSolve_BCGS() line 87 in /home/everderio/DEV/petsc-3.10.3/src/ksp/ksp/impls/bcgs/bcgs.c
> [0]PETSC ERROR: #10 KSPSolve() line 780 in /home/everderio/DEV/petsc-3.10.3/src/ksp/ksp/interface/itfunc.c
>
> However, using the following options works:
>
> -ksp_type bcgs
> -pc_type bjacobi
> -sub_pc_type ilu
>
> Thanks, Randy M.
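P.S. For anyone who wants to reproduce this outside the full application, a minimal driver along these lines should hit the same code path. It is a sketch of my own, not Randy's code: the 1-D Laplacian, the problem size, and the aijcusparse/cuda types passed on the command line are all assumptions.

/* Build against PETSc and run the failing case with:
 *   ./repro -ksp_type bcgs -pc_type asm -sub_pc_type ilu \
 *           -mat_type aijcusparse -vec_type cuda
 * Swapping -pc_type asm for -pc_type bjacobi should then succeed,
 * matching what Randy reports. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       i, n = 100, Istart, Iend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* assemble a 1-D Laplacian; -mat_type picks the implementation */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    if (i < n - 1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* the vectors inherit their type from the matrix (cuda for aijcusparse) */
  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* -ksp_type, -pc_type, and -sub_pc_type are picked up here */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}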
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>