<div dir="ltr">Apologies for the post. I didn't see that it was for complex vectors only. <br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Apr 22, 2019 at 5:00 PM Sajid Ali <<a href="mailto:sajidsyed2021@u.northwestern.edu">sajidsyed2021@u.northwestern.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Hi, <br><br></div>I see that src/mat/examples/tests/ex112.c is failing for petsc@3.11.1 configured without complex scalars. With complex scalars everything works fine. <br><br></div><div>The error I see is : <br>```<br>[sajid@xrmlite bugfix]$
> No protocol specified
>
>  1-D: FFTW on vector of size 10
>  Error norm of |x - z| 5.37156
>  Error norm of |x - z| 5.90871
>  Error norm of |x - z| 5.96243
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Nonconforming object sizes
> [0]PETSC ERROR: Mat mat,Vec x: global dim 20 10
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.11.1, Apr, 12, 2019
> [0]PETSC ERROR: ./ex112 on a named xrmlite.phys.northwestern.edu by sajid Mon Apr 22 16:58:41 2019
> [0]PETSC ERROR: Configure options
>   --prefix=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/petsc-3.11.1-5bdbcozu3labtbbi7gtq4xaknay24lo6
>   --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0
>   CFLAGS="-O2 -march=native" FFLAGS="-O2 -march=native" CXXFLAGS="-O2 -march=native"
>   --with-cc=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/mpich-3.3-ig4cr2xw2x63bqs5rnmhfshln4iv7av5/bin/mpicc
>   --with-cxx=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/mpich-3.3-ig4cr2xw2x63bqs5rnmhfshln4iv7av5/bin/mpic++
>   --with-fc=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/mpich-3.3-ig4cr2xw2x63bqs5rnmhfshln4iv7av5/bin/mpif90
>   --with-precision=double --with-scalar-type=real --with-shared-libraries=1 --with-debugging=1 --with-64-bit-indices=0
>   --with-blaslapack-lib="/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/intel-mkl-2019.3.199-kzcly5rtcjbkwtnm6tri6kkexnwoat5m/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_intel_lp64.so /home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/intel-mkl-2019.3.199-kzcly5rtcjbkwtnm6tri6kkexnwoat5m/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_sequential.so /home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/intel-mkl-2019.3.199-kzcly5rtcjbkwtnm6tri6kkexnwoat5m/compilers_and_libraries_2019.3.199/linux/mkl/lib/intel64/libmkl_core.so /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so"
>   --with-x=0 --with-clanguage=C --with-scalapack=0 --with-metis=0 --with-hdf5=0 --with-hypre=0 --with-parmetis=0 --with-mumps=0 --with-trilinos=0
>   --with-fftw=1 --with-fftw-dir=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/fftw-3.3.8-mkj4ho2jp6xfrnklmrvhdfh73woer2s7
>   --with-superlu_dist=0 --with-suitesparse=0
>   --with-zlib-include=/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/zlib-1.2.11-jqrcjdjnrxvouufhjtxbfvfms23fsqpx/include
>   --with-zlib-lib="-L/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/gcc-8.3.0/zlib-1.2.11-jqrcjdjnrxvouufhjtxbfvfms23fsqpx/lib -lz"
>   --with-zlib=1
> [0]PETSC ERROR: #1 MatMult() line 2385 in /tmp/sajid/spack-stage/spack-stage-mTD0AV/petsc-3.11.1/src/mat/interface/matrix.c
> [0]PETSC ERROR: #2 main() line 94 in /home/sajid/packages/aux_xwp_petsc/bugfix/ex112.c
> [0]PETSC ERROR: No PETSc Option Table entries
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=60
> :
> system msg for write_line
> failure : Bad file descriptor
> ```
>
> I came across this because MatMult was failing in a new test related to
> a PR I was working on. Is this a bug?
>
> Thank you,
> Sajid Ali
> Applied Physics
> Northwestern University
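For the archives: my understanding is that with --with-scalar-type=real the FFTW Mat wraps a real-to-complex transform, so its global sizes need not equal the logical transform size n, and a hand-built length-n Vec then trips the "Nonconforming object sizes" check in MatMult(). Below is a minimal sketch of what I believe the intended usage looks like; MatCreateFFT(), MATFFTW, and MatCreateVecsFFTW() are the real PETSc API, but the flow is my own illustration, not the actual ex112 code.

```c
/* Minimal sketch (my illustration, not ex112 itself): create a 1-D FFTW
 * Mat of logical size n and obtain work vectors from MatCreateVecsFFTW()
 * instead of VecCreateSeq(PETSC_COMM_SELF, n, ...). */
#include <petscmat.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  Mat            A;
  Vec            x, y, z;       /* input, forward output, backward output */
  PetscInt       dim[1] = {10}; /* logical 1-D transform size, as in the test */

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MatCreateFFT(PETSC_COMM_WORLD, 1, dim, MATFFTW, &A); CHKERRQ(ierr);
  /* Vector sizes and layouts are dictated by the FFTW plan; for real
   * scalars these can differ from dim[0], which is presumably the source
   * of the "global dim 20 10" mismatch above. */
  ierr = MatCreateVecsFFTW(A, &x, &y, &z); CHKERRQ(ierr);
  ierr = VecSet(x, 1.0); CHKERRQ(ierr);
  ierr = MatMult(A, x, y); CHKERRQ(ierr);          /* forward transform  */
  ierr = MatMultTranspose(A, y, z); CHKERRQ(ierr); /* backward transform */
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&y); CHKERRQ(ierr);
  ierr = VecDestroy(&z); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```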
--
Sajid Ali
Applied Physics
Northwestern University