<div dir="ltr">Nevermind, it was my fault in thinking the error was with u_abs and not u. I switched from local array based value setting for initial conditions to VecSetValues when converting the uniprocessor example to an MPI program. While I removed VecRestoreArray and swapped u_local[*ptr] assignments with VecSetValues, I missed out on adding VecAssembleBegin/End to compensate. Thanks for pointing out that the error was with u and not u_abs. <br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Jan 17, 2019 at 1:29 PM Sajid Ali <<a href="mailto:sajidsyed2021@u.northwestern.edu">sajidsyed2021@u.northwestern.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr">As requested :<br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail-m_-3267192347639494088gmail_attr">[sajid@xrm free_space]$ ./ex_modify <br>Solving a linear TS problem on 1 processor<br>m : 256, slices : 1000.000000, lambda : 1.239800e-10<br>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: Object is in wrong state<br>[0]PETSC ERROR: Not for unassembled vector<br>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Development GIT revision: 0cd88d33dca7e1f18a10cbb6fcb08f83d068c5f4 GIT Date: 2019-01-06 13:27:26 -0600<br>[0]PETSC ERROR: ./ex_modify on a named xrm by sajid Thu Jan 17 13:29:12 2019<br>[0]PETSC ERROR: Configure options --prefix=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/petsc-develop-2u6vuwagkoczyvnpsubzrubmtmpfhhkj --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 CFLAGS= FFLAGS= CXXFLAGS= --with-cc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpicc --with-cxx=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpic++ --with-fc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpif90 --with-precision=double --with-scalar-type=complex --with-shared-libraries=1 --with-debugging=1 --with-64-bit-indices=0 --with-debugging=%s --with-blaslapack-lib="/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_intel_lp64.so /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_sequential.so /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_core.so /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so" --with-x=1 --with-clanguage=C --with-scalapack=0 --with-metis=1 --with-metis-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/metis-5.1.0-nhgzn4kjskctzmzv35mstvd34nj2ugek --with-hdf5=1 --with-hdf5-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/hdf5-1.10.4-ltstvsxvyjue2gxfegi4nvr6c5xg3zww --with-hypre=0 --with-parmetis=1 
On Thu, Jan 17, 2019 at 1:29 PM Sajid Ali <sajidsyed2021@u.northwestern.edu> wrote:

As requested:

[sajid@xrm free_space]$ ./ex_modify
Solving a linear TS problem on 1 processor
m : 256, slices : 1000.000000, lambda : 1.239800e-10
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Not for unassembled vector
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: 0cd88d33dca7e1f18a10cbb6fcb08f83d068c5f4 GIT Date: 2019-01-06 13:27:26 -0600
[0]PETSC ERROR: ./ex_modify on a named xrm by sajid Thu Jan 17 13:29:12 2019
[0]PETSC ERROR: Configure options --prefix=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/petsc-develop-2u6vuwagkoczyvnpsubzrubmtmpfhhkj --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 CFLAGS= FFLAGS= CXXFLAGS= --with-cc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpicc --with-cxx=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpic++ --with-fc=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/mpich-3.3-z5uiwmx24jylnivuhlnqjjmm674ozj6x/bin/mpif90 --with-precision=double --with-scalar-type=complex --with-shared-libraries=1 --with-debugging=1 --with-64-bit-indices=0 --with-debugging=%s --with-blaslapack-lib="/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_intel_lp64.so /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_sequential.so /raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/intel-mkl-2019.0.117-wzqlcijwx7odz2x5chembudo5leqpfh2/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64/libmkl_core.so /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so" --with-x=1 --with-clanguage=C --with-scalapack=0 --with-metis=1 --with-metis-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/metis-5.1.0-nhgzn4kjskctzmzv35mstvd34nj2ugek --with-hdf5=1 --with-hdf5-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/hdf5-1.10.4-ltstvsxvyjue2gxfegi4nvr6c5xg3zww --with-hypre=0 --with-parmetis=1 --with-parmetis-dir=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/parmetis-4.0.3-hw3j2ss7mjsc5x5f2gaflirnuufzptil --with-mumps=0 --with-trilinos=0 --with-cxx-dialect=C++11 --with-superlu_dist-include=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/superlu-dist-develop-cpspq4ca2hnyvhx4mz7zsupbj3do6md3/include --with-superlu_dist-lib=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/superlu-dist-develop-cpspq4ca2hnyvhx4mz7zsupbj3do6md3/lib/libsuperlu_dist.a --with-superlu_dist=1 --with-suitesparse=0 --with-zlib-include=/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/zlib-1.2.11-ldu43taplg2nbkxtem346zq4ibhad64i/include --with-zlib-lib="-L/raid/home/sajid/packages/spack/opt/spack/linux-rhel7-x86_64/gcc-7.3.0/zlib-1.2.11-ldu43taplg2nbkxtem346zq4ibhad64i/lib -lz" --with-zlib=1
[0]PETSC ERROR: #1 VecCopy() line 1571 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #2 Monitor() line 296 in /raid/home/sajid/packages/xwp_petsc/1d/free_space/ex_modify.c
[0]PETSC ERROR: #3 TSMonitor() line 3929 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #4 TSSolve() line 3843 in /tmp/sajid/spack-stage/spack-stage-nwxY3Q/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #5 main() line 188 in /raid/home/sajid/packages/xwp_petsc/1d/free_space/ex_modify.c
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=73
:
system msg for write_line failure : Bad file descriptor
--
Sajid Ali
Applied Physics
Northwestern University