<div dir="ltr">The current version of the MUMPS code in PETSc supports sparse right hand sides for sequential solvers. You can call MatMatSolve(A,X,B) with B of type MATTRANSPOSE, with the inner matrix being a MATSEQAIJ</div><div class="gmail_extra"><br><div class="gmail_quote">2018-05-31 10:37 GMT+03:00 Marius Buerkle <span dir="ltr"><<a href="mailto:mbuerkle@web.de" target="_blank">mbuerkle@web.de</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">The fix for MAT_NEW_NONZERO_LOCATIONS, thanks again.<br>
2018-05-31 10:37 GMT+03:00 Marius Buerkle <mbuerkle@web.de>:

The fix for MAT_NEW_NONZERO_LOCATIONS works, thanks again.
I have yet another question, sorry. The recent version of MUMPS supports distributed and sparse RHS; is there any chance that this will be supported in PETSc in the near future?
<div class="HOEnZb"><div class="h5"><br>
<br>
<br>
<br>
> On May 30, 2018, at 6:55 PM, Marius Buerkle <mbuerkle@web.de> wrote:
>
> Thanks for the quick fix, I will test it and report back.
> I have another, maybe related, question: if MAT_NEW_NONZERO_LOCATIONS is true and, say, 1 new nonzero position is created, it does not allocate just 1 new nonzero but several, and only uses 1.

Correct.

> I think that is normal, right?

Yes.

> But, at least as far as I understand the manual, a subsequent call of MatAssemblyBegin/End with
> MAT_FINAL_ASSEMBLY should compress out the unused allocations and release the memory, is this correct?

It "compresses it out" (by shifting all the nonzero entries to the beginning of the internal i, j, and a arrays), but it does NOT release any memory. Since the values are stored in one big contiguous array (obtained with a single malloc), it cannot free just part of the array, so the extra locations simply sit harmlessly, unused, at the end of the array.
> If so, this did not work for me, even after doing
> MAT_FINAL_ASSEMBLY the unused nonzero allocations remain. Is this normal?

Yes.

Barry
>
>>
>> Fixed in the branch barry/fix-mat-new-nonzero-locations/maint
>>
>> Once this passes testing it will go into the maint branch and then the next patch release, but you can use it now in the branch barry/fix-mat-new-nonzero-locations/maint
>>
>> Thanks for the report and the reproducible example.
>>
>> Barry
>>
>>
>>> On May 29, 2018, at 7:51 PM, Marius Buerkle <mbuerkle@web.de> wrote:
>>>
>>> Sure, I made a small reproducer; it is Fortran, though, I hope that is ok. If MAT_NEW_NONZERO_LOCATIONS is set to false I get an error; if it is set to true the new nonzero element is inserted. If MAT_NEW_NONZERO_LOCATIONS is false and either MAT_NEW_NONZERO_LOCATION_ERR or MAT_NEW_NONZERO_ALLOCATION_ERR is set to false afterwards, then the new nonzero is also created without an error. But if MAT_NEW_NONZERO_LOCATIONS is set to false after MAT_NEW_NONZERO_LOCATION_ERR/MAT_NEW_NONZERO_ALLOCATION_ERR have been set to false, I get an error again.
>>>
>>>
>>> program newnonzero
>>> #include <petsc/finclude/petscmat.h>
>>>   use petscmat
>>>   implicit none
>>>
>>>   Mat :: A
>>>   PetscInt :: dnnz,onnz,n,m,idxm(1),idxn(1),nl1,nl2
>>>   PetscScalar :: v(1)
>>>   PetscReal :: info(MAT_INFO_SIZE)
>>>   PetscErrorCode :: ierr
>>>
>>>   integer :: nproc,iproc,i
>>>
>>>   call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>>
>>>   call MPI_COMM_SIZE(PETSC_COMM_WORLD,nproc,ierr)
>>>
>>>   call MPI_Comm_rank(PETSC_COMM_WORLD,iproc,ierr)
>>>
>>>   ! 3x3 matrix with a preallocation of 1 diagonal nonzero per row
>>>   n=3
>>>   m=n
>>>   call MatCreateAIJ(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,n,m,1,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,A,ierr)
>>>
>>>   call MatGetOwnershipRange(A,nl1,nl2,ierr)
>>>   do i=nl1,nl2-1
>>>     idxn(1)=i
>>>     idxm(1)=i
>>>     v(1)=1d0
>>>     call MatSetValues(A,1,idxn,1,idxm,v,INSERT_VALUES,ierr)
>>>   end do
>>>   call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>>   call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>>
>>>   call MatSetOption(A,MAT_NEW_NONZERO_LOCATIONS,PETSC_FALSE,ierr)
>>>   !~ call MatSetOption(A,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_FALSE,ierr)
>>>   !~ call MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE,ierr)
>>>   !~ call MatSetOption(A,MAT_NEW_NONZERO_LOCATIONS,PETSC_FALSE,ierr)
>>>
>>>   ! try to insert a new nonzero at global (0, n-1), which was not preallocated
>>>   idxn(1)=0
>>>   idxm(1)=n-1
>>>   if ((idxn(1).ge.nl1).and.(idxn(1).le.nl2-1)) then
>>>     v(1)=2d0
>>>     call MatSetValues(A,1,idxn,1,idxm,v,INSERT_VALUES,ierr)
>>>   end if
>>>   call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>>   call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>>
>>>   if ((idxn(1).ge.nl1).and.(idxn(1).le.nl2-1)) then
>>>     v(1)=2d0
>>>     call MatGetValues(A,1,idxn,1,idxm,v,ierr)
>>>     write(6,*) v
>>>   end if
>>>
>>>   call PetscFinalize(ierr)
>>>
>>> end program newnonzero
>>>
>>>
>>>
>>> $ mpiexec.hydra -n 3 ./a.out
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Argument out of range
>>> [0]PETSC ERROR: Inserting a new nonzero at global row/column (0, 2) into matrix
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.9.2, May, 20, 2018
>>> [0]PETSC ERROR: ./a.out on a named tono-hpc1 by marius Wed May 30 09:42:40 2018
>>> [0]PETSC ERROR: Configure options --prefix=/home/marius/prog/petsc/3.9.2 --download-elemental=yes --download-metis=yes --download-parmetis=yes --download-mumps=yes --with-scalapack-lib="/home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_intel_lp64.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_sequential.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl" --FC=mpiifort --CC=mpicc --CXX=mpicxx --with-scalar-type=complex --with-mpi-dir= --with-blaslapack-lib="/home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_intel_lp64.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_sequential.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl" --with-cxx-dialect=C++11 --download-superlu_dist=yes --download-ptscotch=yes --with-x --with-debugging=1 --download-superlu=yes --with-mkl_cpardiso=1 --with-mkl_pardiso=1 --with-scalapack=1
>>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 607 in /home/marius/prog/petsc/petsc-3.9.2/src/mat/impls/aij/mpi/mpiaij.c
>>> [0]PETSC ERROR: #2 MatSetValues() line 1312 in /home/marius/prog/petsc/petsc-3.9.2/src/mat/interface/matrix.c
>>> (0.000000000000000E+000,0.000000000000000E+000)
>>>
>>>
>>>
>>> Please send the complete error message, the type of matrix used, etc. Ideally code that demonstrates the problem.
>>>
>>> Barry
>>>
>>>
>>>> On May 29, 2018, at 3:31 AM, Marius Buerkle <mbuerkle@web.de> wrote:
>>>>
>>>>
>>>> Hi,
>>>>
>>>> I tried to set MAT_NEW_NONZERO_LOCATIONS to false. As far as I understood, MatSetValues should simply ignore entries which would give rise to new nonzero values, not creating a new entry and not causing an error, but I get "[1]PETSC ERROR: Inserting a new nonzero at global row/column". Is this option supposed to work or not?
>>>
>>
>>

--
Stefano