<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class="">Rochan,<div class=""><br class=""></div><div class="">This was fixed a few months ago: <a href="https://gitlab.com/petsc/petsc/-/commit/c8f76b2f0ecb94de1c6dd38e490dd0a500501954" class="">https://gitlab.com/petsc/petsc/-/commit/c8f76b2f0ecb94de1c6dd38e490dd0a500501954</a></div><div class=""><br class=""></div><div class="">Here is the relevant code you mentioned, in the new version: <a href="https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c#L274" class="">https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c#L274</a></div><div class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Mar 9, 2020, at 10:51 PM, Junchao Zhang via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" class="">petsc-users@mcs.anl.gov</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><meta http-equiv="Content-Type" content="text/html; charset=utf-8" class=""><div dir="ltr" class="">Let me try it. 
BTW, did you find the same code at <a href="https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c" class="">https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c</a>?<br clear="all" class=""><div class=""><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr" class="">--Junchao Zhang</div></div></div><br class=""></div><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Mar 9, 2020 at 2:46 PM Rochan Upadhyay <<a href="mailto:u.rochan@gmail.com" class="">u.rochan@gmail.com</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr" class=""><div class="">Hi Junchao,<br class=""></div><div class="">I doubt that it was fixed, as a diff of src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c between the master branch and version 3.12.4 shows no changes.</div><div class="">I am unable to compile the master version (configure.log attached), but I think you can recreate the problem by running the ex12.c program that</div><div class="">I attached to my previous mail.</div><div class="">Regards,</div><div class="">Rochan<br class=""> </div></div><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Mar 9, 2020 at 2:22 PM Junchao Zhang <<a href="mailto:jczhang@mcs.anl.gov" target="_blank" class="">jczhang@mcs.anl.gov</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr" class="">Could you try the master branch, since it seems Stefano fixed this problem recently?<br clear="all" class=""><div class=""><div dir="ltr" class=""><div dir="ltr" class="">--Junchao Zhang</div></div></div><br class=""></div><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Mar 9, 2020 at 2:04 PM Rochan 
Upadhyay <<a href="mailto:u.rochan@gmail.com" target="_blank" class="">u.rochan@gmail.com</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr" class=""><div class="">Dear PETSc Developers,</div><div class=""><br class=""></div><div class="">I am having trouble interfacing SuperLU_Dist as a direct solver for certain problems in PETSc. The problem is that when interfacing with SuperLU_Dist, your matrix must be of type MATSEQAIJ when running MPI with one processor. PETSc has long allowed the use of matrix type MATMPIAIJ for all MPI runs, including MPI with a single processor, and that is still the case for all of PETSc's native solvers. This, however, has been broken for the SuperLU_Dist option. The following code snippet (in PETSc, not SuperLU_Dist) is responsible for this restriction; I do not know whether it is by design or by accident.</div><div class=""><br class=""></div><div class="">In file petsc-3.12.4/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c, line 257 onwards:</div><div class=""><br class=""></div><div class=""><pre class="">ierr = MPI_Comm_size(PetscObjectComm((PetscObject)A),&size);CHKERRQ(ierr);

if (size == 1) {
  aa     = (Mat_SeqAIJ*)A->data;
  rstart = 0;
  nz     = aa->nz;
} else {
  Mat_MPIAIJ *mat = (Mat_MPIAIJ*)A->data;
  aa = (Mat_SeqAIJ*)(mat->A)->data;
  bb = (Mat_SeqAIJ*)(mat->B)->data;
  ai = aa->i; aj = aa->j;
  bi = bb->i; bj = bb->j;</pre></div><div class=""><br class=""></div><div class="">The code checks the number of processors and, if it equals 1, concludes that the matrix is a Mat_SeqAIJ and operates on it as such. Only if the number of processors is > 1 does it assume the matrix is of type Mat_MPIAIJ. I think this is unwieldy and lacks generality. 
One would like the same piece of code to run in MPI mode on any number of processors with matrix type MATMPIAIJ. Moreover, this restriction appeared only recently: the issue was not present until at least version 3.9.4. So my question is: from now on (e.g., v3.12.4), should we always use matrix type MATSEQAIJ when running on 1 processor, even with MPI enabled, and MATMPIAIJ for more than 1 processor? That is, should the number of processors in use be the criterion for setting the matrix type?</div><div class=""><br class=""></div><div class="">As an illustration, I have attached a minor modification of KSP example 12, which used to work with all PETSc versions until at least 3.9.4 but now throws a segmentation fault. It was compiled with MPI and run with mpiexec -n 1 ./ex12.</div><div class="">If I remove the line "ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);" it works fine.</div><div class=""><br class=""></div><div class="">I hope you can clarify my confusion.</div><div class=""><br class=""></div><div class="">Regards,</div><div class="">Rochan<br class=""></div><div class=""><br class=""></div></div>
</blockquote></div>
</blockquote></div>
</blockquote></div>
</div></blockquote></div><br class=""></div></body></html>