[petsc-dev] PETSc Error during VecScatterCreate after MUMPS solve
    Jaysaval, Piyoosh 
    Piyoosh.Jaysaval at pnnl.gov
       
    Fri Sep 29 09:57:29 CDT 2023
    
    
  
I think 64-bit indices are the solution. I mentioned earlier that I could solve for 1323 RHSs, but that problem has only about 1.59 million DOFs, so the total number of unknowns (DOFs x RHSs) stays slightly below 2^31-1.
I did compile PETSc with 64-bit indices, but I am having some other issues with variable assignments in my code. I will come back if the issue persists after fixing those assignments.
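(For reference, the arithmetic: roughly 1.59e6 DOFs x 1323 RHSs ~ 2.10e9, just under 2^31-1 = 2,147,483,647, while 2.17e6 DOFs x 1764 RHSs ~ 3.82e9 overflows a 32-bit index. The kind of assignment issue mentioned above typically looks like the hypothetical sketch below: with --with-64-bit-indices, PetscInt becomes a 64-bit integer, so any index or size declared as a plain int no longer matches the PETSc API. None of the code below is from the actual application.)

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec      x;
  PetscInt N; /* sizes and indices as PetscInt, 64-bit under --with-64-bit-indices */
  /* int N;      would be wrong after reconfiguring: VecGetSize() expects a PetscInt* */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 2165687, &x));
  PetscCall(VecGetSize(x, &N));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "global size %" PetscInt_FMT "\n", N));
  PetscCall(VecDestroy(&x));
  PetscCall(PetscFinalize());
  return 0;
}
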
Thanks for the help.
Piyoosh
From: Matthew Knepley <knepley at gmail.com>
Date: Thursday, September 28, 2023 at 18:56
To: Jaysaval, Piyoosh <Piyoosh.Jaysaval at pnnl.gov>
Cc: petsc-dev at mcs.anl.gov <petsc-dev at mcs.anl.gov>
Subject: Re: [petsc-dev] PETSc Error during VecScatterCreate after MUMPS solve
On Thu, Sep 28, 2023 at 4:32 PM Jaysaval, Piyoosh <Piyoosh.Jaysaval at pnnl.gov> wrote:
Thanks Matt. I don’t think so; I used the default. I will give it a try again with 64-bit indices.
Do I just need to use --with-64-bit-indices during the configuration?
Yes. I was worried about index overflow, so we will see. We try to protect against it, but it's hard to catch every instance.
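(For reference, and assuming nothing else in the build needs to change, that would mean re-running configure with the existing options plus the new flag, e.g.

./configure --with-64-bit-indices --with-debugging=0 --with-scalar-type=complex --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --download-mumps --download-metis --download-parmetis --download-superlu_dist ...

with the remaining options as in the configure line shown in the error log further down, and typically a new PETSC_ARCH name so the existing 32-bit-index build is kept separate.)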
  Thanks,
     Matt
Piyoosh
________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Thursday, September 28, 2023 15:24
To: Jaysaval, Piyoosh <Piyoosh.Jaysaval at pnnl.gov>
Cc: petsc-dev at mcs.anl.gov <petsc-dev at mcs.anl.gov>
Subject: Re: [petsc-dev] PETSc Error during VecScatterCreate after MUMPS solve
On Thu, Sep 28, 2023 at 4:09 PM Jaysaval, Piyoosh via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
Hello PETSc developers,
I am running into an issue when using the MUMPS solver through PETSc, right after the solution phase. I am solving a matrix equation with about 2.17 million DOFs for 1764 RHSs. MUMPS successfully solves the system for all RHSs; however, after the solve phase the distributed MUMPS solution is scattered back into a PETSc MPI vector (done within PETSc), and that is where I am getting the error.
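For context, here is a minimal, hypothetical sketch (the function name SolveManyRHS and its arguments are placeholders, not the actual application code) of the call path that ends in the failing VecScatterCreate shown in the trace further down: the matrix is factored with MUMPS through PCLU, MatMatTransposeSolve hands all RHSs to MUMPS at once, and PETSc then scatters the distributed MUMPS solution back into the dense solution matrix.

#include <petscksp.h>

/* Hypothetical sketch: A is the system matrix, Bt holds the RHSs as rows,
   X receives the solutions. */
PetscErrorCode SolveManyRHS(Mat A, Mat Bt, Mat X)
{
  KSP ksp;
  PC  pc;
  Mat F; /* the MUMPS factored matrix */

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(KSPSetUp(ksp)); /* runs the MUMPS factorization */
  PetscCall(PCFactorGetMatrix(pc, &F));
  /* MUMPS solves for all RHSs; afterwards MatMatSolve_MUMPS builds a
     VecScatter to move the distributed solution back into X, which is
     where VecScatterCreate fails in the trace below. */
  PetscCall(MatMatTransposeSolve(F, Bt, X));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}
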
FYI, when I use 1323 RHSs, there is no issue. Also, I had to use v3.18.6 (or lower) because of a compilation issue with v3.19+ and SuperLU_DIST on our cluster with an older Intel MPI.
Any help is greatly appreciated. Thanks.
Have you configured with 64-bit indices?
  Thanks,
     Matt
Piyoosh
Here is the error message I am getting:
Entering ZMUMPS 5.5.1 from C interface with JOB, N =   3     2165687
      executing #MPI =     63 and #OMP =      1
 ****** SOLVE & CHECK STEP ********
 GLOBAL STATISTICS PRIOR SOLVE PHASE ...........
 Number of right-hand-sides                    =        1764
 Blocking factor for multiple rhs              =          32
 ICNTL (9)                                     =           1
  --- (10)                                     =           0
  --- (11)                                     =           0
  --- (20)                                     =           1
  --- (21)                                     =           1
  --- (30)                                     =           0
  --- (35)                                     =           0
 ** Rank of processor needing largest memory in solve     :         5
 ** Space in MBYTES used by this processor for solve      :      1173
 ** Avg. Space in MBYTES per working proc during solve    :       820
 Leaving solve with ...
 Time to build/scatter RHS        =       0.130768
 Time in solution step (fwd/bwd)  =      41.264175
  .. Time in forward (fwd) step   =         11.261921
  .. Time in ScaLAPACK root       =          3.584752
  .. Time in backward (bwd) step  =         26.515026
 Time to gather solution(cent.sol)=       0.000000
 Time to copy/scale dist. solution=       0.724149
 Elapsed time in solve driver=      42.3461
[23]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[23]PETSC ERROR: Argument out of range
[23]PETSC ERROR: Scatter indices in iy are out of range
[23]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[23]PETSC ERROR: Petsc Release Version 3.18.6, unknown
[23]PETSC ERROR: /people/jays242/softwares/pgemini-em/pgemini/src/pgemini on a arch-linux-intel-opt-v3.18.6 named dc230.local by jays242 Thu Sep 28 10:44:51 2023
[23]PETSC ERROR: Configure options PETSC_ARCH=arch-linux-intel-opt-v3.18.6 --with-debugging=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-blaslapack-dir=/share/apps/intel/2020u4/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64 -with-scalar-type=complex --download-mumps --download-metis --with-openmp --download-parmetis --download-superlu_dist --with-scalapack-lib="-L/share/apps/intel/2020u4/compilers_and_libraries_2020.4.304/linux/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64"
[23]PETSC ERROR: #1 VecScatterCreate() at /qfs/people/jays242/softwares/pgemini-em/petsc-intel/src/vec/is/sf/interface/vscat.c:736
[23]PETSC ERROR: #2 MatMatSolve_MUMPS() at /qfs/people/jays242/softwares/pgemini-em/petsc-intel/src/mat/impls/aij/mpi/mumps/mumps.c:1449
[23]PETSC ERROR: #3 MatMatTransposeSolve_MUMPS() at /qfs/people/jays242/softwares/pgemini-em/petsc-intel/src/mat/impls/aij/mpi/mumps/mumps.c:1506
[23]PETSC ERROR: #4 MatMatTransposeSolve() at /qfs/people/jays242/softwares/pgemini-em/petsc-intel/src/mat/interface/matrix.c:3789
[24]PETSC ERROR: --------------------- Error Message ---------------
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
    
    