[petsc-users] Error using Metis with PETSc installed with MUMPS

Pierre Jolivet pierre at joliv.et
Thu Nov 2 01:08:37 CDT 2023


> On 1 Nov 2023, at 8:02 PM, Barry Smith <bsmith at petsc.dev> wrote:
> 
> 
>   Pierre,
> 
>    Sorry, I was not clear. What I meant was that the PETSc code that calls MUMPS could change the value of ICNTL(6) under certain conditions before calling MUMPS, so that the MUMPS warning might not be triggered.

Again, I’m not sure that is possible: the message is not guarded by the value of ICNTL(6) but by some other internal parameters.

Thanks,
Pierre

$ for i in {1..7}; do echo "ICNTL(6) = ${i}"; ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu -mat_mumps_icntl_4 2 -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2 -mat_mumps_icntl_6 ${i} | grep -i "not allowed"; done
ICNTL(6) = 1
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 2
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 3
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 4
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 5
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 6
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
ICNTL(6) = 7
 ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
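
For completeness, the same ICNTL values can also be set from application code instead of the options database. A minimal sketch (not part of the runs above), assuming a KSP whose operators are already set and which should use an LU factorization backed by MUMPS:

#include <petscksp.h>

/* Sketch: programmatic equivalent of -pc_type lu -mat_mumps_icntl_6 <ival6>
   -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2; call after KSPSetOperators()
   and before KSPSetUp()/KSPSolve(). */
PetscErrorCode SetMumpsOrderingOptions(KSP ksp, PetscInt ival6)
{
  PC  pc;
  Mat F;

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(PCFactorSetUpMatSolverType(pc)); /* create the factor matrix so ICNTL values can be attached */
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 6, ival6));  /* maximum transversal option exercised in the loop above */
  PetscCall(MatMumpsSetIcntl(F, 28, 2));     /* parallel analysis */
  PetscCall(MatMumpsSetIcntl(F, 29, 2));     /* ParMETIS as the parallel ordering tool */
  PetscFunctionReturn(PETSC_SUCCESS);
}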

> I am basing this on a guess, from looking at the MUMPS manual and the warning message, that the particular value of ICNTL(6) is incompatible with the given matrix state. But I could easily be wrong.
> 
>   Barry
> 
> 
>> On Nov 1, 2023, at 1:33 PM, Pierre Jolivet <pierre at joliv.et> wrote:
>> 
>> Victoria, please keep the list in copy.
>> 
>>> I am not understanding how I can switch to ParMETIS if it does not appear in the options of -mat_mumps_icntl_7. In the options I only have METIS and not ParMETIS.
>> 
>> 
>> You need to use -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
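>> 
>> As a check, one can also query INFOG(7), which reports the ordering MUMPS effectively used. A minimal sketch (not from the original thread), assuming pc is the LU preconditioner backed by MUMPS and that the factorization has already been set up (e.g. after KSPSetUp() or the first KSPSolve()):
>> 
>> #include <petscksp.h>
>> 
>> PetscErrorCode ReportMumpsOrdering(PC pc)
>> {
>>   Mat      F;
>>   PetscInt ordering;
>> 
>>   PetscFunctionBeginUser;
>>   PetscCall(PCFactorGetMatrix(pc, &F));          /* factored matrix created by the MUMPS interface */
>>   PetscCall(MatMumpsGetInfog(F, 7, &ordering));  /* INFOG(7): ordering effectively used */
>>   PetscCall(PetscPrintf(PetscObjectComm((PetscObject)pc), "MUMPS ordering effectively used: %" PetscInt_FMT "\n", ordering));
>>   PetscFunctionReturn(PETSC_SUCCESS);
>> }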
>> 
>> Barry, I don’t think we can programmatically shut off this warning; it is guarded by a bunch of KEEP() values (see src/dana_driver.F:4707), which are only settable/gettable by people with access to consortium releases.
>> I’ll ask the MUMPS people for confirmation.
>> Note that this warning is only printed to screen with the option -mat_mumps_icntl_4 2 (or higher), so this won’t show up for standard runs.
>> 
>> Thanks,
>> Pierre
>> 
>>> On 1 Nov 2023, at 5:52 PM, Barry Smith <bsmith at petsc.dev> wrote:
>>> 
>>> 
>>>   Pierre,
>>> 
>>>    Could the PETSc MUMPS interface "turn off" ICNTL(6) in this situation, so as to not trigger the confusing warning message from MUMPS?
>>> 
>>>   Barry
>>> 
>>>> On Nov 1, 2023, at 12:17 PM, Pierre Jolivet <pierre at joliv.et> wrote:
>>>> 
>>>> 
>>>> 
>>>>> On 1 Nov 2023, at 3:33 PM, Zhang, Hong via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>>>> 
>>>>> Victoria,
>>>>> "** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
>>>>> Ordering based on METIS"
>>>> 
>>>> This warning is benign and appears for every run using a sequential partitioner in MUMPS with a MATMPIAIJ.
>>>> (I’m not saying switching to ParMETIS will not make the issue go away)
>>>> 
>>>> Thanks,
>>>> Pierre
>>>> 
>>>> $ ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu -mat_mumps_icntl_4 2
>>>> Entering DMUMPS 5.6.2 from C interface with JOB, N =   1          56
>>>>       executing #MPI =      2, without OMP
>>>> 
>>>>  =================================================
>>>>  MUMPS compiled with option -Dmetis
>>>>  MUMPS compiled with option -Dparmetis
>>>>  MUMPS compiled with option -Dpord
>>>>  MUMPS compiled with option -Dptscotch
>>>>  MUMPS compiled with option -Dscotch
>>>>  =================================================
>>>> L U Solver for unsymmetric matrices
>>>> Type of parallelism: Working host
>>>> 
>>>>  ****** ANALYSIS STEP ********
>>>> 
>>>>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
>>>>  Processing a graph of size:        56 with           194 edges
>>>>  Ordering based on AMF 
>>>>  WARNING: Largest root node of size        26 not selected for parallel execution
>>>> 
>>>> Leaving analysis phase with  ...
>>>>  INFOG(1)                                       =               0
>>>>  INFOG(2)                                       =               0
>>>> […]
>>>> 
>>>>> Try parmetis.
>>>>> Hong
>>>>> From: petsc-users <petsc-users-bounces at mcs.anl.gov> on behalf of Victoria Rolandi <victoria.rolandi93 at gmail.com>
>>>>> Sent: Tuesday, October 31, 2023 10:30 PM
>>>>> To: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
>>>>> Subject: [petsc-users] Error using Metis with PETSc installed with MUMPS
>>>>>  
>>>>> Hi, 
>>>>> 
>>>>> I'm solving a large sparse linear system in parallel and I am using PETSc with MUMPS. I am trying to test different options, such as the ordering of the matrix. Everything works if I use the -mat_mumps_icntl_7 2 or -mat_mumps_icntl_7 0 options (with the first one, AMF, performing better than AMD); however, when I test METIS with -mat_mumps_icntl_7 5, I get an error (reported at the end of the email).
>>>>> 
>>>>> I have configured PETSc with the following options: 
>>>>> 
>>>>> --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort  --with-scalar-type=complex --with-debugging=0 --with-precision=single --download-mumps --download-scalapack --download-parmetis --download-metis
>>>>> 
>>>>> and the installation did not report any problems.
>>>>> 
>>>>> Could you help me understand why METIS is not working? 
>>>>> 
>>>>> Thank you in advance,
>>>>> Victoria 
>>>>> 
>>>>> Error:
>>>>> 
>>>>>  ****** ANALYSIS STEP ********
>>>>>  ** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
>>>>>  Processing a graph of size:    699150 with      69238690 edges
>>>>>  Ordering based on METIS
>>>>> 510522 37081376 [100] [10486 699150]
>>>>> Error! Unknown CType: -1
>>>> 
>>> 
>> 
> 
