[petsc-users] MatPermute for MPIBAIJ matrix Error
Rongliang Chen
rl.chen at siat.ac.cn
Fri Jun 17 22:49:30 CDT 2016
Dear Barry,
Many thanks for your reply.
In fact, we want to permute a sequential BAIJ matrix (the sub-matrix from
the ASM preconditioner) to reduce its bandwidth, so that the sub-problem
can be solved in parallel on a many-core processor. We found that
MatPermute does not support sequential BAIJ matrices, although it does
have a version for parallel BAIJ matrices, so we tried to use that to
reduce the bandwidth. Do you have any suggestions for reducing the
bandwidth of the sub-matrix? Thanks.
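
For context, here is a minimal sketch of one way to do the permutation we
have in mind (the helper name and variables are placeholders; Asub stands
for the sequential BAIJ sub-matrix taken from the ASM sub-KSP). Since
MatPermute has no SeqBAIJ implementation, the sketch first converts the
sub-matrix to SeqAIJ and then applies the RCM ordering, the usual choice
for bandwidth reduction:

#include <petscmat.h>

/* Hypothetical helper: reduce the bandwidth of a sequential BAIJ
   sub-matrix by permuting an AIJ copy with the RCM ordering. */
PetscErrorCode PermuteSubMatrixRCM(Mat Asub, Mat *Aperm)
{
  Mat            Aaij;
  IS             rowperm, colperm;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* MatPermute() is not implemented for SeqBAIJ, so work on an AIJ copy */
  ierr = MatConvert(Asub, MATSEQAIJ, MAT_INITIAL_MATRIX, &Aaij);CHKERRQ(ierr);
  /* Reverse Cuthill-McKee ordering to reduce the bandwidth */
  ierr = MatGetOrdering(Aaij, MATORDERINGRCM, &rowperm, &colperm);CHKERRQ(ierr);
  ierr = MatPermute(Aaij, rowperm, colperm, Aperm);CHKERRQ(ierr);
  ierr = ISDestroy(&rowperm);CHKERRQ(ierr);
  ierr = ISDestroy(&colperm);CHKERRQ(ierr);
  ierr = MatDestroy(&Aaij);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The permuted copy *Aperm would then be handed to the many-core sub-solver
in place of the original sub-matrix.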
Best regards,
Rongliang
On 06/18/2016 09:51 AM, Barry Smith wrote:
> Uggh, this is a horrible mess. In short we don't support permutations in parallel. Even for MPIAIJ matrices it is bogus in that it just independently provides a reordering for the matrix on each process. This kind of parallel graph processing is something that is really weak in PETSc.
>
> Why do you want to permute a parallel matrix? We always recommend that one partition the mesh data structure for good performance before ever calling the PETSc routines. In this way the linear algebra is already well partitioned and doesn't need permutations.
>
> Barry
>
>
>
>> On Jun 16, 2016, at 9:20 PM, Rongliang Chen <rl.chen at siat.ac.cn> wrote:
>>
>> Dear All,
>>
>> I failed to use MatPermute to permute an MPIBAIJ matrix. The error message below was obtained by running the example petsc-3.6.3/src/ksp/ksp/examples/tutorials/ex18.c with the options: mpirun -n 3 ./ex18 -m 39 -n 18 -ksp_monitor_short -permute nd -mat_type mpibaij -mat_block_size 1 -ksp_view
>>
>> Can anyone tell me how to permute an MPIBAIJ matrix? Thanks.
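>>
>> For reference, the -permute path in ex18 boils down to calls roughly like
>> the following (a sketch, not the exact ex18 source; variable names are
>> placeholders):
>>
>>   IS  rowperm, colperm;
>>   Mat Aperm;
>>   ierr = MatGetOrdering(A, MATORDERINGND, &rowperm, &colperm);CHKERRQ(ierr); /* -permute nd */
>>   ierr = MatPermute(A, rowperm, colperm, &Aperm);CHKERRQ(ierr);              /* fails for MPIBAIJ */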
>>
>> ====================================================================================
>> [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [2]PETSC ERROR: Nonconforming object sizes
>> [2]PETSC ERROR: Local column sizes 702 do not add up to total number of columns 234
>> [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [2]PETSC ERROR: ./ex18 on a 64bit-debug named rlchen by rlchen Fri Jun 17 10:10:49 2016
>> [2]PETSC ERROR: Configure options --download-blacs --download-scalapack --download-metis --download-parmetis --download-exodusii --download-netcdf --download-hdf5 --with-mpi-dir=/home/rlchen/soft/Program/mpich2-shared --with-debugging=1 --download-fblaslapack --with-64-bit-indices
>> [2]PETSC ERROR: #1 MatGetSubMatrix_MPIBAIJ_Private() line 2261 in /home/rlchen/soft/petsc-3.6.3/src/mat/impls/baij/mpi/mpibaij.c
>> [2]PETSC ERROR: #2 MatPermute_MPIBAIJ() line 2358 in /home/rlchen/soft/petsc-3.6.3/src/mat/impls/baij/mpi/mpibaij.c
>> [2]PETSC ERROR: #3 MatPermute() line 4759 in /home/rlchen/soft/petsc-3.6.3/src/mat/interface/matrix.c
>> [2]PETSC ERROR: #4 main() line 171 in /home/rlchen/soft/petsc-3.6.3/src/ksp/ksp/examples/tutorials/ex18.c
>> [2]PETSC ERROR: PETSc Option Table entries:
>> [2]PETSC ERROR: -ksp_monitor_short
>> [2]PETSC ERROR: -ksp_view
>> [2]PETSC ERROR: -m 39
>> [2]PETSC ERROR: -mat_block_size 1
>> [2]PETSC ERROR: -mat_type mpibaij
>> [2]PETSC ERROR: -n 18
>> [2]PETSC ERROR: -permute nd
>> [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 2
>>
>> =====================================================================================
>> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>> = EXIT CODE: 15360
>> = CLEANING UP REMAINING PROCESSES
>> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>> =====================================================================================
>>
>> Best regards,
>> Rongliang
>>
>