[petsc-users] Preconditioning systems of equations with complex numbers
Jed Brown
jed at jedbrown.org
Mon Feb 11 21:26:11 CST 2019
Justin Chang via petsc-users <petsc-users at mcs.anl.gov> writes:
> So I used -mat_view draw -draw_pause -1 on my medium-sized matrix and got
> this output:
>
> [image: 1MPI.png]
>
> So it seems there are lots of off-diagonal terms, and that a decomposition
> of the problem via MatLoad would give a terribly unbalanced problem.
>
> Given the initial A and b Mat/Vec, I experimented with MatPartitioning and
> inserted the following lines into my code:
>
> Mat Apart;
> Vec bpart;
> MatPartitioning part;
> IS is,isrows;
> ierr = MatPartitioningCreate(PETSC_COMM_WORLD, &part);CHKERRQ(ierr);
> ierr = MatPartitioningSetAdjacency(part, A);CHKERRQ(ierr);
> ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
> ierr = MatPartitioningApply(part, &is);CHKERRQ(ierr);
> ierr = ISBuildTwoSided(is,NULL,&isrows);CHKERRQ(ierr);
> ierr = MatCreateSubMatrix(A, isrows, isrows, MAT_INITIAL_MATRIX, &Apart);CHKERRQ(ierr);
> ierr = MatSetOptionsPrefix(Apart, "part_");CHKERRQ(ierr);
> ierr = MatSetFromOptions(Apart);CHKERRQ(ierr);
> ierr = VecGetSubVector(b,isrows,&bpart);CHKERRQ(ierr);
>
> /* Set Apart and bpart in the KSPSolve */
> ...
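
For completeness, a minimal sketch of what that elided solve step could look like, assuming a KSP named ksp created elsewhere and a hypothetical solution vector xpart:

Vec xpart;
ierr = VecDuplicate(bpart, &xpart);CHKERRQ(ierr);            /* solution vector on the repartitioned layout */
ierr = KSPSetOperators(ksp, Apart, Apart);CHKERRQ(ierr);     /* solve with the repartitioned matrix */
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPSolve(ksp, bpart, xpart);CHKERRQ(ierr);
ierr = VecRestoreSubVector(b, isrows, &bpart);CHKERRQ(ierr); /* hand the subvector back when finished */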
>
> And here are the mat_draw figures from 2 and 4 MPI processes respectively:
>
> [image: 2MPI.png][image: 4MPI.png]
>
> Is this "right"? It just feels like I'm just duplicating the nnz structure
> among all the MPI processes. And it didn't really improve the performance
> of ASM.
ASM might not be an appropriate preconditioner (or it might need a
special sort of overlap for stability of the local problems). The edge
cuts look relatively small, so this does not look to me like a power-law or
social-network problem, the kind that does not admit vertex partitions with
low edge cut.
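
If you want to experiment in that direction, a minimal starting point (just a
sketch, assuming the stock ASM setup and that direct subdomain solves are
affordable) is to grow the overlap and make the local solves exact:

  -pc_type asm -pc_asm_overlap 2 -sub_ksp_type preonly -sub_pc_type lu

Larger overlap and exact local solves usually make the subdomain problems more
robust, at the cost of extra communication and memory; whether that pays off
depends on the operator.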
We really have to understand the spectrum to comment further on fast
solvers.
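
One rough way to get a first look, assuming GMRES can be run on this system,
is to ask KSP for its Ritz estimates during a solve:

  -ksp_type gmres -ksp_gmres_restart 200 -ksp_compute_eigenvalues -ksp_monitor_singular_value

These are only Krylov-space approximations, but they usually show whether the
spectrum is clustered, indefinite, or spread along the imaginary axis, which
is the first thing to know when preconditioning a complex-valued system.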