[petsc-users] Rectangular matrices turned into square?
Adam Denchfield
adenchfi at hawk.iit.edu
Wed Jul 12 15:11:51 CDT 2017
Hi,
You're absolutely right. It turns out I was not saving my data properly
when running the program in parallel. Thanks for checking!
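
For anyone who trips over the same thing: one parallel-safe way to write out a
distributed solution is to go through a PETSc viewer rather than dumping each
rank's local array. A rough petsc4py sketch (the file name and variable names
are placeholders, not lines from my actual code):

    from petsc4py import PETSc

    # x is the distributed PETSc Vec holding the solution; with DOLFIN it can
    # be reached via as_backend_type(u.vector()).vec() (illustrative access path)
    viewer = PETSc.Viewer().createBinary('solution.dat', mode='w')
    x.view(viewer)            # collective write: every rank contributes its piece
    viewer.destroy()

    # reading it back later is also collective
    viewer = PETSc.Viewer().createBinary('solution.dat', mode='r')
    y = PETSc.Vec().create(PETSc.COMM_WORLD)
    y.load(viewer)
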
Regards,
*Adam Denchfield*
*Peer Career Coach - Career Services*
Illinois Institute of Technology
*Bachelors of Science in Applied Physics (2018)*
Email: adenchfi at hawk.iit.edu
My LinkedIn <http://www.linkedin.com/in/adamrobertdenchfield>
My ResearchGate Profile <https://www.researchgate.net/profile/Adam_Denchfield>
On Wed, Jul 12, 2017 at 3:04 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Are you talking about the below? If so, I suspect that the code is
> actually generating a square matrix, not a rectangular one. Hypre
> definitely cannot handle rectangular matrices.
>
> Barry
>
>
>
> PC Object: 2 MPI processes
> type: hypre
> HYPRE BoomerAMG preconditioning
> HYPRE BoomerAMG: Cycle type V
> HYPRE BoomerAMG: Maximum number of levels 25
> HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
> HYPRE BoomerAMG: Convergence tolerance PER hypre call 0.
> HYPRE BoomerAMG: Threshold for strong coupling 0.25
> HYPRE BoomerAMG: Interpolation truncation factor 0.
> HYPRE BoomerAMG: Interpolation: max elements per row 0
> HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
> HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
> HYPRE BoomerAMG: Maximum row sums 0.9
> HYPRE BoomerAMG: Sweeps down 1
> HYPRE BoomerAMG: Sweeps up 1
> HYPRE BoomerAMG: Sweeps on coarse 1
> HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi
> HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi
> HYPRE BoomerAMG: Relax on coarse Gaussian-elimination
> HYPRE BoomerAMG: Relax weight (all) 1.
> HYPRE BoomerAMG: Outer relax weight (all) 1.
> HYPRE BoomerAMG: Using CF-relaxation
> HYPRE BoomerAMG: Not using more complex smoothers.
> HYPRE BoomerAMG: Measure type local
> HYPRE BoomerAMG: Coarsen type Falgout
> HYPRE BoomerAMG: Interpolation type classical
> linear system matrix = precond matrix:
> Mat Object: 2 MPI processes
> type: mpiaij
> rows=2112, cols=2112, bs=2
> total: nonzeros=29056, allocated nonzeros=29056
> total number of mallocs used during MatSetValues calls =0
> using I-node (on process 0) routines: found 531 nodes, limit used is 5
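
For reference, one way to check the dimensions of what FEniCS actually
assembled, independently of -ksp_view, is to look at the underlying petsc4py
Mat; a rough sketch (A stands in for the assembled DOLFIN matrix, the names
are illustrative):

    from dolfin import as_backend_type

    # as_backend_type exposes the PETSc wrapper of the assembled dolfin Matrix,
    # and .mat() returns the petsc4py Mat behind it
    A_petsc = as_backend_type(A).mat()
    print('global size :', A_petsc.getSize())       # (rows, cols) over all ranks
    print('local size  :', A_petsc.getLocalSize())  # this rank's share
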
>
> > On Jul 12, 2017, at 2:55 PM, Adam Denchfield <adenchfi at hawk.iit.edu> wrote:
> >
> > Hi,
> > Below I've attached the text file with my ksp_view. It's a
> > time-dependent problem, so I included ten time steps. The output
> > doesn't seem to change much between time steps.
> >
> > ______________
> >
> >
> >
> > Regards,
> > Adam Denchfield
> > Peer Career Coach - Career Services
> > Illinois Institute of Technology
> > Bachelors of Science in Applied Physics (2018)
> > Email: adenchfi at hawk.iit.edu
> > My LinkedIn My ResearchGate Profile
> >
> > On Wed, Jul 12, 2017 at 2:48 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > Adam,
> >
> > Please send the output from KSP view. We don't "change" the matrix,
> > but we might have some intermediate matrix of that other size, or it could
> > be a bug in our output.
> >
> > Barry
> >
> > > On Jul 12, 2017, at 2:45 PM, Adam Denchfield <adenchfi at hawk.iit.edu> wrote:
> > >
> > > Hi,
> > >
> > > I'm using the FEniCS package, which uses DOLFIN/petsc4py (and thus
> > > PETSc). If I'm understanding the libraries right, I'm assembling
> > > rectangular matrices and sending them to be solved in a KrylovSolver.
> > > However, with -ksp_view, I'm seeing 2112x2112 matrices, not 1062x2112
> > > matrices. Does PETSc do conversions like this automatically during
> > > preconditioning or something? I'm trying to work out whether this is
> > > behaviour on PETSc's part.
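
A way to see exactly which operators the solver ended up with, rather than what
I think I assembled, is to ask the KSP object itself. A hedged sketch, assuming
the solver is a dolfin PETScKrylovSolver and that its ksp() accessor exposes the
underlying petsc4py KSP (treat that access path as an assumption):

    # solver is assumed to be a dolfin.PETScKrylovSolver
    ksp = solver.ksp()
    Amat, Pmat = ksp.getOperators()   # system matrix and preconditioning matrix
    print('Amat global size:', Amat.getSize())
    print('Pmat global size:', Pmat.getSize())
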
> > >
> > > Regards,
> > > Adam Denchfield
> > > Peer Career Coach - Career Services
> > > Illinois Institute of Technology
> > > Bachelors of Science in Applied Physics (2018)
> > > Email: adenchfi at hawk.iit.edu
> > > My LinkedIn My ResearchGate Profile
> >
> >
> > <output_ksp.txt>
>
>