[petsc-users] ksp for AX=B system
Satish Balay
balay at mcs.anl.gov
Wed Apr 17 19:32:51 CDT 2013
There is no --with-CMake-dir option. If cmake is in your PATH, it should be picked up.
Or you can use --with-cmake=/path/to/cmake-binary
Or use --download-cmake
Also suggest using --download-scalapack
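
For example, a configure line along these lines should work (this is just your original command with the module paths dropped in favor of the corresponding --download options; adjust as needed for your site):

./configure --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-cmake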
If you are still having trouble installing PETSc, send the relevant
logs to petsc-maint
Satish
On Wed, 17 Apr 2013, Jin, Shuangshuang wrote:
> Ok, I spent a long time trying to install SuperLU_DIST and MUMPS, but so far with no luck. Every ./configure run I made took a long time, and each one ended up complaining "UNABLE to CONFIGURE with GIVEN OPTIONS" for one package or another.
>
> And the latest one I got was:
>
> [d3m956 at olympus petsc-dev.4.17.13]$ ./configure --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --with-scalapack-dir=/share/apps/modules/Modules/3.2.8/modulefiles/development/tools/scalapack/ --download-parmetis --download-metis --with-CMake-dir=/share/apps/modules/Modules/3.2.8/modulefiles/development/tools/cmake
> ===============================================================================
> Configuring PETSc to compile on your system
> ===============================================================================
> TESTING: configureLibrary from PETSc.packages.metis(config/BuildSystem/config/package.py:464)
> *******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> -------------------------------------------------------------------------------
> CMake > 2.5 is needed to build METIS
> *******************************************************************************
>
> Can you see anything wrong with my configuration command?
>
> Thanks,
> Shuangshuang
>
>
> -----Original Message-----
> From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Barry Smith
> Sent: Wednesday, April 17, 2013 2:49 PM
> To: PETSc users list
> Subject: Re: [petsc-users] ksp for AX=B system
>
>
> On Apr 17, 2013, at 1:53 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:
>
> > Hello, Barry, I got a new question here.
> >
> > I have integrated the AX=B solver into my code following the example ex125.c. The only major difference between my code and the example is how the input matrices are provided.
> >
> > In the example ex125.c, the A matrix is created by loading data from a binary file. In my code, I pass in the computed A matrix and the B matrix as arguments to a user-defined solvingAXB() function:
> >
> > static PetscErrorCode solvingAXB(Mat A, Mat B, PetscInt n, PetscInt nrhs, Mat X, const int me);
> >
> > I noticed that A has to be stored as a MATAIJ matrix and B as a MATDENSE matrix, so I converted their storage types inside solvingAXB() as below (they were of MPIAIJ type before):
>
> Note that MATAIJ means MATMPIAIJ when run on more than one process and MATSEQAIJ on a single process.
>
> >
> > ierr = MatConvert(A, MATAIJ, MAT_REUSE_MATRIX, &A); CHKERRQ(ierr); // A has to be MATAIJ for built-in PETSc LU!
> > MatView(A, PETSC_VIEWER_STDOUT_WORLD);
> >
> > ierr = MatConvert(B, MATDENSE, MAT_REUSE_MATRIX, &B); CHKERRQ(ierr); // B has to be a dense matrix!
> > MatView(B, PETSC_VIEWER_STDOUT_WORLD);
> >
> > With this implementation, I can run the code with expected results when 1 processor is used.
> >
> > However, when I use 2 processors, I get a runtime error which I guess is still somehow related to the matrix format, but I cannot figure out how to fix it. Could you please take a look at the following error message?
>
> PETSc doesn't have a built-in parallel LU solver; you need to install PETSc with SuperLU_DIST or MUMPS (preferably both) and then pass in one of those names when you call MatGetFactor().
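> A minimal sketch of that call, assuming PETSc has been reconfigured with --download-superlu_dist (or --download-mumps) and using the solver-package names PETSc defines for those libraries; error handling is abbreviated:
>
>     Mat F;                                                                          /* holds the factored matrix */
>     ierr = MatGetFactor(A, MATSOLVERSUPERLU_DIST, MAT_FACTOR_LU, &F);CHKERRQ(ierr); /* or MATSOLVERMUMPS */
>
> The rest of the symbolic/numeric factorization and MatMatSolve() sequence from ex125.c stays the same.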
>
> Barry
>
> >
> > [d3m956 at olympus ss08]$ mpiexec -n 2 dynSim -i 3g9b.txt
> > Matrix Object: 1 MPI processes
> >   type: mpiaij
> > row 0: (0, 0 - 16.4474 i) (3, 0 + 17.3611 i)
> > row 1: (1, 0 - 8.34725 i) (7, 0 + 16 i)
> > row 2: (2, 0 - 5.51572 i) (5, 0 + 17.0648 i)
> > row 3: (0, 0 + 17.3611 i) (3, 0 - 39.9954 i) (4, 0 + 10.8696 i) (8, 0 + 11.7647 i)
> > row 4: (3, 0 + 10.8696 i) (4, 0.934 - 17.0633 i) (5, 0 + 5.88235 i)
> > row 5: (2, 0 + 17.0648 i) (4, 0 + 5.88235 i) (5, 0 - 32.8678 i) (6, 0 + 9.92063 i)
> > row 6: (5, 0 + 9.92063 i) (6, 1.03854 - 24.173 i) (7, 0 + 13.8889 i)
> > row 7: (1, 0 + 16 i) (6, 0 + 13.8889 i) (7, 0 - 36.1001 i) (8, 0 + 6.21118 i)
> > row 8: (3, 0 + 11.7647 i) (7, 0 + 6.21118 i) (8, 1.33901 - 18.5115 i)
> > Matrix Object: 1 MPI processes
> >   type: mpidense
> > 0.0000000000000000e+00 + -1.6447368421052630e+01i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + -8.3472454090150254e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + -5.5157198014340878e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > 0.0000000000000000e+00 + 0.0000000000000000e+00i
> > PETSC LU:
> > [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [0]PETSC ERROR: No support for this operation for this object type!
> > [0]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc LU!
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: Petsc Development HG revision: 6e0ddc6e9b6d8a9d8eb4c0ede0105827a6b58dfb HG Date: Mon Mar 11 22:54:30 2013 -0500
> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> > [0]PETSC ERROR: See docs/index.html for manual pages.
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: dynSim on a arch-complex named olympus.local by d3m956 Wed Apr 17 11:38:12 2013
> > [0]PETSC ERROR: Libraries linked from /pic/projects/ds/petsc-dev/arch-complex/lib
> > [0]PETSC ERROR: Configure run at Tue Mar 12 14:32:37 2013
> > [0]PETSC ERROR: Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: MatGetFactor() line 3949 in src/mat/interface/matrix.c
> > [0]PETSC ERROR: solvingAXB() line 880 in "unknowndirectory/"reducedYBus.C
> > End!
> > [1]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
> > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > [1]PETSC ERROR: likely location of problem given in stack below
> > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> > [1]PETSC ERROR:       is given.
> > [1]PETSC ERROR: [1] PetscSleep line 35 src/sys/utils/psleep.c
> > [1]PETSC ERROR: [1] PetscTraceBackErrorHandler line 172 src/sys/error/errtrace.c
> > [1]PETSC ERROR: [1] PetscError line 361 src/sys/error/err.c
> > [1]PETSC ERROR: [1] MatGetFactor line 3932 src/mat/interface/matrix.c
> > [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> > [1]PETSC ERROR: Signal received!
> > [1]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: Petsc Development HG revision: 6e0ddc6e9b6d8a9d8eb4c0ede0105827a6b58dfb HG Date: Mon Mar 11 22:54:30 2013 -0500
> > [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> > [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> > [1]PETSC ERROR: See docs/index.html for manual pages.
> > [1]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: dynSim on a arch-complex named olympus.local by d3m956 Wed Apr 17 11:38:12 2013
> > [1]PETSC ERROR: Libraries linked from /pic/projects/ds/petsc-dev/arch-complex/lib
> > [1]PETSC ERROR: Configure run at Tue Mar 12 14:32:37 2013
> > [1]PETSC ERROR: Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic
> > [1]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> > --------------------------------------------------------------------------
> >
> > Thanks,
> > Shuangshuang
> >
> >
> > -----Original Message-----
> > From: petsc-users-bounces at mcs.anl.gov
> > [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Barry Smith
> > Sent: Tuesday, April 16, 2013 5:50 PM
> > To: PETSc users list
> > Subject: Re: [petsc-users] ksp for AX=B system
> >
> >
> > Shuangshuang
> >
> > This is what I was expecting; thanks for the confirmation. For problems of these sizes you definitely want to use a direct solver (often parallel, but not for the smaller matrices) and solve for multiple right-hand sides at once. This means you actually will not use the KSP solver that is standard for most PETSc work; instead you will work directly with the MatGetFactor(), MatGetOrdering(), MatLUFactorSymbolic(), MatLUFactorNumeric(), MatMatSolve() paradigm, where the A matrix is stored as a MATAIJ matrix and the B (multiple right-hand sides) as a MATDENSE matrix.
> >
> > An example that displays this paradigm is
> > src/mat/examples/tests/ex125.c
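> >
> > A condensed sketch of that sequence, adapted from ex125.c (error checking, matrix creation, and cleanup omitted; pick the solver package you have installed):
> >
> >   Mat           F, X;
> >   IS            rowperm, colperm;
> >   MatFactorInfo info;
> >
> >   ierr = MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &F);CHKERRQ(ierr);   /* built-in LU is sequential only; use MATSOLVERSUPERLU_DIST or MATSOLVERMUMPS in parallel */
> >   ierr = MatGetOrdering(A, MATORDERINGND, &rowperm, &colperm);CHKERRQ(ierr); /* nested-dissection ordering */
> >   ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
> >   ierr = MatLUFactorSymbolic(F, A, rowperm, colperm, &info);CHKERRQ(ierr);
> >   ierr = MatLUFactorNumeric(F, A, &info);CHKERRQ(ierr);
> >   ierr = MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X);CHKERRQ(ierr);          /* X: dense, same size as B */
> >   ierr = MatMatSolve(F, B, X);CHKERRQ(ierr);                                 /* solves A X = B for all right-hand sides at once */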
> >
> > Once you have something of interest to you running, we would like to work with you to improve the performance; we have some "tricks" we haven't yet implemented that will make these solvers much faster than they are by default.
> >
> > Barry
> >
> >
> >
> > On Apr 16, 2013, at 7:38 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:
> >
> >> Hi, Barry, thanks for your prompt reply.
> >>
> >> We have problems of various sizes, from (n=9, m=3) and (n=1081, m=288) to (n=16072, m=2361), or even larger ultimately.
> >>
> >> Usually n, the dimension of the square matrix A, is much larger than m, the number of columns of B.
> >>
> >> As you said, I'm using the loop to deal with the small (n=9, m=3) case. However, for the bigger problems, I do hope there is a better approach.
> >>
> >> This is a power flow problem. When we parallelized it with OpenMP previously, we just parallelized the outer loop and used a direct solver to solve it.
> >>
> >> We are now switching to MPI and would like to use a PETSc KSP solver to solve it in parallel. However, I don't know which KSP solver would be best here: a direct solver, or an iterative one with a preconditioner?
> >>
> >> I appreciate very much for your recommendations.
> >>
> >> Thanks,
> >> Shuangshuang
> >>
> >>
>
>