[petsc-users] Solving complex linear sparse matrix in parallel + external library
Sal Am
tempohoper at gmail.com
Wed Nov 28 09:26:33 CST 2018
Thank you, indeed --download-mpich together with
$PETSC_DIR/$PETSC_ARCH/bin/mpiexec seems to work.
Now I am wondering about the other problem, namely getting the residual:
is the residual only computed when using iterative solvers? Using
richardson or gmres with 1 iteration together with MUMPS prints nothing
for me.
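
In the meantime, is computing the true residual by hand after KSPSolve
the right approach? A minimal sketch of what I have in mind (reusing the
A, x and b from my code below, with VecAYPX forming b - Ax):

  Vec       r;
  PetscReal rnorm;
  ierr = VecDuplicate(b,&r);CHKERRQ(ierr);
  ierr = MatMult(A,x,r);CHKERRQ(ierr);      /* r = A x */
  ierr = VecAYPX(r,-1.0,b);CHKERRQ(ierr);   /* r = b - A x */
  ierr = VecNorm(r,NORM_2,&rnorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"true residual norm %g\n",(double)rnorm);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);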
On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Nov 27, 2018 at 4:29 AM Sal Am <tempohoper at gmail.com> wrote:
>
>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>> than the one you compiled PETSc with.
>>>
>>
>> That is odd. I tried removing --download-mpich from the config and
>> tried --with-mpi=1 (which should be the default anyway), then retried
>> it with --with-mpich=1.
>>
>
> Your MPI is still broken. Some default MPIs, like those installed with
> Apple, are broken. If you are on Apple
> I would recommend using --download-mpich, but make sure you use
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec.
>
> If you reconfigure, you either have to specify a different --PETSC_ARCH,
> or delete the $PETSC_DIR/$PETSC_ARCH completely. Otherwise, you can get bad
> interaction with the libraries already sitting there.
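>
> For example (with PETSC_DIR and PETSC_ARCH set in your shell, and your
> own configure options in place of the ellipsis):
>
>   rm -rf $PETSC_DIR/$PETSC_ARCH          # start from a clean arch
>   cd $PETSC_DIR && ./configure PETSC_ARCH=linux-opt ...
>   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./SolveSys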
>
> Thanks,
>
> Matt
>
>
>> Current reconfig file:
>> '--download-mpich',
>> '--download-mumps',
>> '--download-scalapack',
>> '--download-superlu_dist',
>> '--with-cc=gcc',
>> '--with-clanguage=cxx',
>> '--with-cxx=g++',
>> '--with-debugging=no',
>> '--with-fc=gfortran',
>> '--with-mpi=1',
>> '--with-mpich=1',
>> '--with-scalar-type=complex',
>> 'PETSC_ARCH=linux-opt'
>> Still does not work though. I tried executing ex11 in
>> src/ksp/ksp/examples/tutorials, which should solve a linear system in
>> parallel, with mpiexec -n 2, but it prints
>> Mat Object: 1 MPI processes
>> ...
>> twice.
>> What am I missing?
>>
>>> Why would you want a minimum number of iterations? Why not set a
>>> tolerance and a max? What would you achieve with a minimum?
>>>
>>
>> I thought PETSc might not be iterating far enough, but after looking at
>> KSPSetTolerances it makes more sense.
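>>
>> For instance, something like this should let it iterate further (the
>> values here are just placeholders on my part):
>>
>>   ierr = KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,10000);CHKERRQ(ierr);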
>>
>>> Instead of 'preonly', where you do not want a residual, use 'richardson'
>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>
>>
>> So I tried that by executing
>> mpiexec -n 4 ./test -ksp_type richardson -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1
>> on the command line. However, nothing prints out on the terminal (aside
>> from my own PetscPrintf output, if I have it enabled).
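>>
>> Presumably I still have to request the residual output explicitly,
>> e.g. by adding
>>
>>   -ksp_monitor_true_residual -ksp_converged_reason
>>
>> to the command line?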
>>
>> Thank you.
>>
>> On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a few questions:
>>>>
>>>> 1. The following issue/misunderstanding:
>>>> My code reads in two files, one PETSc vector and one PETSc matrix (b
>>>> and A from Ax = b, size ~65000x65000), and then calls the KSP solver
>>>> to solve it, run in the terminal as:
>>>>
>>>> mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu
>>>> -pc_factor_mat_solver_type mumps
>>>>
>>>> Now MUMPS is supposed to work in parallel and with complex numbers,
>>>> but the system does not seem to be solved in parallel; it just prints
>>>> the result twice. Adding -log_view gives me
>>>>
>>>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed
>>>> twice.
>>>>
>>>
>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>> than the one you compiled PETSc with.
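>>>
>>> You can check which mpiexec you are picking up, e.g.
>>>
>>>   which mpiexec
>>>   ls $PETSC_DIR/$PETSC_ARCH/bin/mpiexec
>>>
>>> and if they differ, run with the full path to the one PETSc built.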
>>>
>>>
>>>> 2. Using iterative solvers, I am having difficulty getting convergence.
>>>> I found that there is a way to set the maximum number of iterations,
>>>> but is there also a minimum I can set?
>>>>
>>>
>>> Why would you want a minimum number of iterations? Why not set a
>>> tolerance and a max? What would you achieve with a minimum?
>>>
>>>
>>>> 3. The residual is not computed when using direct external solvers.
>>>> What is the proper PETSc way of doing this?
>>>>
>>>
>>> Instead of 'preonly', where you do not want a residual, use 'richardson'
>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
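>>>
>>> For example:
>>>
>>>   mpiexec -n 2 ./SolveSys -ksp_type gmres -ksp_max_it 1 -pc_type lu \
>>>     -pc_factor_mat_solver_type mumps -ksp_monitor_true_residual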
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>> ----------------------------------- The code -----------------------------------
>>>> #include <petscksp.h>
>>>> #include <iostream>
>>>>
>>>> static char help[] = "Solves Ax = b with A and b read from binary files.\n";
>>>>
>>>> int main(int argc,char **args)
>>>> {
>>>> Vec x,b; /* approx solution, RHS */
>>>> Mat A; /* linear system matrix */
>>>> KSP ksp; /* linear solver context */
>>>> PetscReal norm; /* norm of solution error */
>>>> PC pc;
>>>> PetscMPIInt rank, size;
>>>> PetscViewer viewer;
>>>> PetscInt its, i;
>>>> PetscErrorCode ierr;
>>>> PetscScalar *xa;
>>>> PetscBool flg = PETSC_FALSE;
>>>>
>>>> ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
>>>> MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>>>> MPI_Comm_size(PETSC_COMM_WORLD,&size);
>>>>
>>>> #if !defined(PETSC_USE_COMPLEX)
>>>> SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
>>>> #endif
>>>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>    Compute the matrix and right-hand-side vector that define
>>>>    the linear system, Ax = b.
>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
>>>> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>> ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr);
>>>> ierr = VecLoad(b,viewer);CHKERRQ(ierr);
>>>> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); /* release the first viewer before reusing it for A */
>>>>
>>>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
>>>> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>>>> ierr = MatLoad(A,viewer);CHKERRQ(ierr);
>>>> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>>>>
>>>> ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
>>>>
>>>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>    Create the linear solver and set various options
>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>> PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
>>>> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
>>>>
>>>>
>>>> PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
>>>> ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
>>>>
>>>> ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>>> ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
>>>> ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>>
>>>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>    Solve the linear system
>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>> ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>>>> ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
>>>> ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
>>>> PetscPrintf(PETSC_COMM_WORLD, "Solved");
>>>>
>>>> /*
>>>> Free work space. All PETSc objects should be destroyed when they
>>>> are no longer needed.
>>>> */
>>>> ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>>> ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>> ierr = VecDestroy(&b);CHKERRQ(ierr);
>>>> ierr = MatDestroy(&A);CHKERRQ(ierr);
>>>> ierr = PetscFinalize();
>>>> return ierr;
>>>> }
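>>>>
>>>> For completeness, I build it with roughly the following makefile
>>>> (assuming the standard PETSc makefile includes; the target name is
>>>> just mine):
>>>>
>>>> include ${PETSC_DIR}/lib/petsc/conf/variables
>>>> include ${PETSC_DIR}/lib/petsc/conf/rules
>>>>
>>>> SolveSys: SolveSys.o
>>>> 	-${CLINKER} -o SolveSys SolveSys.o ${PETSC_LIB}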
>>>>
>>>>
>>>> Kind regards,
>>>> Sal
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>