[petsc-dev] [petsc-maint] incorporate STRUMPACK in PETSc
Pieter Ghysels
pghysels at lbl.gov
Wed May 4 16:27:09 CDT 2016
Hello,
I have created a pull request (#464) for the interface to STRUMPACK-sparse:
https://bitbucket.org/petsc/petsc/pull-requests/464/add-interface-to-strumpack-sparse/diff
I look forward to your feedback.
This pull request adds two interfaces to STRUMPACK-sparse: one for aij/seq
matrices and one for aij/mpi matrices. It also adds a script to the
buildsystem to download, compile, and install STRUMPACK-sparse. This should
work for real/complex scalars, single/double precision, and 32/64-bit
integers, with OpenMP+MPI.
I can get everything compiled with:

./configure --download-strumpack --with-cxx-dialect=C++11 \
--with-openmp --download-scalapack --download-parmetis \
--download-metis --download-ptscotch

(There was an issue compiling STRUMPACK-sparse as a shared library, so it is
now always built as a static library.)
I tested the new code with the src/snes/examples/tutorials/ex5.c example:
OMP_NUM_THREADS=2 mpirun -n 2 ./ex5 \
-da_grid_x 100 -da_grid_y 100 \
-pc_type lu -pc_factor_mat_solver_package strumpack_mpi \
-ksp_monitor -mat_strumpack_mpi_matinput DISTRIBUTED \
--sp_mc64job 0 --sp_reordering_method scotch --sp_verbose
The aij/mpi solver is called 'strumpack_mpi'; the aij/seq solver is
'strumpack'. All command-line options are passed on to the STRUMPACK solver,
including -h, which prints the available STRUMPACK options.
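For example, to list those options you could run something like this
(a hypothetical invocation, along the same lines as the ex5 run above):

mpirun -n 1 ./ex5 -pc_type lu -pc_factor_mat_solver_package strumpack -h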
STRUMPACK-sparse has three interfaces:
- A sequential/multithreaded interface. This one is used with aij/seq
matrices by selecting the 'strumpack' solver.
- A distributed-memory solver with a replicated interface. This interface
takes the same input on every MPI process (the entire matrix and the rhs and
solution vectors). Select it for aij/mpi matrices with "-pc_type lu
-pc_factor_mat_solver_package strumpack_mpi -mat_strumpack_mpi_matinput
GLOBAL".
- A fully distributed solver, which works directly with the block-row
distributed aij/mpi input matrix. Use it with "-pc_type lu
-pc_factor_mat_solver_package strumpack_mpi -mat_strumpack_mpi_matinput
DISTRIBUTED". (A code sketch for this case follows below.)
Pieter Ghysels
On Wed, May 4, 2016 at 12:06 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Ghysels should submit a pull request at bitbucket.org/petsc/petsc
> (sorry, we do our development on Bitbucket, not GitHub) and we'll take a
> look at it, make any suggestions, and then merge it in.
>
> Thanks
>
> Barry
>
> We provide details on how to create the pull request at
> https://bitbucket.org/petsc/petsc/wiki/pull-request-instructions-git
>
> Please follow the style guide at
> http://www.mcs.anl.gov/petsc/developers/developers.pdf
>
>
> > On May 4, 2016, at 1:55 PM, Xiaoye S. Li <xsli at lbl.gov> wrote:
> >
> > Hello,
> > We have a new package, STRUMPACK, which uses low-rank factorization
> > techniques, either as a direct solver or as a preconditioner. Currently
> > it supports only the HSS structure; in the future we will implement other
> > structured formats. Please see the web site (sparse version:
> > STRUMPACK-sparse-1.0.0):
> >
> > http://portal.nersc.gov/project/sparse/strumpack/
> >
> > We would like to incorporate STRUMPACK in PETSc. In particular, the CEMM
> > SciDAC team (lead: Steve Jardin) is very interested in using it from
> > PETSc. (We tested their matrices in standalone mode; it works very well.)
> >
> > Ghysels has already implemented all the necessary hooks. Can he just
> > contribute his changes to PETSc's GitHub, so that you can merge them into
> > the next release?
> >
> > Thanks,
> > Sherry
> >
>
>