[petsc-users] petsc on windows
Matthew Knepley
knepley at gmail.com
Thu Aug 29 14:38:47 CDT 2019
On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <petsc-users at mcs.anl.gov> wrote:
> Dear PETSc dev team,
> I am looking for some tips on porting PETSc to Windows. We have our own
> MPI wrapper (so we can switch between different MPI implementations). I
> configure PETSc using --with-mpi-lib and --with-mpi-include:
> ./configure --with-cc="win32fe cl" --with-fc=0 --download-f2cblaslapack
> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
> --with-shared-libraries=1
>
> But I got this error:
>
> ===============================================================================
> Configuring PETSc to compile on your system
>
> ===============================================================================
> TESTING: check from
> config.libraries(config/BuildSystem/config/libraries.py:154)
> *******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> details):
>
> -------------------------------------------------------------------------------
> --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
> and
> --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include'] did
> not work
>
> *******************************************************************************
>
Your MPI wrapper should pass the tests here. Please send the configure.log.
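For context, the check that fails here essentially boils down to compiling a tiny MPI program against the --with-mpi-include directory and linking it against the --with-mpi-lib library. A rough sketch of that kind of probe (not the literal test configure runs) is:

  /* Sketch of a configure-style MPI probe: it must compile with
     -I<--with-mpi-include> and link against <--with-mpi-lib>.
     If this fails, configure rejects the given options. */
  #include <mpi.h>

  int main(int argc, char **argv)
  {
    int rank;
    MPI_Init(&argc, &argv);               /* must resolve from the wrapper lib */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* basic symbol/ABI check */
    MPI_Finalize();
    return 0;
  }

If the wrapper exports the standard MPI symbols with the expected calling convention, a probe like this should link cleanly; the configure.log shows the exact compile and link commands that failed.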
> To get past the configuration error, I removed the following from
> config/BuildSystem/config/package.py:
> self.executeTest(self.checkDependencies)
> self.executeTest(self.configureLibrary)
> self.executeTest(self.checkSharedLibrary)
>
> To link, I added my MPI wrapper
> to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> PCC_LINKER_FLAGS = -MD -wd4996 -Z7
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
>
> I got libpetsc.dll and libpetsc.lib. When I try to test it inside our
> code, PETSc somehow creates a duplicate of the communicator with only 1
> MPI process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD
> to 1 (our MPI_COMM_WORLD), PETSc hangs.
>
We do dup the communicator on entry. Shouldn't that be supported by your
wrapper?
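For reference, the supported way to hand PETSc a specific communicator is to assign the global PETSC_COMM_WORLD before calling PetscInitialize(); PETSc then calls MPI_Comm_dup() on whatever it finds there. A minimal sketch, where my_wrapper_comm is a hypothetical stand-in for the communicator your wrapper provides:

  /* Sketch: giving PETSc a custom communicator. PETSC_COMM_WORLD must be
     assigned before PetscInitialize(); PETSc dups it internally.
     "my_wrapper_comm" is a hypothetical handle from the MPI wrapper. */
  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    MPI_Comm my_wrapper_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_dup(MPI_COMM_WORLD, &my_wrapper_comm); /* stand-in for the wrapper's comm */

    PETSC_COMM_WORLD = my_wrapper_comm;        /* must happen before PetscInitialize */
    PetscInitialize(&argc, &argv, NULL, NULL); /* PETSc dups PETSC_COMM_WORLD here */

    /* ... use PETSc ... */

    PetscFinalize();
    MPI_Comm_free(&my_wrapper_comm);
    MPI_Finalize();
    return 0;
  }

If the wrapper's communicator handles are not genuine MPI_Comm values from the underlying MPI, the internal MPI_Comm_dup() is the first place that mismatch will surface.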
Thanks,
Matt
> I am wondering if you could give me some tips on how to debug this problem.
>
> BR,
> Sam
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/