<div dir="ltr">When I use intel mpi, configuration, compile and test all work fine but I cannot use dll in my application. </div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 29, 2019 at 3:46 PM Sam Guo <<a href="mailto:sam.guo@cd-adapco.com">sam.guo@cd-adapco.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">After I removed following lines inin config/BuildSystem/config/package.py, configuration finished without error. <div><div> self.executeTest(self.checkDependencies)<br> self.executeTest(self.configureLibrary)<br> self.executeTest(self.checkSharedLibrary)</div><div><br></div><div>I then add my mpi wrapper to ${PTESTC_ARCH}/lib/petsc/conf/petscvariables:</div><div>PCC_LINKER_FLAGS = -MD -wd4996 -Z7 /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib</div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 29, 2019 at 3:28 PM Balay, Satish <<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Thu, 29 Aug 2019, Sam Guo via petsc-users wrote:<br>

> I can link when I add my wrapper to
> PCC_LINKER_FLAGS = -MD -wd4996 -Z7
> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib

I don't understand what you mean here. Add PCC_LINKER_FLAGS to where? This is a variable in a configure-generated makefile.

Since PETSc is not built [as configure failed], there should be no configure-generated makefiles.

> (I don't understand why configure does not include my wrapper)

Well, the compiler gives the error below. Can you try to compile a simple MPI code manually [i.e., without PETSc or any PETSc makefiles] - say cpi.c from MPICH - and see if it works? [And copy/paste the log from this compile attempt.]
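
For example, something like this should do - a trivial MPI program plus the include path, wrapper library, and flags taken from your configure options (the file name hello_mpi.c is arbitrary):

    /* hello_mpi.c - minimal check that the wrapper's mpi.h and .lib work together */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
      int rank, size;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      printf("Hello from rank %d of %d\n", rank, size);
      MPI_Finalize();
      return 0;
    }

Compiled and linked the same way configure does:

    /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl -MD -wd4996 -Z7 \
      -I/home/xianzhongg/dev/star/base/src/mpi/include hello_mpi.c \
      /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib Ws2_32.lib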

Satish

> 
> 
> On Thu, Aug 29, 2019 at 1:28 PM Matthew Knepley <knepley@gmail.com> wrote:
> 
> > On Thu, Aug 29, 2019 at 4:02 PM Sam Guo <sam.guo@cd-adapco.com> wrote:
> >
> >> Thanks for the quick response. Attached please find the configure.log
> >> containing the configure error.
> >>
> >
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > -c -o /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > -I/tmp/petsc-6DsCEk/config.compilers
> > -I/tmp/petsc-6DsCEk/config.setCompilers
> > -I/tmp/petsc-6DsCEk/config.utilities.closure
> > -I/tmp/petsc-6DsCEk/config.headers
> > -I/tmp/petsc-6DsCEk/config.utilities.cacheDetails
> > -I/tmp/petsc-6DsCEk/config.types -I/tmp/petsc-6DsCEk/config.atomics
> > -I/tmp/petsc-6DsCEk/config.functions
> > -I/tmp/petsc-6DsCEk/config.utilities.featureTestMacros
> > -I/tmp/petsc-6DsCEk/config.utilities.missing
> > -I/tmp/petsc-6DsCEk/PETSc.options.scalarTypes
> > -I/tmp/petsc-6DsCEk/config.libraries -MD -wd4996 -Z7
> > /tmp/petsc-6DsCEk/config.libraries/conftest.c
> > stdout: conftest.c
> > Successful compile:
> > Source:
> > #include "confdefs.h"
> > #include "conffix.h"
> > /* Override any gcc2 internal prototype to avoid an error. */
> > char MPI_Init();
> > static void _check_MPI_Init() { MPI_Init(); }
> > char MPI_Comm_create();
> > static void _check_MPI_Comm_create() { MPI_Comm_create(); }
> >
> > int main() {
> > _check_MPI_Init();
> > _check_MPI_Comm_create();;
> > return 0;
> > }
> > Executing: /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl
> > -o /tmp/petsc-6DsCEk/config.libraries/conftest.exe -MD -wd4996 -Z7
> > /tmp/petsc-6DsCEk/config.libraries/conftest.o
> > /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> > Ws2_32.lib
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> > Possible ERROR while running linker: exit code 2
> > stdout:
> > LINK : C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe not found or not
> > built by the last incremental link; performing full link
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Init
> > referenced in function _check_MPI_Init
> > conftest.obj : error LNK2019: unresolved external symbol MPI_Comm_create
> > referenced in function _check_MPI_Comm_create
> > C:\cygwin64\tmp\PE81BA~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120:
> > 2 unresolved externals
> >
> > The link is definitely failing. Does it work if you do it by hand?
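> >
> > For example (reusing the link line and paths from the configure.log above, and assuming
> > the conftest files are still in the temporary directory; the dumpbin check is just a suggestion):
> >
> >   cd /tmp/petsc-6DsCEk/config.libraries
> >   /home/xianzhongg/petsc-3.11.3/lib/petsc/bin/win32fe/win32fe cl -o conftest.exe -MD -wd4996 -Z7 \
> >     conftest.o /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib Ws2_32.lib
> >
> >   cd /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib
> >   dumpbin /symbols StarMpiWrapper.lib | grep -i mpi_init
> >
> > If MPI_Init only shows up there with a decorated or C++-mangled name, that would explain the
> > unresolved externals.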
> >
> > Thanks,
> >
> > Matt
> >
> >
> >> Regarding our dup, our wrapper does support it. In fact, everything works
> >> fine on Linux. I suspect that on Windows, PETSc picks up the system mpi.h somehow.
> >> I am investigating it.
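> >>
> >> To check which mpi.h gets picked up, my plan (nothing configure itself does) is to compile a
> >> one-line file containing only "#include <mpi.h>" with cl's /showIncludes option from a Windows
> >> prompt and read off the reported path; the file name and the include path below are placeholders:
> >>
> >>   rem which_mpi.c contains the single line:  #include <mpi.h>
> >>   cl /nologo /c /showIncludes /I"C:\path\to\our\wrapper\include" which_mpi.c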
> >>
> >> Thanks,
> >> Sam
> >>
> >> On Thu, Aug 29, 2019 at 3:39 PM Matthew Knepley <knepley@gmail.com>
> >> wrote:
> >>
> >>> On Thu, Aug 29, 2019 at 3:33 PM Sam Guo via petsc-users <
> >>> petsc-users@mcs.anl.gov> wrote:
> >>>
> >>>> Dear PETSc dev team,
> >>>> I am looking for some tips on porting PETSc to Windows. We have our own MPI
> >>>> wrapper (so we can switch between different MPI implementations). I configure
> >>>> PETSc using --with-mpi-lib and --with-mpi-include:
> >>>>
> >>>> ./configure --with-cc="win32fe cl" --with-fc=0
> >>>> --download-f2cblaslapack
> >>>> --with-mpi-lib=/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >>>> --with-mpi-include=/home/xianzhongg/dev/star/base/src/mpi/include
> >>>> --with-shared-libaries=1
> >>>>
> >>>> But I got this error:
> >>>>
> >>>> ===============================================================================
> >>>>              Configuring PETSc to compile on your system
> >>>> ===============================================================================
> >>>> TESTING: check from
> >>>> config.libraries(config/BuildSystem/config/libraries.py:154)
> >>>> *******************************************************************************
> >>>>          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log
> >>>>          for details):
> >>>> -------------------------------------------------------------------------------
> >>>> --with-mpi-lib=['/home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib']
> >>>> and
> >>>> --with-mpi-include=['/home/xianzhongg/dev/star/base/src/mpi/include']
> >>>> did not work
> >>>> *******************************************************************************
> >>>>
> >>>
> >>> Your MPI wrapper should pass the tests here. Send the configure.log.
> >>>
> >>>
> >>>> To fix the configuration error, in
> >>>> config/BuildSystem/config/package.py, I removed
> >>>>     self.executeTest(self.checkDependencies)
> >>>>     self.executeTest(self.configureLibrary)
> >>>>     self.executeTest(self.checkSharedLibrary)
> >>>>
> >>>> To link, I added my MPI wrapper
> >>>> to ${PETSC_ARCH}/lib/petsc/conf/petscvariables:
> >>>> PCC_LINKER_FLAGS = -MD -wd4996 -Z7
> >>>> /home/xianzhongg/dev/star/lib/win64/intel18.3vc14/lib/StarMpiWrapper.lib
> >>>>
> >>>> I got libpetsc.dll and libpetsc.lib. When I try to test them inside our
> >>>> code, PETSc somehow creates a duplicate communicator with only 1 MPI
> >>>> process, and PETSC_COMM_WORLD is set to 2. If I set PETSC_COMM_WORLD to 1
> >>>> (our MPI_COMM_WORLD), PETSc hangs.
> >>>>
> >>>
> >>> We do dup the communicator on entry. Shouldn't that be supported by your
> >>> wrapper?
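> >>>
> >>> Roughly, PetscInitialize dups the communicator that PETSC_COMM_WORLD is set to and caches
> >>> its own inner communicator on it through MPI attributes, so the wrapper needs MPI_Comm_dup
> >>> and the keyval/attribute calls to work, and PETSC_COMM_WORLD must be a valid MPI_Comm
> >>> handle assigned before PetscInitialize, not the integer 1 or 2. A minimal sketch of the
> >>> intended usage (simplified, not the literal PETSc source):
> >>>
> >>>   #include <petscsys.h>
> >>>
> >>>   int main(int argc, char **argv)
> >>>   {
> >>>     MPI_Init(&argc, &argv);                /* or your wrapper's init call */
> >>>     PETSC_COMM_WORLD = MPI_COMM_WORLD;     /* a real MPI_Comm handle, set before PetscInitialize */
> >>>     PetscInitialize(&argc, &argv, NULL, NULL);
> >>>
> >>>     /* PetscInitialize then attaches an attribute to PETSC_COMM_WORLD via a keyval and
> >>>        MPI_Comm_dup()s it into an inner communicator - that is the dup you are seeing,
> >>>        so MPI_Comm_dup, MPI_Comm_create_keyval, and MPI_Comm_set_attr/get_attr all need
> >>>        to work in the wrapper. */
> >>>
> >>>     PetscFinalize();
> >>>     MPI_Finalize();                        /* we initialized MPI ourselves, so we finalize it */
> >>>     return 0;
> >>>   }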
> >>>
> >>> Thanks,
> >>>
> >>> Matt
> >>>
> >>>
> >>>> I am wondering if you could give me some tips on how to debug this problem.
> >>>>
> >>>> BR,
> >>>> Sam
> >>>>
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which their
> > experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
> > <http://www.cse.buffalo.edu/~knepley/>
> >
> 