[petsc-dev] [petsc-maint #61515] data size of PETSc-dev GPU computing
li.luo at siat.ac.cn
Mon Jan 17 22:49:26 CST 2011
Hi, Barry.
I have tested several examples and hit the same problem.
The attachments are the results for ex2 in ksp/examples/tutorials,
4GPU-4CPU, at two grid sizes (128*128 and 256*256), with and without CUDA.
Each attachment includes the KSP view, the summary log, and the database options line.
The result is:
for 128*128, the solver converges both with and without CUDA;
for 256*256, it converges only without CUDA.
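For reference, the runs above can be reproduced with command lines along these lines. This is only a sketch: the exact GPU option values for petsc-dev of that era (-vec_type cuda, -mat_type aijcuda) are my assumption, inferred from the MPICUDA type mentioned later in this thread; the monitoring options themselves (-ksp_view, -ksp_converged_reason, -ksp_monitor_true_residual, -log_summary) are standard PETSc options.

```shell
# Hypothetical reproduction of the 256*256, 4-process run with CUDA enabled.
# The -vec_type/-mat_type values are assumed from the petsc-dev GPU support of the time.
mpiexec -n 4 ./ex2 -m 256 -n 256 \
    -vec_type cuda -mat_type aijcuda \
    -ksp_view -ksp_monitor_true_residual -ksp_converged_reason -log_summary

# Same run on the CPU for comparison (default vec/mat types):
mpiexec -n 4 ./ex2 -m 256 -n 256 \
    -ksp_view -ksp_monitor_true_residual -ksp_converged_reason -log_summary
```

Comparing the -ksp_monitor_true_residual output of the two runs should show where the CUDA run diverges.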
Thanks for your help.
Regards,
Li Luo
> -----Original Message-----
> From: "Barry Smith" <bsmith at mcs.anl.gov>
> Sent: Tuesday, January 18, 2011
> To: petsc-maint <petsc-maint at mcs.anl.gov>, li.luo at siat.ac.cn, "For users of the development version of PETSc" <petsc-dev at mcs.anl.gov>
> Cc:
> Subject: Re: [petsc-maint #61515] data size of PETSc-dev GPU computing
>
>
> Hmm, could be a bug, could be the algorithm. Run with -ksp_view and send the output
>
> What problem are you solving, is it a PETSc example?
>
> Barry
>
> On Jan 17, 2011, at 8:31 PM, li.luo at siat.ac.cn wrote:
>
> > Hi,
> >
> >
> > I ran into a problem when testing some examples from PETSc-dev for GPU computing:
> > if one proc pair (1GPU-1CPU) is used, the grid size can be enlarged even to 2048*2048, up to the memory limit;
> > however, if more than one proc pair is used, for example 4GPU-4CPU, the grid size is limited to about 200*200; any larger and the KSP solver does not converge. The same problem occurs with 8GPU-8CPU, limited to about 500*500 or so.
> >
> > I wonder whether you have seen the same problem. Could there be an error in the MPICUDA type?
> >
> > Regards,
> > Li Luo
> >
> >
> >>> # Machine type:
> >>> CPU: Intel(R) Xeon(R) CPU E5520
> >>> GPU: Tesla T10
> >>> CUDA Driver Version: 3.20
> >>> CUDA Capability Major/Minor version number: 1.3
> >>>
> >>> # OS Version:
> >>> Linux console 2.6.18-128.el5 #1 SMP Wed Dec 17 11:41:38 EST 2008 x86_64 x86_64 x86_64 GNU/Linux
> >>>
> >>> # PETSc Version:
> >>> #define PETSC_VERSION_RELEASE 0
> >>> #define PETSC_VERSION_MAJOR 3
> >>> #define PETSC_VERSION_MINOR 1
> >>> #define PETSC_VERSION_SUBMINOR 0
> >>> #define PETSC_VERSION_PATCH 6
> >>> #define PETSC_VERSION_DATE "Mar, 25, 2010"
> >>> #define PETSC_VERSION_PATCH_DATE "Thu Dec 9 00:02:47 CST 2010"
> >>>
> >>>
> >>> # MPI implementation:
> >>> ictce3.2/impi/3.2.0.011/
> >>>
> >>> # Compiler:
> >>>
> >>>
> >>> # Probable PETSc component:
> >>> run with GPU
> >>> # Configure
> >>> ./config/configure.py --download-f-blas-lapack=1 --with-mpi-dir=/bwfs/software/mpich2-1.2.1p1 --with-shared-libraries=0 --with-debugging=no --with-cuda-dir=/bwfs/home/liluo/cuda3.2_64 --with-thrust-dir=/bwfs/home/liluo/cuda3.2_64/include/thrust --with-cusp-dir=/bwfs/home/liluo/cuda3.2_64/include/cusp-library
>
-------------- next part --------------
Attachments (embedded, charset-unspecified text, scrubbed by the archive):
Name: ksp_poisson2d_128_4_with_cuda_view.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20110118/0e54eac0/attachment.txt>
Name: ksp_poisson2d_128_4_without_cuda_view.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20110118/0e54eac0/attachment-0001.txt>
Name: ksp_poisson2d_256_4_with_cuda_view.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20110118/0e54eac0/attachment-0002.txt>
Name: ksp_poisson2d_256_4_without_cuda_view.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20110118/0e54eac0/attachment-0003.txt>