[petsc-dev] [External] Re: Issue with sacusp preconditioner.

Das, Rajeev rajeev.das at cnl.ca
Wed Mar 23 09:12:22 CDT 2016


Hi Karl:

I will try to find a way for you to reproduce the error.

Regards,

Rajeev Das.

-----Original Message-----
From: Karl Rupp [mailto:rupp at iue.tuwien.ac.at] 
Sent: March-23-16 10:08 AM
To: Das, Rajeev; petsc-dev at mcs.anl.gov
Cc: rajeevdas25 at gmail.com
Subject: [External] Re: [petsc-dev] Issue with sacusp preconditioner.

Hi Rajeev,

please use a single support channel only.

Something already goes wrong in the second iteration: num_rows is a 
nonsensical value. Is there any chance we can reproduce this issue?
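
For reference, a stripped-down standalone test along these lines would 
already help a lot. This is only a sketch: the type strings "aijcusp" and 
"sacusp" are what I would expect from a CUSP-enabled build, and the 1D 
Laplacian below merely stands in for the actual Pflotran system.

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    Vec            x, b;
    KSP            ksp;
    PC             pc;
    PetscInt       n = 32*32*32, i, col[3];
    PetscScalar    v[3];
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

    /* 1D Laplacian as a stand-in operator; assembled on the host, stored on the GPU */
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetType(A, "aijcusp");CHKERRQ(ierr);  /* assumed type name of the CUSP matrix */
    ierr = MatSetUp(A);CHKERRQ(ierr);
    for (i = 0; i < n; i++) {
      col[0] = i-1;  col[1] = i;   col[2] = i+1;
      v[0]   = -1.0; v[1]   = 2.0; v[2]   = -1.0;
      if (i == 0)        { ierr = MatSetValues(A, 1, &i, 2, &col[1], &v[1], INSERT_VALUES);CHKERRQ(ierr); }
      else if (i == n-1) { ierr = MatSetValues(A, 1, &i, 2, col, v, INSERT_VALUES);CHKERRQ(ierr); }
      else               { ierr = MatSetValues(A, 1, &i, 3, col, v, INSERT_VALUES);CHKERRQ(ierr); }
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);  /* vectors pick up the GPU type from A */
    ierr = VecSet(b, 1.0);CHKERRQ(ierr);

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetType(ksp, "bcgs");CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, "sacusp");CHKERRQ(ierr);   /* the preconditioner in question */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&b);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Building something like this against your --with-64-bit-indices=1 
installation and running it at the same problem size would already tell us 
whether the crash shows up outside of Pflotran.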

Best regards,
Karli

On 03/23/2016 02:39 PM, Das, Rajeev wrote:
>
> Hi:
>
> The scenario I considered involves a structured grid of size 32 x 32 x 32
> (using Pflotran here). With the bcgs solver and the jacobi preconditioner
> there were no issues, but when the sacusp preconditioner was used I
> received the following error:
>
> terminate called after throwing an instance of
> 'thrust::system::system_error'
>
>    what():  function_attributes(): after cudaFuncGetAttributes: an
> illegal memory access was encountered.
>
> I am attaching screenshots from my debugger (idb) that provide some
> detail on the source of this error. My PETSc configuration is
>
> --download-mpich=yes --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
> --with-shared-libraries=0 --with-debugging=1 --with-valgrind=1
> --download-parmetis=yes --download-metis=yes --download-hypre=yes
> --download-fblaslapack=yes --with-c2html=0 --with-cuda=1 --with-cusp=1
> --with-thrust=1 --with-64-bit-indices=1
>
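> On the runtime side, the options that select the CUSP path are roughly as
> follows (this is a sketch; the exact options are set through Pflotran
> rather than typed by hand):
>
>    -ksp_type bcgs -pc_type sacusp -vec_type cusp -mat_type aijcusp
>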
> I have to use 64-bit indices; without them I run into out-of-memory
> issues. The machine has an Intel Xeon E5-2620 and a Tesla K20c card with
> compute capability 3.5. When I reduced the grid size to 16 x 16 x 16, I no
> longer received the above error with sacusp.
>
> Can you point me to what the problem could be here?
>
> Regards,
>
> Rajeev Das.
>



