[petsc-users] Code break when PETSc not debugging

Manuel Valera mvalera-w at sdsu.edu
Tue Aug 14 16:09:01 CDT 2018


Thanks Jed,

I got the attached log; it looks like the crash is coming from one of my
routines, CorrectU4Pressure.F90. What other information can I get from this log?
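
(A sketch of one way to get more detail out of that trace, assuming the
standard GCC/gfortran flags: reconfigure the optimized build with -g added to
the optimization flags, e.g.

  --COPTFLAGS='-O2 -g' --CXXOPTFLAGS='-O2 -g' --FOPTFLAGS='-O2 -g'

and rebuild the application the same way. -g only adds debug information and
does not turn off -O2, so the timings stay representative while the stack
trace gains file and line numbers inside CorrectU4Pressure.F90.)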

Thanks,

On Tue, Aug 14, 2018 at 12:57 PM, Jed Brown <jed at jedbrown.org> wrote:

> Manuel Valera <mvalera-w at sdsu.edu> writes:
>
> > Hello everyone,
> >
> > I am working on running part of my code on a GPU. Recently I was able to
> > run the whole model using one P100 GPU and one processor, with good
> > timing, but using --with-debugging=1 as a configure argument.
> >
> > With this in mind, I compiled PETSc in a separate folder with the exact
> > same flags except --with-debugging=no, in order to do some profiling, but
> > this was enough to produce a segfault when running the code; it looks
> > like the error happens just after solving the linear system.
> >
> > Any idea on why this may be happening?
>
> Run in a debugger and send a stack trace.
>
> > My configure options:
> >
> >  ./configure PETSC_ARCH=cuda  --with-mpi-dir=/usr/lib64/openmpi
> > --COPTFLAGS='-O2' --CXXOPTFLAGS='-O2' --FOPTFLAGS='-O2'
> > --with-shared-libraries=1 --with-debugging=no --with-cuda=1
> > --CUDAFLAGS=-arch=sm_60  --with-blaslapack-dir=/usr/lib64
> > --download-viennacl
> >
> > My running arguments:
> >
> > mpirun -n 1 ./gcmLEP tc=TestCases/LockRelease/LE_401x6x101/ jid=cuda_dt0.1
> > -dm_vec_type viennacl -dm_mat_type aijviennacl -pc_type saviennacl
> > -log_view
> >
> > Thanks.
>
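
Following Jed's suggestion, a minimal sketch of getting a stack trace with
PETSc's built-in debugger hooks (standard PETSc runtime options, appended to
the run line above):

mpirun -n 1 ./gcmLEP tc=TestCases/LockRelease/LE_401x6x101/ jid=cuda_dt0.1 \
    -dm_vec_type viennacl -dm_mat_type aijviennacl -pc_type saviennacl \
    -on_error_attach_debugger

-on_error_attach_debugger attaches a debugger to the failing process when
PETSc catches the error; alternatively, -start_in_debugger launches each rank
under the debugger from startup. Since this is a single-rank run, starting the
binary directly under gdb ("gdb --args ./gcmLEP ...", then "run" and
"backtrace") may also work.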
Attachment: debug02.log (text/x-log, 2587 bytes)
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180814/ac3745d1/attachment-0001.bin>

