[petsc-users] PetscBinaryRead and quad precision

Matthew Knepley knepley at gmail.com
Sat Jun 27 05:54:06 CDT 2015


On Fri, Jun 26, 2015 at 10:46 PM, Justin Chang <jychang48 at gmail.com> wrote:

> Barry,
>
> Thank you, that kind of did the trick, although I have run into a couple
> of issues:
>
> 1) Is quad precision expected to perform significantly worse than double
> precision in terms of wall-clock time? What took 60 seconds to solve with
> double precision now takes approximately 1500 seconds. I am guessing this
> is largely because scalars are now 16 bytes instead of 8?
>
> 2) The program crashes at DMPlexDistribute() when I use two or more
> processors. Here's the error log:
>

I am probably not being careful to insulate ParMetis from PetscReal. I will
have to look at it.

  Thanks,

     Matt


> Input Error: Incorrect sum of 0.000000 for tpwgts for constraint 0.
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Error in external library
> [0]PETSC ERROR: Error in METIS_PartGraphKway()
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.6-815-g2d9afd9  GIT
> Date: 2015-06-26 18:48:28 -0500
> [0]PETSC ERROR: ./main on a arch-linux2-quad-opt named pacotaco-xps by
> justin Fri Jun 26 22:29:36 2015
> [0]PETSC ERROR: Configure options --download-f2cblaslapack
> --download-metis --download-mpich --download-parmetis --download-triangle
> --with-cc=gcc --with-cmake=cmake --with-cxx=g++ --with-debugging=0
> --with-fc=gfortran --with-valgrind=1 PETSC_ARCH=arch-linux2-quad-opt
> --with-precision=__float128
> [0]PETSC ERROR: #1 PetscPartitionerPartition_ParMetis() line 1181 in
> /home/justin/Software/petsc-dev/src/dm/impls/plex/plexpartition.c
> [0]PETSC ERROR: #2 PetscPartitionerPartition() line 653 in
> /home/justin/Software/petsc-dev/src/dm/impls/plex/plexpartition.c
> [0]PETSC ERROR: #3 DMPlexDistribute() line 1505 in
> /home/justin/Software/petsc-dev/src/dm/impls/plex/plexdistribute.c
> [0]PETSC ERROR: #4 CreateMesh() line 762 in
> /home/justin/Dropbox/DMPlex-nonneg/main.c
> [0]PETSC ERROR: #5 main() line 993 in
> /home/justin/Dropbox/DMPlex-nonneg/main.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -al 1
> [0]PETSC ERROR: -am 0
> [0]PETSC ERROR: -at 0.001
> [0]PETSC ERROR: -bcloc
> 0,1,0,1,0,0,0,1,0,1,1,1,0,0,0,1,0,1,1,1,0,1,0,1,0,1,0,0,0,1,0,1,1,1,0,1,0.45,0.55,0.45,0.55,0.45,0.55
> [0]PETSC ERROR: -bcnum 7
> [0]PETSC ERROR: -bcval 0,0,0,0,0,0,1
> [0]PETSC ERROR: -binary_read_double
> [0]PETSC ERROR: -dim 3
> [0]PETSC ERROR: -dm_refine 1
> [0]PETSC ERROR: -dt 0.001
> [0]PETSC ERROR: -edges 3,3
> [0]PETSC ERROR: -floc 0.25,0.75,0.25,0.75,0.25,0.75
> [0]PETSC ERROR: -fnum 0
> [0]PETSC ERROR: -ftime 0,99
> [0]PETSC ERROR: -fval 1
> [0]PETSC ERROR: -ksp_max_it 50000
> [0]PETSC ERROR: -ksp_rtol 1.0e-10
> [0]PETSC ERROR: -ksp_type cg
> [0]PETSC ERROR: -log_summary
> [0]PETSC ERROR: -lower 0,0
> [0]PETSC ERROR: -mat_petscspace_order 0
> [0]PETSC ERROR: -mesh datafiles/cube_with_hole4_mesh.dat
> [0]PETSC ERROR: -mu 1
> [0]PETSC ERROR: -nonneg 1
> [0]PETSC ERROR: -numsteps 0
> [0]PETSC ERROR: -options_left 0
> [0]PETSC ERROR: -pc_type jacobi
> [0]PETSC ERROR: -petscpartitioner_type parmetis
> [0]PETSC ERROR: -progress 1
> [0]PETSC ERROR: -simplex 1
> [0]PETSC ERROR: -solution_petscspace_order 1
> [0]PETSC ERROR: -tao_fatol 1e-8
> [0]PETSC ERROR: -tao_frtol 1e-8
> [0]PETSC ERROR: -tao_max_it 50000
> [0]PETSC ERROR: -tao_monitor
> [0]PETSC ERROR: -tao_type blmvm
> [0]PETSC ERROR: -tao_view
> [0]PETSC ERROR: -trans datafiles/cube_with_hole4_trans.dat
> [0]PETSC ERROR: -upper 1,1
> [0]PETSC ERROR: -vtuname figures/cube_with_hole_4
> [0]PETSC ERROR: -vtuprint 1
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
>
>
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   PID 12626 RUNNING AT pacotaco-xps
> =   EXIT CODE: 76
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>
> ===================================================================================
>
> Know what's going on here?
>
> Thanks,
> Justin
>
> On Fri, Jun 26, 2015 at 11:24 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>    We have an undocumented option, -binary_read_double, that reads double
>> precision from the file and places it into a quad-precision array. This is
>> exactly what you need.
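>>
>>    For example, it should be enough to add the flag to an existing run
>> (the executable and mesh file names below are taken from your log; no
>> change to the reading code itself should be needed):
>>
>>       ./main -binary_read_double \
>>         -mesh datafiles/cube_with_hole4_mesh.dat <your other options>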
>>
>>    Barry
>>
>> Yes, we should fix up our binary viewers to allow reading and writing
>> generally in any precision, but we need a volunteer to do it.
>>
>>
>>
>> > On Jun 26, 2015, at 1:45 AM, Justin Chang <jychang48 at gmail.com> wrote:
>> >
>> > Hi all,
>> >
>> > I need to run simulations that rely on several of my custom binary
>> > datafiles (written as 32-bit ints and 64-bit doubles). These data files
>> > were generated from MATLAB. In my PETSc code I call PetscBinaryRead(...)
>> > on these binary files, which gives me mesh data, auxiliaries, etc.
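>> >
>> > Roughly, the read pattern in my code looks like the following (just a
>> > simplified sketch; the function name, sizes, and file layout are
>> > placeholders, not my actual code):
>> >
>> >   #include <petscsys.h>
>> >
>> >   /* sketch: read connectivity (PetscInt) and coordinates (PetscScalar)
>> >      from one of the custom binary files */
>> >   static PetscErrorCode ReadMeshFile(const char *name, PetscInt nconn,
>> >                                      PetscInt ncoord, PetscInt **conn,
>> >                                      PetscScalar **coords)
>> >   {
>> >     int            fd;
>> >     PetscErrorCode ierr;
>> >
>> >     PetscFunctionBeginUser;
>> >     ierr = PetscMalloc1(nconn, conn);CHKERRQ(ierr);
>> >     ierr = PetscMalloc1(ncoord, coords);CHKERRQ(ierr);
>> >     ierr = PetscBinaryOpen(name, FILE_MODE_READ, &fd);CHKERRQ(ierr);
>> >     ierr = PetscBinaryRead(fd, *conn, nconn, PETSC_INT);CHKERRQ(ierr);
>> >     ierr = PetscBinaryRead(fd, *coords, ncoord, PETSC_SCALAR);CHKERRQ(ierr);
>> >     ierr = PetscBinaryClose(fd);CHKERRQ(ierr);
>> >     PetscFunctionReturn(0);
>> >   }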
>> >
>> > However, when I now configure with quad precision
>> > (--with-precision=__float128 and --download-f2cblaslapack), my
>> > PetscBinaryRead() calls give segmentation violations. I am guessing this
>> > is because the binary files were written in double precision while PETSc
>> > now reads in quad precision, so the larger strides read past the end of
>> > the files.
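>> >
>> > The only workaround I can think of on my side is to read the raw doubles
>> > and widen them by hand, roughly like the sketch below (assuming
>> > PetscBinaryRead accepts PETSC_DOUBLE as the data type here; the helper
>> > name is made up, and it assumes <petscsys.h> is included):
>> >
>> >   /* sketch: read n 8-byte doubles (the stride the file actually has)
>> >      and widen them into a quad-precision PetscScalar array */
>> >   static PetscErrorCode ReadDoublesAsScalars(int fd, PetscInt n, PetscScalar *buf)
>> >   {
>> >     double         *tmp;
>> >     PetscInt        i;
>> >     PetscErrorCode  ierr;
>> >
>> >     PetscFunctionBeginUser;
>> >     ierr = PetscMalloc1(n, &tmp);CHKERRQ(ierr);
>> >     ierr = PetscBinaryRead(fd, tmp, n, PETSC_DOUBLE);CHKERRQ(ierr);
>> >     for (i = 0; i < n; i++) buf[i] = (PetscScalar) tmp[i];
>> >     ierr = PetscFree(tmp);CHKERRQ(ierr);
>> >     PetscFunctionReturn(0);
>> >   }
>> >
>> > but I would rather not scatter something like that everywhere.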
>> >
>> > So my question is, is there a way to circumvent this issue? That is, to
>> > read double-precision binary data into a PETSc program configured with
>> > quad precision? Otherwise I would have to rewrite or redo all of my
>> > datafiles, which I would prefer not to do if possible.
>> >
>> > Thanks,
>> > Justin
>> >
>>
>>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener