[petsc-users] Fortran example for PETSc global to natural ordering test
Danyang Su
danyang.su at gmail.com
Thu Feb 28 14:33:11 CST 2019
Hi Matt,
I have a couple of further questions about global to natural ordering with the
latest PETSc-dev version.

1) Is the mesh reordered when DMPlexCreateFromCellList is used? I ran
some tests on this but found that the natural order was not the same as
that of the mesh I fed to DMPlex. Perhaps I am doing something wrong with
DMPlexGlobalToNaturalBegin/End?
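In case it helps, the setup I am assuming follows your earlier suggestion
quoted below. This is only a simplified sketch of what the attached
natural.F90 does; the variable names are illustrative:

    DM :: dm, dmdist
    PetscSF :: migrationsf
    PetscInt :: stencil_width
    PetscErrorCode :: ierr

    ! enable natural ordering before distributing, then attach the
    ! migration SF to the distributed DM
    call DMSetUseNatural(dm,PETSC_TRUE,ierr)
    CHKERRQ(ierr)
    call DMPlexDistribute(dm,stencil_width,migrationsf,dmdist,ierr)
    CHKERRQ(ierr)
    if (dmdist /= PETSC_NULL_DM) then
      call DMPlexSetMigrationSF(dmdist,migrationsf,ierr)
      CHKERRQ(ierr)
      call PetscSFDestroy(migrationsf,ierr)
      CHKERRQ(ierr)
      call DMDestroy(dm,ierr)
      CHKERRQ(ierr)
      dm = dmdist
    end if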
2) If I pass null labels to DMPlexCreateSection, label lookup does not
work and the returned value is -1, which is expected since no labels were
passed. However, if I pass valid labels to DMPlexCreateSection,
DMCreateGlobalVector returns a vector of size 1 on every processor. With
PETSc 3.10.3 or earlier, where the labels are set separately, everything
works fine.
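For case 2), the check is essentially the following. Again this is only a
simplified sketch: I attach the section returned by DMPlexCreateSection with
DMSetSection, and the variable names are illustrative:

    PetscSection :: section
    Vec :: vec_global
    PetscInt :: nsize, nsize_loc
    PetscErrorCode :: ierr

    ! attach the section produced by DMPlexCreateSection, then check sizes
    call DMSetSection(dm,section,ierr)
    CHKERRQ(ierr)
    call DMCreateGlobalVector(dm,vec_global,ierr)
    CHKERRQ(ierr)
    ! this is where I see size 1 on every processor when valid labels
    ! are passed to DMPlexCreateSection
    call VecGetSize(vec_global,nsize,ierr)
    CHKERRQ(ierr)
    call VecGetLocalSize(vec_global,nsize_loc,ierr)
    CHKERRQ(ierr)
    write(*,*) 'global vec size',nsize,' local size',nsize_loc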
Attached is the example I use: an updated version of the Fortran example
that reads a mesh from an external file, distributes it over the
processors, and then checks the global to natural ordering.

To compile the code, type 'make natural'. Below is the screen output when
running the code on 4 processors; a sketch of the coordinate check follows
the output. The local node index (node-loc) and the natural node index
(node-nat) look fine, but the coordinates do not match, unfortunately.
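I run it with something like 'mpiexec -n 4 ./natural' (the exact MPI
launcher may differ on your system).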
+rank 0 node-loc 1 node-petsc 0 node-nat 14 node-label -1
   coord-loc 0.000E+00 0.000E+00 0.200E+01  coord-nat 0.100E+01 0.000E+00 0.000E+00
+rank 0 node-loc 2 node-petsc 1 node-nat 30 node-label -1
   coord-loc 0.800E+00 0.000E+00 0.120E+01  coord-nat 0.110E+01 0.000E+00 0.800E+00
+rank 0 node-loc 3 node-petsc 2 node-nat 31 node-label -1
   coord-loc 0.000E+00 0.000E+00 0.150E+01  coord-nat 0.367E+00 0.000E+00 0.123E+01
+rank 1 node-loc 1 node-petsc 13 node-nat 47 node-label -1
   coord-loc 0.000E+00 0.000E+00 0.000E+00  coord-nat 0.197E+01 0.000E+00 0.467E+00
+rank 1 node-loc 2 node-petsc 14 node-nat 48 node-label -1
   coord-loc 0.200E+00 0.000E+00 0.800E+00  coord-nat 0.632E+00 0.000E+00 0.949E+00
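For reference, the coordinate check that produces the coord-nat values
above is essentially the following (a simplified sketch of what the
attached natural.F90 does; vec_global is assumed to hold the owned vertex
coordinates with ndim dof per vertex, and the variable names are
illustrative):

    Vec :: vec_global, vec_natural
    PetscScalar, pointer :: vecpointer(:)
    PetscErrorCode :: ierr

    call VecDuplicate(vec_global,vec_natural,ierr)
    CHKERRQ(ierr)
    call DMPlexGlobalToNaturalBegin(dm,vec_global,vec_natural,ierr)
    CHKERRQ(ierr)
    call DMPlexGlobalToNaturalEnd(dm,vec_global,vec_natural,ierr)
    CHKERRQ(ierr)
    call VecGetArrayReadF90(vec_natural,vecpointer,ierr)
    CHKERRQ(ierr)
    ! coord-nat is printed from vecpointer and compared against the
    ! coordinates originally passed to DMPlexCreateFromCellList
    call VecRestoreArrayReadF90(vec_natural,vecpointer,ierr)
    CHKERRQ(ierr)

If the natural ordering were applied as I expect, coord-nat should
reproduce the input coordinates in their original (file) order.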
Is there any chance you could run a quick test or point out any mistakes
I've made?
Thanks,
Danyang
On 2018-12-04 11:06 a.m., Matthew Knepley wrote:
> On Mon, Dec 3, 2018 at 8:32 PM Danyang Su <danyang.su at gmail.com> wrote:
>
> Hi Matt,
>
> Attached is the test example, with source code, makefile, data, and
> screen output, that I wrote this afternoon. The example reads a 2D mesh
> from a vtk file and then distributes it over all processors. I get the
> correct global order of the local nodes after distribution, but the
> natural order of the local nodes is always zero after calling
> DMPlexGlobalToNaturalBegin/End(). It seems it does not take effect.
>
> Would you mind helping me check/test this code at your earliest
> convenience?
>
> On 2 procs, I get an SEGV
>
> rank1 local nodes with ghost 47 local cells with ghost 75
>
> rank0 local nodes with ghost 52 local cells with ghost 78
>
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
>
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
>
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>
> [1]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
>
> [1]PETSC ERROR: likely location of problem given in stack below
>
> [1]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
>
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
>
> [1]PETSC ERROR: INSTEAD the line number of the start of the function
>
> [1]PETSC ERROR: is given.
>
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
>
> [1]PETSC ERROR: Signal received
>
> [1]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>
> [1]PETSC ERROR: Petsc Development GIT revision:
> v3.9.3-1021-g8625415  GIT Date: 2018-08-02 12:57:14 -0500
>
> [1]PETSC ERROR: Unknown Name on a arch-master-debug named
> MATTHEW-KNEPLEYs-MacBook-Air-2.local by knepley Tue Dec 4 14:00:46 2018
>
> [1]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
> --download-chaco
> --download-cmake=/Users/knepley/Downloads/cmake-3.7.2.tar.gz
> --download-ctetgen --download-eigen --download-fftw --download-hdf5
> --download-med --download-metis --download-mpich --download-netcdf
> --download-p4est --download-parmetis --download-pnetcdf
> --download-superlu_dist --download-triangle
> --with-cc="/Users/knepley/MacSoftware/bin/ccache gcc
> -Qunused-arguments" --with-cxx="/Users/knepley/MacSoftware/bin/ccache
> g++ -Qunused-arguments"
> --with-fc="/Users/knepley/MacSoftware/bin/ccache gfortran"
> --with-shared-libraries
>
> [1]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> [0]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) -
> process 1
>
> ------------------------------------------------------------------------
>
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
>
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
>
> [0]PETSC ERROR: likely location of problem given in stack below
>
> [0]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
>
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
>
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
>
> [0]PETSC ERROR: is given.
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
>
> [0]PETSC ERROR: Signal received
>
> [0]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>
> [0]PETSC ERROR: Petsc Development GIT revision:
> v3.9.3-1021-g8625415  GIT Date: 2018-08-02 12:57:14 -0500
>
> [0]PETSC ERROR: Unknown Name on a arch-master-debug named
> MATTHEW-KNEPLEYs-MacBook-Air-2.local by knepley Tue Dec 4 14:00:46 2018
>
> [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
> --download-chaco
> --download-cmake=/Users/knepley/Downloads/cmake-3.7.2.tar.gz
> --download-ctetgen --download-eigen --download-fftw --download-hdf5
> --download-med --download-metis --download-mpich --download-netcdf
> --download-p4est --download-parmetis --download-pnetcdf
> --download-superlu_dist --download-triangle
> --with-cc="/Users/knepley/MacSoftware/bin/ccache gcc
> -Qunused-arguments" --with-cxx="/Users/knepley/MacSoftware/bin/ccache
> g++ -Qunused-arguments"
> --with-fc="/Users/knepley/MacSoftware/bin/ccache gfortran"
> --with-shared-libraries
>
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
> which valgrind says comes from DMPlexCreateSection()
>
> rank0 local nodes with ghost 52 local cells with ghost 78
>
> rank1 local nodes with ghost 47 local cells with ghost 75
>
> ==14766== Invalid write of size 4
>
> ==14766==    at 0x1151A3B: dmplexcreatesection_ (zplexsectionf90.c:20)
>
> ==14766==    by 0x1000042FA: MAIN__ (in ./natural)
>
> ==14766==    by 0x10000497D: main (in ./natural)
>
> ==14766==  Address 0x600001000 is not stack'd, malloc'd or (recently) free'd
>
> ==14766==
>
> ==14767== Invalid write of size 4
>
> ==14767==    at 0x1151A3B: dmplexcreatesection_ (zplexsectionf90.c:20)
>
> ==14767==    by 0x1000042FA: MAIN__ (in ./natural)
>
> ==14767==    by 0x10000497D: main (in ./natural)
>
> ==14767==  Address 0x600001000 is not stack'd, malloc'd or (recently) free'd
>
> ==14767==
>
> I cannot run the debugger on it because command line arguments are not
> working with
> my Fortran compiler (Ugh). Do you see this error?
>
> Thanks,
>
> Matt
>
> Thanks,
>
> Danyang
>
> On 2018-12-03 1:12 p.m., Danyang Su wrote:
>>
>>
>> On 2018-12-03 12:56 p.m., Matthew Knepley wrote:
>>> On Mon, Dec 3, 2018 at 3:40 PM Danyang Su <danyang.su at gmail.com> wrote:
>>>
>>>
>>> On 2018-12-03 12:03 p.m., Matthew Knepley wrote:
>>>> On Mon, Dec 3, 2018 at 2:27 PM Danyang Su <danyang.su at gmail.com> wrote:
>>>>
>>>> Hi Matt,
>>>>
>>>> Thanks.
>>>>
>>>> BTW: DMPlexGetVertexNumbering now works with the
>>>> latest develop version, but the index is not in natural
>>>> ordering when DMSetUseNatural is called. That's why I
>>>> want to use PetscSFDistributeSection to check whether I
>>>> missed anything in the code.
>>>>
>>>> Can you explain that a little more? Maybe you can just push
>>>> forward what you want using the migrationSF.
>>>
>>> Hi Matt,
>>>
>>> Since I cannot figure out what is wrong or missing in my
>>> code, I followed the old ex26.c example in
>>> src/dm/impls/plex/examples/tests to create the similar code
>>> shown below to test global to natural ordering. The code may
>>> be ugly, with unnecessary functions in it. Using
>>> DMPlexGetVertexNumbering I can get the values, but they are
>>> not in natural order; instead, they are still in the default
>>> PETSc order, as if DMSetUseNatural(dm,PETSC_TRUE,ierr) had
>>> not been called.
>>>
>>> I do not understand what you are doing below. You just need to call
>>>
>>> ierr = DMSetUseNatural(dm,PETSC_TRUE);CHKERRQ(ierr);
>>> ierr = DMPlexDistribute(dm,0,&migrationSF,&pdm);CHKERRQ(ierr);
>>> if (pdm) {
>>>   ierr = DMPlexSetMigrationSF(pdm,migrationSF);CHKERRQ(ierr);
>>> }
>>> and the DMGlobalToNaturalBegin/End() should work.
>>
>> You mean to use DMPlexGlobalToNaturalBegin/End(), right? That's
>> what I tried at first, but without success.
>>
>> I will create a test example to check further whether I can
>> reproduce the problem.
>>
>> Thanks,
>>
>> Danyang
>>
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>> if (rank == 0) then
>>>   call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,num_cells, &
>>>        num_nodes,num_nodes_per_cell, &
>>>        Petsc_False,dmplex_cells,ndim,dmplex_verts,dm,ierr)
>>>   CHKERRQ(ierr)
>>> else
>>>   call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0, &
>>>        0,num_nodes_per_cell, &
>>>        Petsc_False,dmplex_cells,ndim,dmplex_verts,dm,ierr)
>>>   CHKERRQ(ierr)
>>> end if
>>>
>>> if (nprocs > 1) then
>>>   call DMSetUseNatural(dm,PETSC_TRUE,ierr)
>>>   CHKERRQ(ierr)
>>> end if
>>>
>>> call DMPlexDistribute(dm,stencil_width, &
>>>      migrationsf,distributedMesh,ierr)
>>> CHKERRQ(ierr)
>>>
>>> if (distributedMesh /= PETSC_NULL_DM) then
>>>   call PetscSFCreateInverseSF(migrationsf,migrationsf_inv,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call DMCreateGlobalToNatural(distributedMesh,migrationsf, &
>>>        migrationsf_inv,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call DMGetSection(distributedMesh,section,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call PetscSectionCreate(Petsc_Comm_World,section_seq,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call PetscSFDistributeSection(migrationsf_inv,section, &
>>>        PETSC_NULL_INTEGER,section_seq,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call DMPlexCreateGlobalToNaturalSF(distributedMesh, &
>>>        section_seq,migrationsf,sf_natural,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call DMSetUseNatural(distributedMesh,PETSC_TRUE,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call PetscSFDestroy(migrationsf,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   call PetscSFDestroy(migrationsf_inv,ierr)
>>>   CHKERRQ(ierr)
>>>
>>> end if
>>>
>>> Thanks,
>>>
>>> Danyang
>>>
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>> Regards,
>>>>
>>>> Danyang
>>>>
>>>> On 2018-12-03 5:22 a.m., Matthew Knepley wrote:
>>>>> I need to write a custom Fortran stub for this one. I
>>>>> will get it done as soon as possible.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>> On Sat, Dec 1, 2018 at 7:16 PM Danyang Su via petsc-users
>>>>> <petsc-users at mcs.anl.gov> wrote:
>>>>>
>>>>> Hi All,
>>>>>
>>>>> I got a simple compilation error when using
>>>>> PetscSFDistributeSection in Fortran. It looks like the
>>>>> required header files are included and the parameters are
>>>>> correctly defined. However, when compiling the code, I got
>>>>> the error: undefined reference to `petscsfdistributesection_'.
>>>>> The code is shown below. Did I miss anything here?
>>>>>
>>>>> #include <petsc/finclude/petscsys.h>
>>>>> #include <petsc/finclude/petscvec.h>
>>>>> #include <petsc/finclude/petscdm.h>
>>>>> #include <petsc/finclude/petscdmplex.h>
>>>>> use petscsys
>>>>> use petscvec
>>>>> use petscdm
>>>>> use petscdmplex
>>>>>
>>>>> implicit none
>>>>>
>>>>> PetscSection :: section, section_seq
>>>>> PetscSF :: migrationsf_inv, sf_natural
>>>>> Vec :: vec_global, vec_natural
>>>>> PetscErrorCode :: ierr
>>>>>
>>>>> ...
>>>>>
>>>>> call PetscSFDistributeSection(migrationsf_inv,section, &
>>>>>      PETSC_NULL_INTEGER,section_seq,ierr)
>>>>> CHKERRQ(ierr)
>>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Danyang
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they
>>>>> begin their experiments is infinitely more interesting
>>>>> than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin
>>>> their experiments is infinitely more interesting than any
>>>> results to which their experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to
>>> which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
#PETSc variables for the development version, V3.6.0 and later
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

FFLAGS =

OBJS = ./mesh_data.o \
       ./natural.o

natural: $(OBJS) chkopts
	-${FLINKER} $(FFLAGS) -o natural $(OBJS) ${PETSC_LIB}
	${RM} $(SRC)*.o $(SRC)*.mod
-------------- next part --------------
# vtk DataFile Version 2.0
mesh-2d, Created by Gmsh
ASCII
DATASET UNSTRUCTURED_GRID
POINTS 69 double
0 0 0
3 0 0
3 0 2
0 0 2
0.2 0 0.8
0.4 0 0.8
0.8 0 0.8
1.4 0 0.8
2.2 0 0.8
0.8 0 0.6
0.8 0 1
0.8 0 1.2
0.499999999997822 0 0
0.9999999999960245 0 0
1.499999999994285 0 0
1.999999999996158 0 0
2.499999999998079 0 0
0 0 1.500000000000693
0 0 1.000000000004118
0 0 0.5000000000020808
2.500000000003167 0 2
2.000000000001941 0 2
1.500000000004168 0 2
1.00000000000281 0 2
0.5000000000014051 0 2
3 0 0.4999999999988241
3 0 0.999999999997388
3 0 1.499999999998683
1.799999999998915 0 0.8
1.099999999999501 0 0.8
0.3666666666682681 0 1.233333333331731
1.271751539294908 0 1.355799542686996
2.00985325291648 0 1.455472832778138
0.4741111111103676 0 0.3178750000004925
1.034296721683984 0 1.092713954025319
0.6117161716167377 0 1.58564356435608
0.990728609969793 0 1.559942023676578
1.182424242422426 0 0.2962202380952292
2.477293926290137 0 1.244553762494937
0.493402650403436 0 1.055459773692982
0.5909632735517151 0 0.6476526035870375
1.580443401994861 0 0.4189257391804766
1.612090632780561 0 1.126835640176751
2.59236745238114 0 0.4103840906174537
0.7941638542849001 0 0.3147114254793752
0.2731748474417706 0 1.020420503636702
1.966057517364436 0 0.4673319812431365
0.631702026375183 0 0.9486426017917475
0.3558583280051764 0 0.545353879769248
1.614531084604252 0 1.608669168454826
2.604360529456519 0 0.8445703766706929
0.622744504707201 0 1.257570097567692
2.329925884063499 0 1.676684671728572
0.1816715026262627 0 1.240054510318584
1.27739349359455 0 1.715198584350745
0.6030193133904318 0 0.4851185817672306
1.342971052691814 0 1.056932407471012
0.3067601705739227 0 1.536378567870203
0.8018251367042398 0 1.384587091501278
2.019847562397219 0 1.085372447089965
0.9839363972274676 0 1.319779771906041
0.9335741804208713 0 0.9231784885063297
0.981731303510901 0 0.6560746918499005
0.2238250266085547 0 0.2618395980849577
0.6651074814724786 0 1.087753235265755
2.661443962071361 0 1.684247686844439
0.9932946352142333 0 0.4897210062248542
2.249844184832246 0 0.3177341670352057
1.208153775852939 0 0.581598675699766
CELLS 157 575
1 0
1 1
1 2
1 3
1 4
1 5
1 6
1 7
1 8
1 9
1 10
1 11
2 0 12
2 12 13
2 13 14
2 14 15
2 15 16
2 16 1
2 3 17
2 17 18
2 18 19
2 19 0
2 2 20
2 20 21
2 21 22
2 22 23
2 23 24
2 24 3
2 1 25
2 25 26
2 26 27
2 27 2
2 7 28
2 28 8
2 6 29
2 29 7
2 5 6
2 4 5
2 6 9
2 10 6
2 11 10
3 5 48 40
3 35 30 51
3 23 35 36
3 10 34 11
3 40 48 55
3 1 25 43
3 1 43 16
3 10 64 47
3 8 38 59
3 31 42 49
3 21 49 32
3 23 24 35
3 6 10 47
3 4 45 18
3 21 32 52
3 6 29 61
3 6 62 29
3 4 19 48
3 31 60 34
3 7 41 28
3 32 49 42
3 46 67 8
3 32 59 38
3 15 46 41
3 3 57 24
3 39 47 64
3 31 34 56
3 67 43 8
3 5 47 39
3 26 27 38
3 6 40 9
3 14 15 41
3 0 63 19
3 5 39 45
3 8 50 38
3 10 61 34
3 14 41 37
3 7 68 41
3 7 28 42
3 9 55 44
3 35 51 58
3 13 14 37
3 9 44 66
3 4 5 45
3 57 30 35
3 24 57 35
3 32 38 52
3 3 17 57
3 29 56 34
3 25 26 50
3 25 50 43
3 7 56 29
3 35 58 36
3 37 66 44
3 4 48 5
3 26 38 50
3 32 42 59
3 31 36 60
3 27 65 38
3 37 41 68
3 11 34 60
3 33 55 48
3 28 41 46
3 8 28 46
3 20 21 52
3 30 57 53
3 8 43 50
3 21 22 49
3 11 60 58
3 7 42 56
3 62 66 68
3 12 13 44
3 23 36 54
3 9 40 55
3 30 45 39
3 29 34 61
3 19 63 48
3 13 37 44
3 0 12 63
3 28 59 42
3 31 49 54
3 31 56 42
3 18 45 53
3 29 62 68
3 31 54 36
3 12 44 33
3 22 23 54
3 12 33 63
3 2 65 27
3 2 20 65
3 33 48 63
3 30 39 51
3 33 44 55
3 15 16 67
3 38 65 52
3 8 59 28
3 11 58 51
3 6 9 62
3 6 61 10
3 10 11 64
3 16 43 67
3 17 53 57
3 36 58 60
3 15 67 46
3 7 29 68
3 11 51 64
3 22 54 49
3 9 66 62
3 39 64 51
3 20 52 65
3 30 53 45
3 37 68 66
3 19 4 18
3 5 40 6
3 5 6 47
3 17 18 53
CELL_TYPES 157
1
1
1
1
1
1
1
1
1
1
1
1
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
5
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mesh_data.F90
Type: text/x-fortran
Size: 6085 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20190228/ac758c00/attachment-0002.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: natural.F90
Type: text/x-fortran
Size: 12372 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20190228/ac758c00/attachment-0003.bin>