[petsc-dev] Julia Petsc Wrapper

Barry Smith bsmith at mcs.anl.gov
Sun Jul 19 20:01:15 CDT 2015


> On Jul 19, 2015, at 2:25 PM, Jared Crean <jcrean01 at gmail.com> wrote:
> 
>    Hello,
>        Yeah, Clang.jl was supposed to be the answer to that, but it doesn't handle preprocessor macros and a few other things quite right.
> 
>        Anyways, type detection is working, but when running the tests with PetscScalar as a double-precision complex number, Petsc gives the 1-norm of the vector [1.0 + 0im; 2.0 + 1.0im; 3.0 + 2.0im] as 9.0, whereas Julia calculates it as 6.841619252963779. It looks like Petsc isn't taking the modulus of each element of the vector before summing.  Is this a bug, or is there a particular reason for it?

   Hmm, we use the BLAS routine

   261    } else if (type == NORM_1) {
   262      ierr = VecGetArrayRead(xin,&xx);CHKERRQ(ierr);
-> 263      PetscStackCallBLAS("BLASasum",*z   = BLASasum_(&bn,xx,&one));
   264      ierr = VecRestoreArrayRead(xin,&xx);CHKERRQ(ierr);
   265      ierr = PetscLogFlops(PetscMax(n-1.0,0.0));CHKERRQ(ierr);


It turns out that for complex numbers, BLAS uses the absolute value of the real part plus the absolute value of the imaginary part:

http://www.netlib.org/lapack/explore-html/dc/d8d/scabs1_8f_source.html

It seems everyone uses the conventional definition, so we should change PETSc not to use the z/casum() BLAS routine for the 1-norm.
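
For the vector above that gives |1| + |0| + |2| + |1| + |3| + |2| = 9, while the conventional 1-norm is |1| + |2+i| + |3+2i| = 1 + sqrt(5) + sqrt(13) ≈ 6.8416, which matches the Julia result. A minimal C sketch of the conventional definition (just the idea, not the eventual PETSc fix):

   #include <complex.h>

   /* conventional 1-norm: sum the modulus of each entry, not |Re| + |Im| */
   double norm1_conventional(const double complex *x, int n)
   {
     double sum = 0.0;
     for (int i = 0; i < n; i++) sum += cabs(x[i]);
     return sum;
   }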

   Thanks for pointing this out.

  Barry


> 
>    Jared Crean
> 
> On 07/14/2015 10:53 PM, Barry Smith wrote:
>>   The Julia folks should give up their delusion that a C package is characterized completely by its .so. It is not; it is defined by its .so and its .h. A .so without its .h is fairly useless.
>> 
>>   Barry
>> 
>> 
>> 
>>> On Jul 14, 2015, at 9:40 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>> 
>>> On Tue, Jul 14, 2015 at 9:35 PM, Jared Crean <jcrean01 at gmail.com> wrote:
>>>     Hello,
>>>         PETSC_SCALAR is not a symbol either.  I skimmed through the names in the shared library, and it doesn't look like any data type information is there.
>>> 
>>> Damn, yes PETSC_SCALAR and PETSC_REAL are defines. I think we are careful about this so that you can use
>>> a real and complex PETSc together by building multiple versions of the library (no symbol clash). However, I believe
>>> you can use
>>> 
>>>   PetscDataType stype, ctype;
>>>   PetscBool     found, isComplex;
>>>   PetscDataTypeFromString("scalar", &stype, &found);
>>>   PetscDataTypeFromString("complex", &ctype, &found);
>>>   isComplex = (PetscBool)(stype == ctype);
>>> 
>>>   Thanks,
>>> 
>>>     Matt
>>>  
>>>         Jared Crean
>>> 
>>> 
>>> On 7/14/2015 9:22 PM, Matthew Knepley wrote:
>>>> On Tue, Jul 14, 2015 at 8:03 PM, Jared Crean <jcrean01 at gmail.com> wrote:
>>>>     Hello,
>>>>         PETSC_USE_COMPLEX isn't a symbol in the shared library when Petsc is built with complex scalars, so I don't see a way to access it at runtime. I'll have to write a simple C program that uses sizeof() and writes the value to a file.
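>>>>
>>>> Something along these lines (a sketch; the file name is made up, and it has to be compiled against the same PETSc build the wrapper will load):
>>>>
>>>>   #include <petscsys.h>
>>>>   #include <stdio.h>
>>>>
>>>>   int main(void)
>>>>   {
>>>>     FILE *f = fopen("petsc_scalar_info.txt", "w"); /* hypothetical file name */
>>>>     if (!f) return 1;
>>>>   #if defined(PETSC_USE_COMPLEX)
>>>>     fprintf(f, "complex %d\n", (int)sizeof(PetscScalar));
>>>>   #else
>>>>     fprintf(f, "real %d\n", (int)sizeof(PetscScalar));
>>>>   #endif
>>>>     fclose(f);
>>>>     return 0;
>>>>   }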
>>>> 
>>>> That is crazy. How about
>>>> 
>>>>   isComplex = PETSC_COMPLEX == PETSC_SCALAR
>>>> 
>>>>    Matt
>>>>           As for the MPI communicator, the Julia MPI package uses a C int to store it, so I will typealias to that to ensure consistency.  If an MPI implementation uses an 8-byte pointer, MPI.jl will have to change too.
>>>> 
>>>>     Jared Crean
>>>> 
>>>> 
>>>> On 7/14/2015 1:04 PM, Matthew Knepley wrote:
>>>>> On Tue, Jul 14, 2015 at 10:56 AM, Jared Crean <jcrean01 at gmail.com> wrote:
>>>>>     Hello everyone,
>>>>>         I got the package in a reasonably working state and Travis testing setup, so I am putting the package up on Github.
>>>>> 
>>>>>         https://github.com/JaredCrean2/PETSc.jl
>>>>> 
>>>>>         There is still a lot more work to do, but its a start.
>>>>> 
>>>>>         A couple questions:
>>>>>         When looking through the code, I noticed the MPI communicator is being passed as a 64-bit integer.  mpi.h typedefs it as an int, so shouldn't it be a 32-bit integer?
>>>>> 
>>>>> Some MPI implementations store the communicator as a pointer, which may be 64 bits. I think the only thing the standard says is
>>>>> that MPI_Comm should be defined.
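>>>>>
>>>>> A quick way to check what a given implementation does (a sketch):
>>>>>
>>>>>   #include <mpi.h>
>>>>>   #include <stdio.h>
>>>>>
>>>>>   int main(int argc, char **argv)
>>>>>   {
>>>>>     MPI_Init(&argc, &argv);
>>>>>     /* MPICH defines MPI_Comm as an int handle; Open MPI as a pointer */
>>>>>     printf("sizeof(MPI_Comm) = %d\n", (int)sizeof(MPI_Comm));
>>>>>     MPI_Finalize();
>>>>>     return 0;
>>>>>   }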
>>>>>           Also, is there a way to find out at runtime what datatype a PetscScalar is?  It appears PetscDataTypeGetSize does not accept PetscScalar as an argument.
>>>>> 
>>>>> If PETSC_USE_COMPLEX is defined it's PETSC_COMPLEX, otherwise it's PETSC_REAL. You can also just use sizeof(PetscScalar). What do you
>>>>> want to do?
>>>>> 
>>>>>   Thanks,
>>>>> 
>>>>>      Matt
>>>>>  
>>>>>     Jared Crean
>>>>> 
>>>>> 
>>>>> 
>>>>> On 07/06/2015 09:02 AM, Matthew Knepley wrote:
>>>>>> On Mon, Jul 6, 2015 at 4:59 AM, Patrick Sanan <patrick.sanan at gmail.com> wrote:
>>>>>> I had a couple of brief discussions about this at Juliacon as well. I think it would be useful, but there are a couple of things to think about from the start of any new attempt to do this:
>>>>>> 1. As Jack pointed out, one issue is that the PETSc library must be compiled for a particular precision. This raises some questions - should several versions of the library be built to allow for flexibility?
>>>>>> 2. An issue with wrapping PETSc is always that the flexibility of using the PETSc options paradigm is reduced - how can this be addressed? Could/should an expert user be able to access the options database directly, or would this be too much violence to the wrapper abstraction?
>>>>>> 
>>>>>> I have never understood why this is an issue. Can't you just wrap our interface level, and use the options just as we do? That
>>>>>> is essentially what petsc4py does. What is limiting in this methodology? On the other hand, requiring specific types, a la FEniCS,
>>>>>> is very limiting.
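>>>>>>
>>>>>> For instance, a wrapper that just forwards argv to PetscInitialize() keeps the whole options paradigm intact (a sketch; error checking omitted):
>>>>>>
>>>>>>   #include <petscksp.h>
>>>>>>
>>>>>>   int main(int argc, char **argv)
>>>>>>   {
>>>>>>     KSP ksp;
>>>>>>     PetscInitialize(&argc, &argv, NULL, NULL);
>>>>>>     KSPCreate(PETSC_COMM_WORLD, &ksp);
>>>>>>     KSPSetFromOptions(ksp); /* picks up -ksp_type, -ksp_rtol, ... from the command line */
>>>>>>     KSPDestroy(&ksp);
>>>>>>     PetscFinalize();
>>>>>>     return 0;
>>>>>>   }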
>>>>>> 
>>>>>>    Matt
>>>>>>  On Sat, Jul 4, 2015 at 11:00 PM, Jared Crean <jcrean01 at gmail.com> wrote:
>>>>>> Hello,
>>>>>>      I am a graduate student working on a CFD code written in Julia, and I am interested in using Petsc as a linear solver (and possibly for the non-linear solves as well) for the code.  I discovered the Julia wrapper file Petsc.jl in Petsc and have updated it to work with the current version of Julia and the MPI.jl package, using only MPI for communication (I don't think Julia's internal parallelism will scale well enough, at least not in the near future).
>>>>>> 
>>>>>>      I read the discussion on Github [https://github.com/JuliaLang/julia/issues/2645], and it looks like
>>>>>> there currently is not a complete package to access Petsc from Julia.  With your permission, I would like to use the Petsc.jl file as the basis for developing a package.  My plan is to create a lower-level interface that exactly wraps Petsc functions, and then construct a higher-level interface, probably an object that is a subtype of Julia's AbstractArray, that allows users to store values into Petsc vectors and matrices.  I am less interested in integrating tightly with Julia's existing linear algebra capabilities than in ensuring good scalability.  The purpose of the high-level interface is simply to make it easy to populate the vector or matrix.
>>>>>> 
>>>>>>      What do you think, both about using the Petsc.jl file and the overall approach?
>>>>>> 
>>>>>>      Jared Crean
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> -- 
>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>> -- Norbert Wiener
>>>>> 
>>>>> 
>>>>> 
>>>>> -- 
>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>> 
>>>> 
>>>> 
>>>> -- 
>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>> -- Norbert Wiener
>>> 
>>> 
>>> 
>>> -- 
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
> 



