[petsc-users] MatZeroRowsColumns

Samuel Lanthaler s.lanthaler at gmail.com
Fri Nov 24 08:22:02 CST 2017


Perfect, PETSC_NULL_VEC works in fortran. Thank you very much for your 
help! :-)

Cheers,
Sam

On 11/24/2017 03:18 PM, Matthew Knepley wrote:
> On Fri, Nov 24, 2017 at 6:42 AM, Samuel Lanthaler
> <s.lanthaler at gmail.com> wrote:
>
>     Ah, great. I didn't understand that "optional" arguments are not
>     optional in the fortran sense of the word. After setting up two
>     vectors to pass to the routine, the program works. Thank you very
>     much!
>
>     One additional question: In the documentation, it is written that
>     I can pass PETSC_NULL for these two vectors if I don't actually
>     need them. That probably only works in C, but not in fortran. Is
>     there a corresponding argument I can pass to the routine in
>     fortran? It seems that PETSC_NULL_REAL doesn't work; is there
>     some other PETSC_NULL_XXX that should be passed in this case from
>     fortran? If not, then I'll just stick to creating vectors and
>     destroying them afterwards, which is fine for me as well.
>
>
> You can use PETSC_NULL_VEC
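>
> For instance (a sketch, reusing matA, idone, and val from the program
> quoted below; the two trailing Vec arguments hold the solution and
> right-hand side, and may both be null when unneeded):
>
>       CALL MatZeroRowsColumns(matA,1,idone,val, &
>            PETSC_NULL_VEC,PETSC_NULL_VEC,ierr)
>       CHKERRQ(ierr)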
>
>   Thanks,
>
>      Matt
>
>     Cheers,
>     Sam
>
>
>     On 11/24/2017 01:56 AM, Smith, Barry F. wrote:
>
>         MatZeroRowsColumns() has two vector arguments that you are missing.
>
>
>             On Nov 23, 2017, at 2:43 PM, Samuel Lanthaler
>             <s.lanthaler at gmail.com> wrote:
>
>             Hi there,
>             I'm new to PETSc and have been trying to do some basic
>             manipulation of matrices in fortran, but don't seem to be
>             able to set a row/column to zero using the
>             MatZeroRowsColumns command: The following is a small
>             example program:
>
>                ! initialize PETSc
>                CALL PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>
>                ! set up a new matrix
>                m = 3
>                CALL MatCreate(PETSC_COMM_WORLD,matA,ierr); CHKERRQ(ierr);
>                CALL MatSetType(matA,MATMPIAIJ,ierr); CHKERRQ(ierr);
>                CALL MatSetSizes(matA,PETSC_DECIDE,PETSC_DECIDE,m,m,ierr); CHKERRQ(ierr);
>                CALL MatMPIAIJSetPreallocation(matA,3,PETSC_NULL_INTEGER,3,PETSC_NULL_INTEGER,ierr); CHKERRQ(ierr);
>
>                ! set values of matrix
>                vals(1,:) = (/1.,2.,3./)
>                vals(2,:) = (/4.,5.,6./)
>                vals(3,:) = (/7.,8.,9./)
>                !
>                idxm = (/0,1,2/)
>                idxn = (/0,1,2/)
>                !
>                CALL MatSetValues(matA,3,idxm,3,idxn,vals,INSERT_VALUES,ierr); CHKERRQ(ierr);
>
>                ! assemble matrix
>                CALL MatAssemblyBegin(matA,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr);
>                CALL MatAssemblyEnd(matA,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr);
>
>                ! set one row/column to zero, put 6.0d0 on diagonal
>                idone(1) = 2
>                val = 6.0d0
>                CALL MatZeroRowsColumns(matA,1,idone,val,ierr); CHKERRQ(ierr);
>
>                ! finalize PETSc
>                CALL PetscFinalize(PETSC_NULL_CHARACTER,ierr)
>
>             When running the program, I get the following error message:
>
>             [0]PETSC ERROR:
>             ------------------------------------------------------------------------
>             [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>             Violation, probably memory access out of range
>             [0]PETSC ERROR: Try option -start_in_debugger or
>             -on_error_attach_debugger
>             [0]PETSC ERROR: or see
>             http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>             [0]PETSC ERROR: or try http://valgrind.org on GNU/linux
>             and Apple Mac OS X to find memory corruption errors
>             [0]PETSC ERROR: likely location of problem given in stack
>             below
>             [0]PETSC ERROR: ---------------------  Stack Frames
>             ------------------------------------
>             [0]PETSC ERROR: Note: The EXACT line numbers in the stack
>             are not available,
>             [0]PETSC ERROR:       INSTEAD the line number of the start
>             of the function
>             [0]PETSC ERROR:       is given.
>             [0]PETSC ERROR: --------------------- Error Message
>             --------------------------------------------------------------
>             [0]PETSC ERROR: Signal received
>             [0]PETSC ERROR: See
>             http://www.mcs.anl.gov/petsc/documentation/faq.html for
>             trouble shooting.
>             [0]PETSC ERROR: Petsc Release Version 3.8.2, Nov, 09, 2017
>             [0]PETSC ERROR: ./test on a arch-complex-debug named
>             sam-ThinkPad-T450s by sam Thu Nov 23 23:28:15 2017
>             [0]PETSC ERROR: Configure options
>             PETSC_DIR=/home/sam/Progs/petsc-3.8.2
>             PETSC_ARCH=arch-complex-debug --with-scalar-type=complex
>             [0]PETSC ERROR: #1 User provided function() line 0 in
>             unknown file
>             --------------------------------------------------------------------------
>             MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>             with errorcode 59.
>             Would someone be so kind as to tell me what I'm doing
>             wrong? Thank you!
>
>             Best regards,
>             Sam
>
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
