[petsc-users] petsc-users Digest, Vol 27, Issue 47

Barry Smith bsmith at mcs.anl.gov
Wed Mar 23 20:47:52 CDT 2011


  I checked petsc-dev and it now prints a useful error message in this case, indicating the solution.

   Barry

On Mar 23, 2011, at 8:42 PM, Kontsantinos Kontzialis wrote:

> On 03/23/2011 02:48 PM, petsc-users-request at mcs.anl.gov wrote:
>> Send petsc-users mailing list submissions to
>> 	petsc-users at mcs.anl.gov
>> 
>> To subscribe or unsubscribe via the World Wide Web, visit
>> 	https://lists.mcs.anl.gov/mailman/listinfo/petsc-users
>> or, via email, send a message with subject or body 'help' to
>> 	petsc-users-request at mcs.anl.gov
>> 
>> You can reach the person managing the list at
>> 	petsc-users-owner at mcs.anl.gov
>> 
>> When replying, please edit your Subject line so it is more specific
>> than "Re: Contents of petsc-users digest..."
>> 
>> 
>> Today's Topics:
>> 
>>    1. Re:  matrix free and preconditioner (Barry Smith)
>>    2.   MatCreateMPIAIJWithSplitArrays (Alejandro Marcos Aragón)
>>    3. Re:  MatCreateMPIAIJWithSplitArrays (Jed Brown)
>>    4. Re:  VecView doesn't work properly with DA global	vectors
>>       (??????? ???????)
>>    5.  AMG for PETSc (Nun ion)
>> 
>> 
>> ----------------------------------------------------------------------
>> 
>> Message: 1
>> Date: Tue, 22 Mar 2011 22:29:50 -0500
>> From: Barry Smith<bsmith at mcs.anl.gov>
>> Subject: Re: [petsc-users] matrix free and preconditioner
>> To: PETSc users list<petsc-users at mcs.anl.gov>
>> Message-ID:<F61C52D2-56B1-4470-802F-F9D909D33B35 at mcs.anl.gov>
>> Content-Type: text/plain; charset=us-ascii
>> 
>> 
>>   Call MatMFFDSetFromOptions(sys.J) after the MatCreateSNESMF() call.
>> 
>>   Your jacobian_matrix() HAS to call MatAssemblyBegin/End() on the first matrix passed into it (that matrix will be sys.J), in addition to whatever it does for the sys.P matrix; see the sketch below.
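>> 
>>   A minimal sketch of the setup and callback for PETSc 3.1 (sys.J, sys.P,
>>   and jacobian_matrix are the names from the code quoted below; the fill
>>   of the preconditioning matrix is elided):
>> 
>>     ierr = MatCreateSNESMF(sys.snes, &sys.J); CHKERRQ(ierr);
>>     ierr = MatMFFDSetFromOptions(sys.J); CHKERRQ(ierr);
>>     ierr = SNESSetJacobian(sys.snes, sys.J, sys.P, jacobian_matrix, &sys); CHKERRQ(ierr);
>> 
>>     /* and the callback: */
>>     PetscErrorCode jacobian_matrix(SNES snes, Vec x, Mat *J, Mat *P,
>>                                    MatStructure *flag, void *ctx)
>>     {
>>       PetscErrorCode ierr;
>> 
>>       PetscFunctionBegin;
>>       /* ... fill *P (here sys.P) with the problem-specific matrix ... */
>>       ierr = MatAssemblyBegin(*P, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>>       ierr = MatAssemblyEnd(*P, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>>       /* the matrix-free operator *J (here sys.J) must be assembled too */
>>       ierr = MatAssemblyBegin(*J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>>       ierr = MatAssemblyEnd(*J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>>       *flag = SAME_NONZERO_PATTERN;
>>       PetscFunctionReturn(0);
>>     }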
>> 
>>    If this does not resolve the problem please let us know.
>> 
>>    Barry
>> 
>> 
>> On Mar 22, 2011, at 9:46 PM, Kontsantinos Kontzialis wrote:
>> 
>>> Dear all,
>>> 
>>> I'm using a matrix-free formulation of the Newton-Krylov method.
>>> My code runs when I use a constant matrix as a preconditioner.
>>> However, when I want to use another (problem-specific) matrix, I read in
>>> the manual that I have to use the -snes_mf_operator option and set the
>>> Jacobian with SNESSetJacobian(). I do that in my code as follows:
>>> 
>>> ierr = MatCreateSNESMF(sys.snes, &sys.J);
>>> CHKERRQ(ierr);
>>> ierr = SNESSetJacobian(sys.snes, sys.J, sys.P, jacobian_matrix, &sys);
>>> CHKERRQ(ierr);
>>> 
>>> I have preallocated space for matrix sys.P, but when I start running I
>>> get the following error:
>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [0]PETSC ERROR: Null argument, when expecting valid pointer!
>>> [0]PETSC ERROR: Null Object: Parameter # 1!
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 2010
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: ./hoac_forth on a linux-gnu named localhost by kontzialis Wed Mar 23 04:42:52 2011
>>> [0]PETSC ERROR: Libraries linked from /home/kontzialis/PETSC/petsc-3.1-p7/linux-gnu-c-debug/lib
>>> [0]PETSC ERROR: Configure run at Mon Feb 21 14:55:24 2011
>>> [0]PETSC ERROR: Configure options --with-debugging=1 --with-cc=mpicc --with-fc=mpif90 --with-shared=1 --with-shared-libraries --with-large-file-io=1 --with-precision=double --with-blacs=1 --download-blacs=yes --download-f-blas-lapack=yes --with-plapack=1 --download-plapack=yes --with-scalapack=1 --download-scalapack=yes --with-superlu=1 --download-superlu=yes --with-superlu_dist=1 --download-superlu_dist=yes --with-ml=1 --download-ml=yes --with-umfpack=1 --download-umfpack=yes --with-mpi=1 --download-mpich=1 --with-sundials=1 --download-sundials=1 --with-parmetis=1 --download-parmetis=1 --with-hypre=1 --download-hypre=1
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: PetscObjectGetComm() line 34 in src/sys/objects/gcomm.c
>>> [0]PETSC ERROR: VecNormBegin() line 495 in src/vec/vec/utils/comb.c
>>> [0]PETSC ERROR: MatMFFDCompute_WP() line 73 in src/mat/impls/mffd/wp.c
>>> [0]PETSC ERROR: MatMult_MFFD() line 315 in src/mat/impls/mffd/mffd.c
>>> [0]PETSC ERROR: MatMult() line 1899 in src/mat/interface/matrix.c
>>> [0]PETSC ERROR: PCApplyBAorAB() line 585 in src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: GMREScycle() line 161 in src/ksp/ksp/impls/gmres/gmres.c
>>> [0]PETSC ERROR: KSPSolve_GMRES() line 241 in src/ksp/ksp/impls/gmres/gmres.c
>>> [0]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: SNES_KSPSolve() line 2944 in src/snes/interface/snes.c
>>> [0]PETSC ERROR: SNESSolve_LS() line 191 in src/snes/impls/ls/ls.c
>>> [0]PETSC ERROR: SNESSolve() line 2255 in src/snes/interface/snes.c
>>> [0]PETSC ERROR: User provided function() line 33 in "unknowndirectory/"../crank_nickolson.c
>>> [0]PETSC ERROR: User provided function() line 84 in "unknowndirectory/"../implicit_time.c
>>> [0]PETSC ERROR: User provided function() line 1158 in "unknowndirectory/"../hoac.c
>>> 
>>> (Process [1] reports the identical error message and stack trace; its
>>> output was interleaved with the above.)
>>> 
>>> Please help.
>>> 
>>> Costas
>> 
>> 
>> ------------------------------
>> 
>> Message: 2
>> Date: Wed, 23 Mar 2011 10:59:11 +0100
>> From: Alejandro Marcos Aragón <alejandro.aragon at gmail.com>
>> Subject: [petsc-users]  MatCreateMPIAIJWithSplitArrays
>> To: petsc-users at mcs.anl.gov
>> Message-ID:<CD788A44-4862-4E97-9FC0-5CC4FBC937DB at gmail.com>
>> Content-Type: text/plain; charset=iso-8859-1
>> 
>> Hi PETSc users,
>> 
>> My code uses a custom sparse matrix format that stores the entries of the matrix in a hashed container. The interface can then create a matrix in compressed row (or compressed column) storage format; that is, it can give me the three arrays needed to represent the sparse matrix in these formats. I would like to use the PETSc parallel solver, so I thought it would be good to try the MatCreateMPIAIJWithSplitArrays() function so that I don't have to copy the values into the PETSc matrix again.
>> 
>> Now my question is that I really don't get the point of having separate diagonal and off-diagonal blocks of the sparse matrix. In the compressed row storage format, there is no distinction between these two blocks. Besides, I don't think there is a clear way to determine the boundary between them. Can someone show me how I should use this function, or point me to a better function that can take the three arrays that I have at this point?
>> 
>> Also, since the sparse matrix in each process is the result of a finite element assembly routine, some rows overlap among the processes (several finite element nodes are shared among the processes). At this point using MatSetValues() with the ADD_VALUES flag works fine, but I want to make sure that if I use MatCreateMPIAIJWithSplitArrays() (where I need to set the number of local rows) I can still get this behavior. In other words, if I sum the numbers of local rows over all processes, I get a total greater than the number of global rows because of this overlap.
>> 
>> Thank you all,
>> 
>> Alejandro M. Aragón, Ph.D.
>> 
>> ------------------------------
>> 
>> Message: 3
>> Date: Wed, 23 Mar 2011 12:34:36 +0100
>> From: Jed Brown<jed at 59A2.org>
>> Subject: Re: [petsc-users] MatCreateMPIAIJWithSplitArrays
>> To: PETSc users list<petsc-users at mcs.anl.gov>
>> Message-ID:
>> 	<AANLkTi=xxnbz5e9tsEkOf2ya9L=36rTyPT3aOQtSULu_ at mail.gmail.com>
>> Content-Type: text/plain; charset="utf-8"
>> 
>> 2011/3/23 Alejandro Marcos Aragón <alejandro.aragon at gmail.com>
>> 
>>> My code uses a custom sparse matrix format that stores the entries of the
>>> matrix in a hashed container. The interface can then create a matrix in
>>> compressed row (or compressed column) storage format; that is, it can give
>>> me the three arrays needed to represent the sparse matrix in these formats.
>>> I would like to use the PETSc parallel solver, so I thought it would be
>>> good to try the MatCreateMPIAIJWithSplitArrays() function so that I don't
>>> have to copy the values into the PETSc matrix again.
>>> 
>> Have your hashed format generate CSR rows and insert them one row at a time
>> using MatSetValues(). It will be very fast and avoids the copy that
>> MatCreateMPIAIJWithArrays() makes.
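>> 
>> A minimal sketch of that loop (rowptr, colidx, and vals are assumed names
>> for your three CSR arrays, with global column indices; A is the MPIAIJ
>> matrix):
>> 
>>   PetscInt row, rstart, rend;
>>   MatGetOwnershipRange(A, &rstart, &rend);
>>   for (row = rstart; row < rend; row++) {
>>     PetscInt i     = row - rstart;              /* local row in the CSR arrays */
>>     PetscInt ncols = rowptr[i+1] - rowptr[i];   /* entries in this row */
>>     MatSetValues(A, 1, &row, ncols, &colidx[rowptr[i]],
>>                  &vals[rowptr[i]], ADD_VALUES);
>>   }
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);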
>> 
>> 
>>> Now my question is that I really don't get the point of having separate
>>> diagonal and off-diagonal blocks of the sparse matrix.
>>> 
>> Storing the diagonal and off-diagonal parts separately makes MatMult more
>> tolerant of network latency and permits block Jacobi preconditioning without
>> needing to copy out the local blocks.
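>> 
>> (The same split appears when preallocating an MPIAIJ matrix; a sketch, with
>> d_nnz and o_nnz as per-local-row counts you compute yourself:
>> 
>>   MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>> 
>> where d_nnz[i] counts the entries of local row i whose columns fall in the
>> locally owned column range, and o_nnz[i] counts the rest.)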
>> 
>> 
>>> In the compressed row storage format, there is no distinction between
>>> these two blocks. Besides, I don't think there is a clear way to determine
>>> the boundary between them. Can someone show me how I should use this
>>> function, or point me to a better function that can take the three arrays
>>> that I have at this point?
>>> 
>>> Also, since the sparse matrix in each process is the result of a finite
>>> element assembly routine, some rows overlap among the processes (several
>>> finite element nodes are shared among the processes). At this point using
>>> MatSetValues() with the ADD_VALUES flag works fine, but I want to make
>>> sure that if I use MatCreateMPIAIJWithSplitArrays() (where I need to set
>>> the number of local rows) I can still get this behavior. In other words,
>>> if I sum the numbers of local rows over all processes, I get a total
>>> greater than the number of global rows because of this overlap.
>>> 
>> Just use MatSetValues(); it will take care of the parallel assembly.
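>> 
>> In particular, each process may add into rows it does not own; a sketch of
>> the idea (grow, gcols, vals, and ncols are hypothetical global indices,
>> values, and counts):
>> 
>>   MatSetValues(A, 1, &grow, ncols, gcols, vals, ADD_VALUES);
>>   /* off-process contributions are stashed locally, then communicated
>>      and summed during the assembly phase */
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);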
>> 
>> ------------------------------
>> 
>> Message: 4
>> Date: Wed, 23 Mar 2011 15:10:18 +0300
>> From: ??????? ???????<ram at ibrae.ac.ru>
>> Subject: Re: [petsc-users] VecView doesn't work properly with DA
>> 	global	vectors
>> To: Jed Brown<jed at 59a2.org>
>> Cc: PETSc users list<petsc-users at mcs.anl.gov>
>> Message-ID:
>> 	<AANLkTimVxeog77HK2DZc2ht48onqVGamJdAJg_pziq_J at mail.gmail.com>
>> Content-Type: text/plain; charset="koi8-r"
>> 
>> Thank you, Jed! I'll follow your advice. Now I'm having some problems with
>> using non-default viewers, but I think I can cope with them. Thanks.
>> 
>> 2011/3/22 Jed Brown<jed at 59a2.org>
>> 
>>> On Tue, Mar 22, 2011 at 10:41, ??????? ???????<ram at ibrae.ac.ru>  wrote:
>>> 
>>>> Thank you for your answer! Yes, now I can see from the output that VecView
>>>> uses the natural ordering, but it's also very important to understand where
>>>> (on which processor) elements are finally stored. If I understood it
>>>> correctly, VecView uses the natural ordering and so gives the WRONG
>>>> information about the actual distribution of memory among processors.
>>> 
>>> The natural ordering is different from the PETSc ordering. If you want to
>>> view the Vec in the order it is stored in, use
>>> 
>>> PetscViewerPushFormat(viewer,PETSC_VIEWER_NATIVE);
>>> VecView(X,viewer);
>>> PetscViewerPopFormat(viewer);
>>> 
>> 
>> ------------------------------
>> 
>> Message: 5
>> Date: Wed, 23 Mar 2011 07:48:45 -0500
>> From: Nun ion<m.skates82 at gmail.com>
>> Subject: [petsc-users] AMG for PETSc
>> To: petsc-users at mcs.anl.gov
>> Message-ID:
>> 	<AANLkTi=E7xN_9+EJiwX4eeSWEDOa8rPPvC4m0V9dPg=r at mail.gmail.com>
>> Content-Type: text/plain; charset="iso-8859-1"
>> 
>> Hello all,
>> 
>> I am interested in using AMG with PETSc, where my matrices are complex. I
>> looked at the documentation for Hypre and I'm not sure that it supports
>> complex arithmetic. I also looked at ML from Trilinos; however, the only
>> workaround would be to recast my problem as a real system (doubling the
>> matvecs). Any suggestions?
>> 
>> Thanks
>> 
>> Mark
>> 
>> ------------------------------
>> 
>> _______________________________________________
>> petsc-users mailing list
>> petsc-users at mcs.anl.gov
>> https://lists.mcs.anl.gov/mailman/listinfo/petsc-users
>> 
>> 
>> End of petsc-users Digest, Vol 27, Issue 47
>> *******************************************
>> 
> Barry,
> 
> It works now. Thank you!
> 
> Costas


