[petsc-users] petsc-users Digest, Vol 33, Issue 13
amrit poudel
amrit_pou at hotmail.com
Wed Sep 7 12:05:16 CDT 2011
Hello PETSc users,
I am trying to compile PETSc with MUMPS, but I am not sure how to build the MUMPS libraries on Mac OS X 10.6.8. I know that I also need the BLACS and ScaLAPACK libraries, but I am having trouble understanding how to compile them on this platform, and I saw that some users here have managed it. Would any of you who have successfully compiled these libraries on Mac OS X 10.6.8 mind sharing your makefiles? I am sorry if it is inappropriate to ask for a makefile; I simply do not know where to start.
NOTE: I already have PETSc compiled, installed, and running fine (on an Intel Core 2 Duo Mac with OS X 10.6.8), but I need an efficient parallel direct solver, one that can run on a multiprocessor machine or on a cluster with distributed or shared memory; surprisingly, PETSc does not ship with a default direct solver package the way it does with iterative solvers. I was also wondering whether UMFPACK can be used to solve an extremely large sparse system of linear equations on such a machine. Installing SuperLU_Dist or SuperLU_MT seems more confounding than the other packages, so I am trying to stay away from them.
Thanks for any help.
-Amrit
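A minimal sketch of the usual route on OS X, instead of hand-written makefiles: let PETSc's configure download and build BLACS, ScaLAPACK, and MUMPS itself. The compiler names below are assumptions for this setup; MUMPS needs a real Fortran compiler (e.g. gfortran), and Apple's system BLAS/LAPACK are picked up automatically, so no --download flag is needed for them:

    ./configure --with-cc=gcc --with-fc=gfortran --download-mpich=1 \
        --download-parmetis=1 --download-blacs=1 \
        --download-scalapack=1 --download-mumps=1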
Date: Wed, 7 Sep 2011 19:41:16 +0300
From: ckontzialis at lycos.com
To: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] petsc-users Digest, Vol 33, Issue 13
On 09/07/2011 04:53 PM, petsc-users-request at mcs.anl.gov wrote:
Today's Topics:
1. Re: compilation with mumps on Mac OS 10.6.8 (Kyunghoon Lee)
2. Build Petsc with single precision (Sravya Tirukkovalur)
3. Re: Build Petsc with single precision (Satish Balay)
4. Re: Still no luck (Satish Balay)
5. Coloring of a parallel matrix (Kostas Kontzialis)
6. Re: Coloring of a parallel matrix (Jed Brown)
7. Re: Coloring of a parallel matrix (Barry Smith)
8. petsc with slepc (Micheal Lysaght)
----------------------------------------------------------------------
Message: 1
Date: Wed, 7 Sep 2011 12:04:38 +0800
From: Kyunghoon Lee <aeronova.mailing at gmail.com>
Subject: Re: [petsc-users] compilation with mumps on Mac OS 10.6.8
To: PETSc users list <petsc-users at mcs.anl.gov>
Message-ID:
<CA+ZVpt9npcK+xJg1-Sbs__BvweVV_9tEp48ooc_BYza+m5G5gg at mail.gmail.com>
Content-Type: text/plain; charset="windows-1252"
Thanks for the help. I have now finished the installation successfully. :)
K. Lee.
On Wed, Sep 7, 2011 at 11:49 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
On Sep 6, 2011, at 10:41 PM, Kyunghoon Lee wrote:
Hi Barry,
Thanks for the reply. I did install g95 through MacPorts (I could not
find gfortran). I wonder whether g95 can do the job for me.
g95 might work, give it a try. BTW: on Macs you don't need to provide
blas/lapack with --download; Apple provides them on the system.
Re SuperLU_Dist, all I know is that I need MUMPS to deal with complex numbers.
SuperLU_Dist can also be used with complex numbers (at least with
PETSc-dev/petsc-3.2)
K. Lee.
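For what it's worth, a minimal sketch of a complex-scalar build that pulls in SuperLU_Dist; --with-scalar-type=complex is the flag that selects complex numbers, and the ParMetis download is assumed here because SuperLU_Dist uses it:

    ./configure --with-scalar-type=complex \
        --download-parmetis=1 --download-superlu_dist=1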
On Wed, Sep 7, 2011 at 11:21 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
p.s.
I do not need to compile petsc for FORTRAN.
Sadly, you do, because the packages MUMPS uses are all Fortran.
So you will need a real Fortran compiler (perhaps gfortran), you
will also need --download-blacs, and you cannot use --download-c-blas-lapack
with scalapack.
Note you can use SuperLU_Dist instead of MUMPS without a Fortran
compiler.
Barry
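Putting that advice together, a sketch of how the configure line quoted below might be corrected (gfortran is an assumption; any real Fortran compiler should do, and Apple's system BLAS/LAPACK are used when no --download-*-blas-lapack flag is given):

    ./configure \
        --prefix=/Users/aeronova/Development/local/lib64/petsc/petsc-3.1-p8 \
        --with-fc=gfortran --download-parmetis=1 --download-blacs=1 \
        --download-scalapack=1 --download-mumps=1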
On Sep 6, 2011, at 10:17 PM, Kyunghoon Lee wrote:
Hello all,
I need to compile petsc with mumps. First I tried
    ./configure \
        --prefix=/Users/aeronova/Development/local/lib64/petsc/petsc-3.1-p8 \
        --download-c-blas-lapack=1 --download-parmetis=1 \
        --download-scalapack=1 --download-mumps=1
but I got the following message:
Fortran error! mpif.h could not be located at: []
So I included "--download-mpich=1", and then I got the following:
Should request f-blas-lapack, not --download-c-blas-lapack=yes since
you have a fortran compiler?
After that, I tried several options with/without
--download-f-blas-lapack or --download-mpich=1, but none of them worked out.
I'd appreciate it if someone can help me with this compilation problem with
mumps.
Regards,
K. Lee.
p.s.
I do not need to compile petsc for FORTRAN.
------------------------------
Message: 2
Date: Tue, 6 Sep 2011 23:05:26 -0400
From: Sravya Tirukkovalur <stirukkovalur at rnet-tech.com>
Subject: [petsc-users] Build Petsc with single precision
To: petsc-users at mcs.anl.gov
Message-ID:
<CABjTB1gOBgs4AuF7MFiEuM+_WohTdqOCh=zBS9rWFS_iOzB=Bg at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"
Hi,
I am trying to build PETSc with single precision. The configure options I am using are:

    ./configure --download-f-blas-lapack=1 --with-x=0 --with-debugging=1 \
        --CFLAGS="-O3 -g -ggdb" --FFLAGS="-O3 -g" --with-hdf5=1 --download-hdf5=1 \
        --with-batch=0 --known-mpi-shared-libraries=no --with-cuda=1 \
        --with-cudac="nvcc -m64" --with-cusp=1 --with-thrust=1 PETSC_ARCH=SOMETHING

The configuration step completes successfully, but during "make all" it compiles a few source files and then prints an error message saying that PETSc cannot be built with these options and that I should contact the mailing list.
Has anyone faced a similar problem while building PETSc with single precision, or has anyone built it successfully?
Thanks
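For reference, a minimal sketch of a configure line that actually requests single precision; note that --with-precision=single, the flag that selects it, is missing from the options listed above. The CUDA options are dropped here to first check whether precision alone is the problem, and the PETSC_ARCH name is arbitrary:

    ./configure --with-precision=single --download-f-blas-lapack=1 \
        --with-x=0 --with-debugging=1 PETSC_ARCH=arch-single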
------------------------------

Barry,

I have the following code for setting up the SNES and for the numerical
evaluation of the Jacobian:
    ierr = jacobian_matrix_nz_pattern(sys); CHKERRQ(ierr);

    ierr = MatCreateMPIBAIJ(sys.comm, sys.num->nbq[sys.con->n],
                            sys.ldof, sys.ldof, sys.gdof, sys.gdof,
                            PETSC_NULL, sys.idxm, PETSC_NULL, sys.idxn,
                            &sys.P); CHKERRQ(ierr);

    ierr = SNESCreate(sys.comm, &sys.snes); CHKERRQ(ierr);
    ierr = SNESSetFunction(sys.snes, sys.gres[0],
                           base_residual_implicit, &sys); CHKERRQ(ierr);

    ierr = MatCreateSNESMF(sys.snes, &sys.J); CHKERRQ(ierr);
    ierr = MatMFFDSetFromOptions(sys.J); CHKERRQ(ierr);

    ierr = jacobian_diff_numerical(sys, &sys.P); CHKERRQ(ierr);

    ISColoring    iscoloring;
    MatFDColoring fdcoloring;
    ierr = MatGetColoring(sys.P, MATCOLORING_SL, &iscoloring); CHKERRQ(ierr);
    ierr = MatFDColoringCreate(sys.P, iscoloring, &fdcoloring); CHKERRQ(ierr);
    ierr = ISColoringDestroy(iscoloring); CHKERRQ(ierr);
    ierr = MatFDColoringSetFromOptions(fdcoloring); CHKERRQ(ierr);

    ierr = SNESSetJacobian(sys.snes, sys.J, sys.P,
                           SNESDefaultComputeJacobianColor, fdcoloring);
    CHKERRQ(ierr);
As you can see, I'm using a matrix-free algorithm. However, when I
run the code with the -snes_mf_operator option, I get the following error:
Timestep 0: dt = 0.008, T = 0, Res[rho] = 0, Res[rhou] = 0, Res[rhov] = 0, Res[E] = 0, CFL = 0.177549
/*********************Stage 1 of SSPIRK (3,4)******************/
0 SNES Function norm 2.755099585674e+01
[4]PETSC ERROR: --------------------- Error Message ------------------------------------
[4]PETSC ERROR: Object is in wrong state!
[4]PETSC ERROR: Must call MatFDColoringSetFunction()!
I haven't used MatFDColoringSetFunction() anywhere. But when I tried
to add it, I read the following on its help page:
    Notes: This function is usually used automatically by SNES or TS (when one
    uses SNESSetJacobian() with the argument SNESDefaultComputeJacobianColor(),
    or TSSetRHSJacobian() with the argument TSDefaultComputeJacobianColor())
    and only needs to be used by someone computing a matrix via coloring
    directly by calling MatFDColoringApply().
Furthermore, I cannot figure out what the arguments to the (*f)
function should be.
Any help?
Kostas
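For reference, a minimal sketch of what the missing call might look like in the code above, assuming the residual function registered with SNESSetFunction() is the one to reuse; the cast is needed because MatFDColoringSetFunction() takes a generic function pointer:

    ierr = MatFDColoringSetFunction(fdcoloring,
                                    (PetscErrorCode (*)(void)) base_residual_implicit,
                                    &sys); CHKERRQ(ierr);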