[petsc-users] Question about PETSc installs and MPI

Francis Poulin fpoulin at uwaterloo.ca
Sat Jul 19 13:09:33 CDT 2014


Hello Satish,

Thanks for the help on both counts.  

1) I installed csh and petscmpiexec works, but I think I will stick to mpirun.

2) Your modifications of the code make things work beautifully!

Again, sorry to bug you but the help is greatly appreciated.

Cheers, Francis

------------------
Francis Poulin
Associate Professor
Associate Chair, Undergraduate Studies
Department of Applied Mathematics
University of Waterloo

email:           fpoulin at uwaterloo.ca
Web:            https://uwaterloo.ca/poulin-research-group/
Telephone:  +1 519 888 4567 x32637


________________________________________
From: Satish Balay [balay at mcs.anl.gov]
Sent: Saturday, July 19, 2014 11:52 AM
To: petsc-users
Cc: Francis Poulin
Subject: Re: [petsc-users] Question about PETSc installs and MPI

It appears the original code was tested with 'real' numbers and petsc-3.4; it was not
tested with complex numbers. I had to make the following changes for it to run with
complex scalars and petsc-3.5:

Satish

---------

balay at es^/scratch/balay/test $ ../petsc-3.5.0/arch-linux2-c-debug/bin/mpiexec -n 8 valgrind --tool=memcheck -q ./main
Dot Product Check...

    Dot Product (should be 2*256) = 512.000000

Done.

Derivative Check...

           Min (should be 0; if halos not defined properly, would be -1.0/dDeltaX) = 0.000000
           Max (should be 0; if halos not defined properly, would be +1.0/dDeltaX) = 0.000000

Done.
balay at es^/scratch/balay/test $ diff -Nru main.cpp.orig main.cpp
--- main.cpp.orig            2014-07-19 10:48:21.072486076 -0500
+++ main.cpp                 2014-07-19 10:47:26.977985006 -0500
@@ -31,9 +31,9 @@
   // DMDA Environment
   //////////////////////////////////////////////////////////////////////////
   // Boundary Conditions
-  DMDABoundaryType bx = DMDA_BOUNDARY_GHOSTED;
-  DMDABoundaryType by = DMDA_BOUNDARY_NONE; // no boundary points since nGx = 1
-  DMDABoundaryType bz = DMDA_BOUNDARY_NONE; // no boundary points since nGz = 1
+  DMBoundaryType bx = DM_BOUNDARY_GHOSTED;
+  DMBoundaryType by = DM_BOUNDARY_NONE; // no boundary points since nGx = 1
+  DMBoundaryType bz = DM_BOUNDARY_NONE; // no boundary points since nGz = 1

        //////////////////////////////////////////////////////////////////////////
        // Stencil type
@@ -96,7 +96,7 @@
   DMDAGetCorners(m_3da, &si, 0, 0, &ei, 0, 0);

        // Access the elements of the local arrays as C++ multi-dim. array structures
-       double ***pvecA, ***pvecB;
+       PetscScalar ***pvecA, ***pvecB;
        DMDAVecGetArray(m_3da, m_vecA, &pvecA);
        DMDAVecGetArray(m_3da, m_vecB, &pvecB);

@@ -132,7 +132,7 @@
   VecAssemblyEnd(m_gTempB);

        // Take dot product
-       double dDotProduct;
+       PetscScalar dDotProduct;
        VecDot(m_gTempA, m_gTempB, &dDotProduct);

        // Output dot product to check
@@ -163,7 +163,7 @@
   // Alternative (local vector): VecDuplicate(m_vecA, &m_vecDx);

        // Access the data in the vectors by using pointers
-       double ***pDx;
+       PetscScalar ***pDx;
        DMDAVecGetArray(m_3da, m_vecA, &pvecA);
        DMDAVecGetArray(m_3da, m_vecDx, &pDx);

balay at es^/scratch/balay/test $
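
For reference, here is a minimal sketch (not part of the diff above, just an
illustration using the same m_gTempA/m_gTempB names as main.cpp) of how the
dot-product output can be made complex-safe. With --with-scalar-type=complex,
PetscScalar is a complex type, so printing it through a plain %f format is not
meaningful; take the real and imaginary parts explicitly:

  PetscScalar dDotProduct;
  VecDot(m_gTempA, m_gTempB, &dDotProduct);

  /* Report both parts; PetscRealPart()/PetscImaginaryPart() also compile
     in a real build, where the imaginary part is simply 0. */
  PetscPrintf(PETSC_COMM_WORLD,
              "    Dot Product (should be 2*256) = %f + %f i\n",
              (double)PetscRealPart(dDotProduct),
              (double)PetscImaginaryPart(dDotProduct));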



On Sat, 19 Jul 2014, Satish Balay wrote:

> On Sat, 19 Jul 2014, Francis Poulin wrote:
>
> > Hello Barry,
> >
> > I was one of the two people that had difficulties with getting the correct results with John's code.  Previously, I didn't have valgrind installed so I installed it using apt-get.  Then I configured it using the following:
> >
> > ./configure --with-scalar-type=complex --with-cc=gcc --with-cxx=c++ --with-fc=gfortran --with-c2html=0 --download-mpich --download-scalapack --download-hypre
> >
> > This is on Ubuntu and differs from what I tried before in that I am now downloading MPICH, ScaLAPACK, and HYPRE. I decided to download ScaLAPACK since it seems like it could be useful. I was told that HYPRE doesn't work with complex variables. Too bad, but not a big deal.
>
> PETSc does not use ScaLAPACK directly. It is useful only if you are also using MUMPS:
>
> --download-metis --download-parmetis --download-scalapack --download-mumps
>
> >
> > It completes the configure, make all, and make test, and even gives me the parallel efficiency figures (or not-quite-efficiency, maybe). I didn't catch any errors, but there may be errors in the log. When I went to try building an example, I found that I can't use petscmpiexec to run anything in serial or parallel.
> >
> > fpoulin at vortex:~/software/petsc/src/ts/examples/tutorials$ /home/fpoulin/software/petsc/bin/petscmpiexec -n 1 ./ex1
> > -bash: /home/fpoulin/software/petsc/bin/petscmpiexec: /bin/csh: bad interpreter: No such file or directory
>
> Perhaps you do not have csh installed on this machine. You can use mpiexec directly:
>
> ./ex1
> /home/fpoulin/software/petsc/arch-linux2-c-debug/bin/mpiexec -n 1 ./ex1
> /home/fpoulin/software/petsc/arch-linux2-c-debug/bin/mpiexec -n 2 ./ex1
>
> Satish
>
> >
> > I am sorry to bother you with this. I am also having issues with my installation on my Mac, but I thought that if I can figure this one out, maybe I will have a better idea of what's wrong with the other.
> >
> > Thanks,
> > Francis
> >
> >
> > ------------------
> > Francis Poulin
> > Associate Professor
> > Associate Chair, Undergraduate Studies
> > Department of Applied Mathematics
> > University of Waterloo
> >
> > email:           fpoulin at uwaterloo.ca
> > Web:            https://uwaterloo.ca/poulin-research-group/
> > Telephone:  +1 519 888 4567 x32637
> >
> >
> > ________________________________________
> > From: Barry Smith [bsmith at mcs.anl.gov]
> > Sent: Friday, July 18, 2014 9:57 PM
> > To: John Yawney
> > Cc: petsc-users at mcs.anl.gov; Francis Poulin; Kim Usi
> > Subject: Re: [petsc-users] Question about PETSc installs and MPI
> >
> >    I ran the program on Linux with 1, 2, and 4 processes under valgrind for both types of boundary conditions, and it ran fine.
> >
> >    I suggest your colleagues do a test configure of PETSc using --download-mpich and see if they still get the problem or if it runs ok.
> >
> >    They can also run with valgrind and see what it reports: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> >
> >    Barry
> >
> > On Jul 18, 2014, at 8:16 PM, John Yawney <jyawney123 at gmail.com> wrote:
> >
> > > Hello,
> > >
> > > I had a question about PETSc installations. On my local computer I configured PETSc (v 3.4.2) using the options:
> > >
> > > ./configure --with-cc=mpicc --with-cxx=mpic++ --download-f-blas-lapack --download-mpich --download-hypre
> > >
> > > I wrote a test program that defines a vector using DMDAs, computes a dot product, exchanges halo elements, and computes a low-order FD derivative of the vector. Under my installation of PETSc everything works fine. For some reason, when my colleagues run the program, they get segmentation faults. If they change the y and z boundary types to GHOSTED as well, the program runs to the end (though it still seg faults there), but the dot product they get is only the local value. I've attached the main.cpp file for this program.
> > >
> > > When they installed their versions of PETSc they didn't use the --download-mpich option but instead used either:
> > > ./configure --download-f-blas-lapack --with-scalar-type=complex
> > > or with the option: --with-mpi-dir=/home/kim/anaconda/pkgs/mpich2-1.3-py27_0
> > >
> > > Could this be causing a problem with the parallelization under PETSc?
> > >
> > > Thanks for the help and sorry for the long question.
> > >
> > > Best regards,
> > > John
> > > <main.cpp>
> >
> >
>
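
For anyone following the thread, here is a minimal, self-contained sketch of
the DMDA setup and halo exchange being discussed, written against the
petsc-3.5 API (this is not John's attached main.cpp; the grid sizes are
illustrative and error checking is omitted for brevity). The ghosted
x-boundary plus a width-1 star stencil is what lets DMGlobalToLocal fill the
extra layer of points used by the FD derivative:

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM  da;
    Vec global, local;

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* 3-D DMDA, 256 x 1 x 1 points, 1 dof, stencil width 1:
       ghost cells in x only, as in the diff above. */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_GHOSTED, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR,
                 256, 1, 1,              /* global grid sizes */
                 PETSC_DECIDE, 1, 1,     /* processes per direction */
                 1, 1,                   /* dof, stencil width */
                 NULL, NULL, NULL, &da);

    DMCreateGlobalVector(da, &global);
    DMCreateLocalVector(da, &local);

    /* ... fill 'global' (e.g. with DMDAVecGetArray) ... */

    /* Halo exchange: ghost entries of 'local' are filled from neighbours;
       ghost points at the GHOSTED physical boundary are left for the user. */
    DMGlobalToLocalBegin(da, global, INSERT_VALUES, local);
    DMGlobalToLocalEnd(da, global, INSERT_VALUES, local);

    VecDestroy(&local);
    VecDestroy(&global);
    DMDestroy(&da);
    PetscFinalize();
    return 0;
  }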

