[petsc-users] parallel PCASM

Reza Madankan rm93 at buffalo.edu
Wed Dec 21 22:10:28 CST 2011


Thanks a lot, Matthew. That was a great help.
It seems I was completely off track. I am looking at what you suggested
and have made good progress so far.

Thanks again
Reza


On Wed, Dec 21, 2011 at 4:40 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Wed, Dec 21, 2011 at 3:30 PM, Reza Madankan <rm93 at buffalo.edu> wrote:
>
>> Hi Matthew;
>> Thank you for your quick reply.
>> Sorry for the wrong terminology; I started working with PETSc only a few
>> days ago.
>> The whole story is that I have a linear system Ax = b which I want to
>> solve with different preconditioners, just to see the effect of each of
>> them on the solution and the residual error. I have already used PCJACOBI
>> and PCILU to find the solution, and I would also like to see the effect of
>> PCASM. On a single processor PCASM performs like PCILU, but its result
>> differs from PCILU when it runs on multiple processors. So I am trying to
>> get it to run on multiple processors using the following command:
>>
>> mpiexec -np 2 ./pcasm
>>
>> but I get the error message that I showed before. My question now is: how
>> can I get PCASM to run on multiple processors?
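[Editorial note: what is being asked for here can be done with PETSc's runtime
options alone, a sketch assuming the program calls KSPSetFromOptions (as the
code later in this thread does); the binary name ./pcasm is taken from the
thread, the flags are standard PETSc options:

  mpiexec -np 1 ./pcasm -pc_type ilu -ksp_monitor -ksp_view
  mpiexec -np 2 ./pcasm -pc_type jacobi -ksp_monitor -ksp_view
  mpiexec -np 2 ./pcasm -pc_type asm -sub_pc_type ilu -ksp_monitor -ksp_view

-ksp_monitor prints the residual history and -ksp_view shows which solver and
preconditioner were actually used, so the runs can be compared directly.]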
>>
>
> It is not ASM that is the problem. You have a bug in your code. Use
> valgrind to find it.
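[Editorial note: running an MPI PETSc program under valgrind typically looks
something like the following; the exact flags are a suggestion, not taken from
this thread:

  mpiexec -np 2 valgrind --tool=memcheck -q --track-origins=yes ./pcasm

Each rank then reports uninitialized reads and out-of-bounds accesses with a
stack trace, which is usually enough to locate the bad value.]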
>
> The larger problem here is that you have misunderstood the point of PETSc:
> you should never have to change your code to use a different solver. You
> should play with the PETSc examples for a while:
>
>   cd src/snes/examples/tutorials
>   make ex5
>   ./ex5 -snes_monitor -ksp_monitor -snes_view
>   $MPIEXEC -n 2 ./ex5 -snes_monitor -ksp_monitor -snes_view
>   $MPIEXEC -n 2 ./ex5 -snes_monitor -ksp_monitor -snes_view -pc_type jacobi
>   $MPIEXEC -n 2 ./ex5 -snes_monitor -ksp_monitor -snes_view -pc_type asm
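[Editorial note: a minimal sketch of the point being made here. The source
never names a particular preconditioner, and -ksp_type/-pc_type select the
solver at run time. The matrix A, right-hand side b, and solution vector x are
assumed to be created and assembled elsewhere; this fragment only shows the
solver setup, using the PETSc 3.2 calling conventions that appear later in
this thread:

  Mat            A;       /* assumed assembled elsewhere */
  Vec            b, x;    /* assumed created with a layout matching A */
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  /* reads -ksp_type, -pc_type, -pc_asm_overlap, -sub_pc_type, ... */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

With this structure, -pc_type jacobi, -pc_type asm, or -pc_type asm
-sub_pc_type ilu change the preconditioner without touching the source.]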
>
>     Matt
>
>> Thanks again
>> Reza
>>
>>
>>
>> On Wed, Dec 21, 2011 at 3:19 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Wed, Dec 21, 2011 at 2:12 PM, Reza Madankan <rm93 at buffalo.edu> wrote:
>>>
>>>> Hello,
>>>> I am trying to parallelize the ASM preconditioner while solving a
>>>> linear system. For more detail, the relevant parts of the code are
>>>> included below:
>>>>
>>>
>>> I have no idea what "parallelize ASM" might mean.
>>>
>>> Below you likely have a bad value in b, the rhs you pass to the solver
>>> on line 487 of your code.
>>>
>>>   Matt
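[Editorial note: one way to check for that before the solve, a diagnostic
sketch using the names from the code below rather than part of the original
program: compute the norm of the right-hand side just before KSPSolve, or run
with -fp_trap so the first Inf/NaN aborts with a traceback.

  PetscReal bnorm;
  ierr = VecNorm(PzyTvec,NORM_2,&bnorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"||b|| before KSPSolve = %G\n",bnorm);CHKERRQ(ierr);
  if (PetscIsInfOrNanReal(bnorm)) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_FP,"rhs contains Inf or NaN");
]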
>>>
>>>
>>>> static char help[] = "Reading in a matrix\n";
>>>>
>>>> #include <stdio.h>
>>>> #include <math.h>
>>>> #include <stdlib.h>
>>>>
>>>> #include "petscvec.h"   /* This enables us to use vectors. */
>>>> #include "petscmat.h"   /* This enables us to use Matrices. It includes the
>>>>                            petscvec header file. */
>>>> #include "petscksp.h"   /* Now we can solve linear systems. Solvers used are KSP. */
>>>>
>>>> extern PetscErrorCode MyKSPMonitor(KSP,PetscInt,PetscReal,void*);
>>>>
>>>> int main(int argc, char **argv)
>>>> {
>>>>   /* Declaration of Matrix A and some vectors x */
>>>>   Vec  w, ypcq, mean_tmp, x, PzyTvec;
>>>>   Mat  Ym, Y, Mean, Ypcq, YpcqT, Pzz, InnProd, Para, Pzy, PzyT, PzzT, K,
>>>>        KT, PK, PzzTMP, Pmat, XYmat, KY;
>>>>
>>>>   FILE *fp, *fp1, *fp2, *fp3;
>>>>
>>>>   PetscInt index, i, j, k, ns=16, nw=100, tindex_f=120, col, tind, mz, nz, its;
>>>>   /*
>>>>   printf("Enter number of sensors (ns):\n");
>>>>   scanf("%d",&ns);
>>>>
>>>>   printf("Enter number of time steps (tindex_f):\n");
>>>>   scanf("%d",&tindex_f);
>>>>   */
>>>>   PetscMPIInt size;
>>>>
>>>>   PC  pc;
>>>>   KSP ksp;
>>>>   PetscReal   norm, tol=1.e-14;
>>>>   PetscBool   nonzeroguess = PETSC_FALSE;
>>>>   PetscViewer mviewer;
>>>>
>>>>   PetscScalar scalar, rhsy[ns*tindex_f*nw], rhsym[ns*tindex_f], rhsw[nw],
>>>>               Ymat[tindex_f*ns][nw], tmp, rhsp[2*nw], xvec[ns*tindex_f],
>>>>               pmat[4], xymat[2], rnorm;
>>>>
>>>>   PetscErrorCode ierr;
>>>>
>>>>   /* This part is needed for the help flag supplied at run-time */
>>>>   ierr = PetscInitialize(&argc,&argv,(char*)0,help);CHKERRQ(ierr);
>>>>
>>>>   ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
>>>>   //if (size != 1) SETERRQ(PETSC_COMM_WORLD,1,"This is a uniprocessor example");
>>>>   PetscPrintf(PETSC_COMM_WORLD,"size is: %D\n",size);
>>>>   int nstindex_f = ns*tindex_f;
>>>>   ierr = PetscOptionsGetInt(PETSC_NULL,"-n",&nstindex_f,PETSC_NULL);CHKERRQ(ierr);
>>>>
>>>>   .
>>>>   .
>>>>   .
>>>>
>>>>   // Solution for K:
>>>>   // --------------
>>>>
>>>>   ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
>>>>   ierr = PetscOptionsGetInt(PETSC_NULL,"-n",&nstindex_f,PETSC_NULL);CHKERRQ(ierr);
>>>>   ierr = PetscOptionsGetBool(PETSC_NULL,"-nonzero_guess",&nonzeroguess,PETSC_NULL);CHKERRQ(ierr);
>>>>
>>>>   // KSPMonitorSet(ksp,MyKSPMonitor,PETSC_NULL,0);
>>>>
>>>>   ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
>>>>   ierr = PetscObjectSetName((PetscObject) x, "Solution");CHKERRQ(ierr);
>>>>   ierr = VecSetSizes(x,PETSC_DECIDE,ns*tindex_f);CHKERRQ(ierr);
>>>>   ierr = VecSetFromOptions(x);CHKERRQ(ierr);
>>>>   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
>>>>
>>>>   // ierr = KSPMonitorSet(ksp,PetscInt n,PetscReal rnorm, void *);
>>>>   // ierr = monitor(ksp,Int it,PetscReal rnorm,);
>>>>
>>>>   ierr = MatTranspose(Pzz,MAT_INITIAL_MATRIX,&PzzT);
>>>>   MatAssemblyBegin(PzzT,MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(PzzT,MAT_FINAL_ASSEMBLY);
>>>>   ierr = KSPSetOperators(ksp,PzzT,PzzT,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
>>>>
>>>>   ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
>>>>   ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
>>>>   ierr = KSPSetTolerances(ksp,1e-6,PETSC_DEFAULT,PETSC_DEFAULT,1e+9);CHKERRQ(ierr);
>>>>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>>   if (nonzeroguess)
>>>>   {
>>>>     PetscScalar p = 0.5;
>>>>     ierr = VecSet(x,p);CHKERRQ(ierr);
>>>>     ierr = KSPSetInitialGuessNonzero(ksp,PETSC_TRUE);CHKERRQ(ierr);
>>>>   }
>>>>   ierr = MatTranspose(Pzy,MAT_INITIAL_MATRIX,&PzyT);
>>>>   MatAssemblyBegin(PzyT,MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(PzyT,MAT_FINAL_ASSEMBLY);
>>>>   ierr = MatCreate(PETSC_COMM_WORLD,&KT);CHKERRQ(ierr);
>>>>   ierr = MatSetSizes(KT,PETSC_DECIDE,PETSC_DECIDE,ns*tindex_f,2);CHKERRQ(ierr);
>>>>   ierr = MatSetFromOptions(KT);CHKERRQ(ierr);
>>>>   int ctr = 0;
>>>>   for (ctr=0; ctr<2; ctr++)
>>>>   {
>>>>     ierr = VecCreate(PETSC_COMM_WORLD,&PzyTvec);
>>>>     ierr = VecSetSizes(PzyTvec,PETSC_DECIDE,ns*tindex_f);
>>>>     ierr = VecSetFromOptions(PzyTvec);
>>>>     ierr = MatGetColumnVector(PzyT,PzyTvec,ctr);
>>>>
>>>>     ierr = KSPSolve(ksp,PzyTvec,x);CHKERRQ(ierr);
>>>>     ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
>>>>     ierr = KSPGetResidualNorm(ksp,&norm);
>>>>     printf("residual norm is=%e\n",norm);
>>>>     printf("iteration number is=%d\n",its);
>>>>     ierr = KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>>>>
>>>>     // printf("%d\n", ctr);
>>>>     for (i=0; i<ns*tindex_f; i++)
>>>>     {
>>>>       ierr = VecGetValues(x,1,&i,&xvec[i]);
>>>>       ierr = MatSetValues(KT,1,&i,1,&ctr,&xvec[i],INSERT_VALUES);
>>>>     }
>>>>   }
>>>>   MatAssemblyBegin(KT,MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(KT,MAT_FINAL_ASSEMBLY);
>>>>   MatTranspose(KT,MAT_INITIAL_MATRIX,&K);
>>>>   MatAssemblyBegin(K,MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(K,MAT_FINAL_ASSEMBLY);
>>>>
>>>>   MatPtAP(Pzz,KT,MAT_INITIAL_MATRIX,1.0,&PK);
>>>>   //MatCopy(Pzz,PzzTMP,SAME_NONZERO_PATTERN);
>>>>   ierr = MatAXPY(Pmat,-1,PK,DIFFERENT_NONZERO_PATTERN);
>>>>   MatView(Pmat,PETSC_VIEWER_STDOUT_WORLD);
>>>>
>>>>   ierr = MatAXPY(Ym,-1,Mean,DIFFERENT_NONZERO_PATTERN);
>>>>   MatAssemblyBegin(Ym,MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(Ym,MAT_FINAL_ASSEMBLY);
>>>>
>>>>   ierr = MatMatMult(K,Ym,MAT_INITIAL_MATRIX,PETSC_DEFAULT,&KY);
>>>>   ierr = MatAssemblyBegin(KY,MAT_FINAL_ASSEMBLY);
>>>>   ierr = MatAssemblyEnd(KY,MAT_FINAL_ASSEMBLY);
>>>>   ierr = MatAXPY(XYmat,1,KY,DIFFERENT_NONZERO_PATTERN);
>>>>   MatView(XYmat,PETSC_VIEWER_STDOUT_WORLD);
>>>>
>>>>   ierr = MatDestroy(&Mean);
>>>>
>>>>   ierr = PetscFinalize();CHKERRQ(ierr);
>>>>
>>>>   return 0;
>>>> }
>>>>
>>>>
>>>> I am using the following commands to compile and run the code:
>>>>
>>>> mpicc pcasm.c -o pcasm -l petsc
>>>> mpiexec -np 2 ./pcasm
>>>>
>>>> But when I run the code I get the following error message, which I don't
>>>> understand:
>>>>
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message
>>>> ------------------------------------
>>>> [0]PETSC ERROR: Floating point exception!
>>>> [0]PETSC ERROR: Infinite or not-a-number generated in norm!
>>>> [0]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 3, Fri Sep 30
>>>> 10:28:33 CDT 2011
>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>> [0]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [0]PETSC ERROR: ./pcasm on a linux-imp named k07n14.ccr.buffalo.edu by
>>>> rm93 Wed Dec 21 15:05:33 2011
>>>> [0]PETSC ERROR: Libraries linked from
>>>> /util/petsc/petsc-3.2-p3/linux-impi-mkl/lib
>>>> [0]PETSC ERROR: Configure run at Fri Oct 21 08:36:23 2011
>>>> [0]PETSC ERROR: Configure options --CC=/util/intel/impi/
>>>> 4.0.3.008/intel64/bin/mpiicc --FC=/util/intel/impi/
>>>> 4.0.3.008/intel64/bin/mpiifort --CXX=/util/intel/impi/
>>>> 4.0.3.008/intel64/bin/mpiicpc--with-blas-lapack-dir=/util/intel/composer_xe_2011_sp1/mkl/lib/intel64
>>>> --download-hypre=1 --with-debugging=0 -PETSC_ARCH=linux-impi-mkl
>>>> --with-shared-libraries=1
>>>> [0]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [0]PETSC ERROR: VecNorm() line 167 in src/vec/vec/interface/rvector.c
>>>> [0]PETSC ERROR: VecNormalize() line 261 in
>>>> src/vec/vec/interface/rvector.c
>>>> [0]PETSC ERROR: GMREScycle() line 128 in src/ksp/ksp/impls/gmres/gmres.c
>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 231 in
>>>> src/ksp/ksp/impls/gmres/gmres.c
>>>> [0]PETSC ERROR: KSPSolve() line 423 in src/ksp/ksp/interface/itfunc.c
>>>> [0]PETSC ERROR: main() line 487 in "unknowndirectory/"pcasm.c
>>>> application called MPI_Abort(MPI_COMM_WORLD, 72) - process 0
>>>> rank 0 in job 2  k07n14.ccr.buffalo.edu_51735   caused collective abort
>>>> of all ranks
>>>>   exit status of rank 0: return code 72
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> Could anyone help me with this problem? Thanks in advance for any help.
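[Editorial note: possibly unrelated to the floating-point error above, but
relevant to running this code on more than one process: VecGetValues can only
access entries owned by the calling process, so the loop that fills KT by
iterating i over all ns*tindex_f global indices generally needs the
ownership-range pattern sketched below. Names are taken from the code above;
this is a general sketch, not a confirmed diagnosis of the reported error.

  PetscInt    rstart, rend, row;
  PetscScalar val;
  ierr = VecGetOwnershipRange(x,&rstart,&rend);CHKERRQ(ierr);   /* local rows [rstart,rend) */
  for (row = rstart; row < rend; row++) {
    ierr = VecGetValues(x,1,&row,&val);CHKERRQ(ierr);           /* only locally owned entries */
    ierr = MatSetValues(KT,1,&row,1,&ctr,&val,INSERT_VALUES);CHKERRQ(ierr);
  }
  /* followed by MatAssemblyBegin/End(KT,MAT_FINAL_ASSEMBLY) as in the original code */
]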
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>