[petsc-users] general problem with application based on PETSc

yan chang yan.chang.suse at gmail.com
Sun Jan 19 07:07:59 CST 2014


Dear Karli,
Thanks for your suggestions!
Right now I am testing 'src/ksp/ksp/examples/tutorials/ex11.c' with n=2.
I rewrote the example as a C++ class and ran it with mpirun -np 2 .....
I notice that the vector x is distributed across 2 processes, but the
matrix A does not behave as you described ('MatView should report your
matrix R as being equally split over the MPI ranks'):

Norm of error 1.19381e-16 iterations 4
Vector Object: 2 MPI processes
  type: mpi
Process [0]
4.16334e-17
0 + 1.38778e-17 i
Process [1]
0
1.11022e-16
Matrix Object: 1 MPI processes
  type: mpiaij
row 0: (0, -7.11111 + 0.0800036 i) (1, -1)  (2, -1)
row 1: (0, -1)  (1, -7.11111 + 0.00111359 i) (3, -1)
row 2: (0, -1)  (2, -7.11111 + 0.0601271 i) (3, -1)
row 3: (1, -1)  (2, -1)  (3, -7.11111 + 0.0943018 i)

I expected the Matrix Object to report 2 MPI processes as well.
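
To double-check the distribution, here is a minimal sketch (assuming the
matrix is called A and was created on PETSC_COMM_WORLD, as in the example)
that lets each rank print the range of rows it owns:

   PetscMPIInt    rank;
   PetscInt       rstart, rend;
   PetscErrorCode ierr;
   ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
   ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);  /* rows [rstart, rend) live on this rank */
   ierr = PetscPrintf(PETSC_COMM_SELF, "rank %d owns rows %d to %d\n",
                      rank, (int)rstart, (int)(rend - 1));CHKERRQ(ierr);

With -np 2 and the 4x4 matrix above, each rank should report two rows; if
rank 0 reports all four, the matrix really is living on a single process.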

Thanks again for your help.

Best regards,
Yan




On Sun, Jan 19, 2014 at 6:34 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:

> Hi,
>
>
> On 01/19/2014 10:39 AM, yan chang wrote:
>
>> I have used Eigen (a C++ template library) to set up a spin dynamics
>> simulation program, and now I would like to speed up the simulation in
>> parallel using PETSc, mostly using sparse Mat and Vec operations. After
>> some rewriting (just replacing the matrix operations with PETSc), I can
>> first set up a spin system and calculate its relaxation matrix R.
>>
>> PetscErrorCode ierr;
>> PetscInitialize(&argc, &argv, (char*) 0, help);
>>    SpinSystem sys;
>>    sys.set_magnet(14.1);
>>    sys.set_isotopes("1H 13C");
>>    sys.set_basis("sphten_liouv");
>>    sys.set_zeeman_eigs("(7 15 -22) (11 18 -29)");
>>    sys.set_zeeman_euler("(1.0472 0.7854 0.6283) (0.5236 0.4488 0.3927)");
>>    sys.set_coordinates("1 (0 0 0) 2 (0 0 1.02)");
>>    sys.create();
>>    Relaxation relax(sys);
>>    Mat R = relax.Rmat();
>>    ierr = MatView(R, PETSC_VIEWER_STDOUT_WORLD);
>>    ierr = MatDestroy(&R);
>>    CHKERRQ(ierr);
>>    ierr = PetscFinalize();
>>
>> When I run it with mpirun -np 1 ./Release/spin-scenarios-petsc it works fine.
>>
>
> Please don't forget to check the error code through CHKERRQ after *each*
> function call.
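
A minimal sketch of that pattern, applied to the tail of the snippet above
(same variable names as in the snippet):

   ierr = MatView(R, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
   ierr = MatDestroy(&R);CHKERRQ(ierr);
   ierr = PetscFinalize();
   return ierr;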
>
>
>
>> However, if I set np to 2 or more, the program generates one copy of the
>> spin system object per process, and it takes even longer to get the
>> result. I am very confused, because in this implementation the Mat and
>> Vec variables are not sequential as they are in Eigen, so could you give
>> me some advice on how to organize my program based on PETSc?
>>
>
> a) Did you verify that you are using the correct 'mpirun'? You need to use
> the one you specified for the PETSc installation. Otherwise, the same
> executable will be called multiple times in sequential mode. Hint: MatView
> should report your matrix R as being equally split over the MPI ranks.
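
One quick way to confirm the launcher matches the build is to report the
size of PETSC_COMM_WORLD right after PetscInitialize (a small sketch, not
part of the original program): with a matching mpirun -np 2 it prints size
2 on both ranks, whereas a mismatched mpirun typically starts two
independent copies that each report size 1.

   PetscMPIInt size, rank;
   ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);
   ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
   ierr = PetscPrintf(PETSC_COMM_SELF, "rank %d of %d\n", rank, size);CHKERRQ(ierr);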
>
> b) To get reasonable performance and scalability you need to consider the
> multiple MPI ranks in your SpinSystem and Relaxation objects already. Each
> MPI rank should then set up its part of the global system. Have a look at
> e.g. the examples in src/ksp/ksp/examples/tutorials/ or
> src/snes/examples/tutorials to get an idea how this is done. Keep in mind
> that the distributed matrix is decomposed by rows, i.e. each MPI rank
> stores a horizontal strip of the full matrix R, but PETSc will take care of
> the details if you set values in rows which are not locally owned.
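
A bare-bones sketch of that row-wise setup (hypothetical size n, diagonal
entries only as a placeholder; each rank fills just the rows it owns):

   Mat      R;
   PetscInt n = 16, rstart, rend, i;
   ierr = MatCreate(PETSC_COMM_WORLD, &R);CHKERRQ(ierr);
   ierr = MatSetSizes(R, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);  /* PETSc splits the n rows over the ranks */
   ierr = MatSetFromOptions(R);CHKERRQ(ierr);
   ierr = MatSetUp(R);CHKERRQ(ierr);
   ierr = MatGetOwnershipRange(R, &rstart, &rend);CHKERRQ(ierr);           /* this rank owns rows [rstart, rend) */
   for (i = rstart; i < rend; i++) {
     PetscScalar v = 1.0;
     ierr = MatSetValues(R, 1, &i, 1, &i, &v, INSERT_VALUES);CHKERRQ(ierr);
   }
   ierr = MatAssemblyBegin(R, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
   ierr = MatAssemblyEnd(R, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);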
>
>
>
>> I am a beginner with both Open MPI and PETSc. Also, after the simple
>> setup part there will be more time-consuming work to do; that's why I
>> switched from Eigen to PETSc.
>> Thanks a lot!
>>
>
> My advice is to familiarize yourself with some of the simpler tutorials to
> see how such systems are set up in parallel. Then incrementally migrate
> these techniques over to your use case.
>
> Best regards,
> Karli
>
>