using MUMPS with mg

Barry Smith bsmith at mcs.anl.gov
Mon Jun 23 12:45:14 CDT 2008


> ierr = MatCreate(mlmat->comm->USR_comm,&A);CHKERRQ(ierr);
> ierr = MatSetSizes(A,m,n,PETSC_DECIDE,PETSC_DECIDE);CHKERRQ(ierr);
> ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);
>
>
>> Or MatCreateMPIAIJ()? Do you set the matrix prefix to -mg_coarse?
>> It is not set automatically. You need to either call
>> MatSetFromOptions() on the coarse matrix or directly call
>> MatSetType() with aijmumps to set the type.
>
> I will check if just adding
>
> ierr = MatConvert(A, MATAIJMUMPS, MAT_REUSE_MATRIX,&A);CHKERRQ(ierr);
>
> will do the trick

   Better just to try
ierr = MatSetType(A,MATAIJMUMPS);CHKERRQ(ierr);
instead of
ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);

    Barry

Thankfully, selecting direct solvers will be much cleaner in the next release.
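
Barry's suggestion, spelled out against the MatWrapML_MPIAIJ() snippet quoted below (a sketch only: mlmat, m, and n come from that routine, and MATAIJMUMPS is the matrix type of this 2.3.x-era PETSc; later releases select MUMPS through the factorization rather than through a matrix type):

```c
/* Sketch of the proposed change inside MatWrapML_MPIAIJ();
 * mlmat, m, and n are the variables from the quoted routine. */
Mat            A;
PetscErrorCode ierr;

ierr = MatCreate(mlmat->comm->USR_comm,&A);CHKERRQ(ierr);
ierr = MatSetSizes(A,m,n,PETSC_DECIDE,PETSC_DECIDE);CHKERRQ(ierr);
/* MATAIJMUMPS instead of MATMPIAIJ: the matrix then carries the
 * MUMPS factorization support that -mg_coarse_pc_type lu needs. */
ierr = MatSetType(A,MATAIJMUMPS);CHKERRQ(ierr);
```

One caveat: hard-wiring MATAIJMUMPS makes MUMPS mandatory for every run. Calling MatSetFromOptions() after MatSetType() instead keeps MATMPIAIJ as the default while letting -mg_coarse_mat_type aijmumps override it from the command line (assuming the -mg_coarse_ prefix has been attached to the matrix with MatSetOptionsPrefix()).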


On Jun 23, 2008, at 10:14 AM, Thomas Geenen wrote:

> On Monday 23 June 2008 16:08, Barry Smith wrote:
>> On Jun 23, 2008, at 4:52 AM, Thomas Geenen wrote:
>>> dear Petsc users,
>>>
>>> when using mg with PETSc (ml to be precise),
>>> the default solver on the coarsest level is "redundant"
>>> (I could not find much info about this solver in either the user
>>> manual or the online manuals)
>>>
>>> it seems to be a sequential LU factorization running on all CPUs
>>> at the same time, each doing the same thing?
>>
>>    Correct.
>>
>>> I try to use MUMPS instead,
>>> invoking it by calling
>>> ierr = PetscOptionsSetValue("-mg_coarse_ksp_type", "preonly");
>>> ierr = PetscOptionsSetValue("-mg_coarse_mat_type", "aijmumps");
>>> ierr = PetscOptionsSetValue("-mg_coarse_pc_type", "lu");
>>>
>>> this gives me
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> ------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type!
>>> [0]PETSC ERROR: Matrix type mpiaij  symbolic
>>>
>>> so apparently PETSc did not convert the matrix on the coarsest
>>> level.
>>> If I remove the pc_type option, the conversion option is ignored
>>> and PETSc uses redundant.
>>
>>   How are you generating the coarse grid matrix? Do you get it with
>> DAGetMatrix()?
>
> It's handled in ml.c
>
> from ml.c
> } else { /* convert ML P and R into shell format, ML A into mpiaij format */
>   for (mllevel=1; mllevel<Nlevels; mllevel++){
>     mlmat  = &(ml_object->Pmat[mllevel]);
>     ierr = MatWrapML_SHELL(mlmat,reuse,&gridctx[level].P);CHKERRQ(ierr);
>     mlmat  = &(ml_object->Rmat[mllevel-1]);
>     ierr = MatWrapML_SHELL(mlmat,reuse,&gridctx[level].R);CHKERRQ(ierr);
>
>     mlmat  = &(ml_object->Amat[mllevel]);
>     if (reuse){
>       ierr = MatDestroy(gridctx[level].A);CHKERRQ(ierr);
>     }
>     ierr = MatWrapML_MPIAIJ(mlmat,&gridctx[level].A);CHKERRQ(ierr);
>     level--;
>   }
> }
>
> from MatWrapML_MPIAIJ():
>
> ierr = MatCreate(mlmat->comm->USR_comm,&A);CHKERRQ(ierr);
> ierr = MatSetSizes(A,m,n,PETSC_DECIDE,PETSC_DECIDE);CHKERRQ(ierr);
> ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);
>
>
>> Or MatCreateMPIAIJ()? Do you set the matrix prefix to -mg_coarse?
>> It is not set automatically. You need to either call
>> MatSetFromOptions() on the coarse matrix or directly call
>> MatSetType() with aijmumps to set the type.
>
> I will check if just adding
>
> ierr = MatConvert(A, MATAIJMUMPS, MAT_REUSE_MATRIX,&A);CHKERRQ(ierr);
>
> will do the trick
>
>>
>>    Barry
>>
>>> cheers
>>> Thomas
>



