[petsc-users] PETSC ERROR: Can only handle MPIU_REAL or MPIU_COMPLEX with ML preconditioner
Chris Richardson
chris at bpi.cam.ac.uk
Wed Oct 1 07:03:11 CDT 2014
On 01/10/2014 11:33, Matthew Knepley wrote:
> On Wed, Oct 1, 2014 at 3:51 AM, Chris Richardson <chris at bpi.cam.ac.uk>
> wrote:
>
>> I have just run into this error after pulling the master branch of
>> petsc, when using the ML preconditioner in FEniCS.
>> It looks like a type error, but I guess FEniCS uses PetscInt, so it
>> should be safe?
>>
>> PETSC ERROR: Can only handle MPIU_REAL or MPIU_COMPLEX data types
>
> We need to see the entire error (with the stack).
>
OK, this kind of thing:-
[1]PETSC ERROR: Can only handle MPIU_REAL or MPIU_COMPLEX data types
^C
Program received signal SIGINT, Interrupt.
0x00007ffff64d9653 in epoll_wait () at ../sysdeps/unix/syscall-template.S:81
81      ../sysdeps/unix/syscall-template.S: No such file or directory.
(gdb) up
#1 0x00007ffff72c8591 in ?? () from /usr/lib/libmpi.so.1
(gdb) up
#2 0x00007ffff72ca787 in opal_event_base_loop () from /usr/lib/libmpi.so.1
(gdb) up
#3 0x00007ffff72eda3e in opal_progress () from /usr/lib/libmpi.so.1
(gdb) up
#4 0x00007fffb42884d5 in ?? () from /usr/lib/openmpi/lib/openmpi/mca_grpcomm_bad.so
(gdb) up
#5 0x00007ffff723d02a in ompi_mpi_finalize () from /usr/lib/libmpi.so.1
(gdb) up
#6 0x00007ffff7766b12 in dolfin::SubSystemsManager::finalize_mpi () at /opt/packages/src/dolfin/dolfin/common/SubSystemsManager.cpp:249
249 MPI_Finalize();
(gdb) up
#7 0x00007ffff7766cde in dolfin::SubSystemsManager::finalize () at /opt/packages/src/dolfin/dolfin/common/SubSystemsManager.cpp:219
219 finalize_mpi();
(gdb) up
#8 0x00007ffff641a149 in __run_exit_handlers (status=1, listp=0x7ffff679d6c8 <__exit_funcs>, run_list_atexit=run_list_atexit@entry=true) at exit.c:82
82      exit.c: No such file or directory.
(gdb) up
#9 0x00007ffff641a195 in __GI_exit (status=<optimised out>) at exit.c:104
104     in exit.c
(gdb) up
#10 0x00007ffff729cc00 in orte_ess_base_app_abort () from /usr/lib/libmpi.so.1
(gdb) up
#11 0x00007ffff729c2a9 in orte_errmgr_base_error_abort () from /usr/lib/libmpi.so.1
(gdb) up
#12 0x00007ffff723bb69 in ompi_mpi_abort () from /usr/lib/libmpi.so.1
(gdb) up
#13 0x00007ffff37b9ee3 in PetscSum_Local (in=0x230f4520, out=0x7fffffffc650, cnt=<optimised out>, datatype=0x7fffffffc428) at /opt/packages/src/petsc/src/sys/objects/pinit.c:302
302       MPI_Abort(MPI_COMM_WORLD,1);
(gdb) up
#14 0x00007fffb10dd4dc in mca_coll_basic_scan_intra () from /usr/lib/openmpi/lib/openmpi/mca_coll_basic.so
(gdb) up
#15 0x00007fffb08c7e19 in mca_coll_sync_scan () from /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so
(gdb) up
#16 0x00007ffff725835d in PMPI_Scan () from /usr/lib/libmpi.so.1
(gdb) up
#17 0x00007ffff413ec45 in MatWrapML_MPIAIJ (newmat=0x16a39228, reuse=MAT_INITIAL_MATRIX, mlmat=0x7ed6090) at /opt/packages/src/petsc/src/ksp/pc/impls/ml/ml.c:422
422       ierr = MPI_Scan(&m,&rstart,1,MPIU_INT,MPIU_SUM,mlmat->comm->USR_com);CHKERRQ(ierr);
(gdb) list
417       ierr = MatSetSizes(A,m,n,PETSC_DECIDE,PETSC_DECIDE);CHKERRQ(ierr);
418 ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);
419 /* keep track of block size for A matrices */
420 ierr = MatSetBlockSize (A,mlmat->num_PDEs);CHKERRQ(ierr);
421 ierr = PetscMalloc3(m,&nnzA,m,&nnzB,m,&nnz);CHKERRQ(ierr);
422       ierr = MPI_Scan(&m,&rstart,1,MPIU_INT,MPIU_SUM,mlmat->comm->USR_com);CHKERRQ(ierr);
423 rstart -= m;
424
425 for (i=0; i<m; i++) {
426 row = gordering[i] - rstart;
(gdb)
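
If I'm reading the trace right, the abort comes from the MPI_Scan at ml.c:422, which passes MPIU_INT together with MPIU_SUM; in this build MPIU_SUM is a user-defined MPI_Op whose handler, PetscSum_Local (pinit.c:302), only accepts MPIU_REAL or MPIU_COMPLEX and calls MPI_Abort otherwise. Here is a minimal plain-MPI sketch of that mechanism (the op name checked_sum and the whole program are illustrative only, not PETSc code):

/* Sketch: a user-defined reduction op that, like PetscSum_Local, rejects
 * datatypes it does not expect.  Uncommenting the first scan reproduces
 * the abort pattern; the built-in MPI_SUM handles integer types fine. */
#include <mpi.h>
#include <stdio.h>

static void checked_sum(void *in, void *inout, int *cnt, MPI_Datatype *dtype)
{
  int i;
  if (*dtype != MPI_DOUBLE) {            /* stand-in for the REAL/COMPLEX check */
    fprintf(stderr, "Can only handle the expected data types\n");
    MPI_Abort(MPI_COMM_WORLD, 1);        /* same escape hatch as PetscSum_Local */
  }
  for (i = 0; i < *cnt; i++) ((double *)inout)[i] += ((double *)in)[i];
}

int main(int argc, char **argv)
{
  MPI_Op op;
  int    m = 10, rstart = 0;

  MPI_Init(&argc, &argv);
  MPI_Op_create(checked_sum, 1, &op);

  /* Failing pattern from MatWrapML_MPIAIJ: an integer scan routed through
   * the type-checked user-defined op aborts inside the op. */
  /* MPI_Scan(&m, &rstart, 1, MPI_INT, op, MPI_COMM_WORLD); */

  /* The built-in MPI_SUM knows about integer datatypes, so this succeeds. */
  MPI_Scan(&m, &rstart, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
  rstart -= m;   /* first locally owned global row, as in ml.c */
  printf("rstart = %d\n", rstart);

  MPI_Op_free(&op);
  MPI_Finalize();
  return 0;
}

So presumably an integer scan needs to go through the built-in MPI_SUM rather than the type-checked MPIU_SUM op, but I'll leave that to the PETSc developers.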