[petsc-users] MatView and PetscViewerASCIIOpen problem when cpu > 1
Barry Smith
bsmith at mcs.anl.gov
Mon Jun 25 21:46:50 CDT 2012
It prints the matrix in the natural (global) ordering and does not print the division of the matrix into its two parts for each process; that is why the header says "1 MPI processes" rather than 2.
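
If you just want to confirm how the rows are split between the two processes, MatGetOwnershipRange reports each process's row range directly; a minimal sketch (rstart and rend are illustrative names):

PetscInt rstart,rend
call MatGetOwnershipRange(A_semi,rstart,rend,ierr)
! rend is one past the last locally owned row
write(*,*) 'rows ',rstart,' to ',rend-1,' are on this process'
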
Barry
On Jun 25, 2012, at 9:44 PM, TAY wee-beng wrote:
> On 25/6/2012 5:57 PM, Barry Smith wrote:
>> So you run with -mat_ascii_output_large and it hangs? How big is the matrix?
>>
>> If it truly hangs and doesn't just take a long time, you can run with -start_in_debugger and then hit Control-C in the debugger after it has been "hanging" to see where it is.
>>
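>> For example, something along these lines (the executable name is just a placeholder):
>>
>> mpiexec -n 2 ./my_solver -start_in_debugger
>>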
>> But why not just run with a matrix of fewer than 1024 rows to check that the matrix is correct?
>>
>> Barry
>
> I have switched to a 2D case. The matrix is much smaller (still more than 1024 rows). The matrix is now saved to a txt file. However, when running on 2 CPUs, I got:
>
> Matrix Object: 1 MPI processes
> type: mpiaij
> row 0: (0, -2) (132, 3) (264, -1)
> row 1: (1, -2) (133, 3) (265, -1)
> row 2: (2, -2) (134, 3) (266, -1)
> ...
>
> Shouldn't it be 2 MPI processes? Like in a vector:
>
> Vector Object:Vec_84000004_0 2 MPI processes
> type: mpi
> Process [0]
> 0
> 0
> ...
>
> I used :
>
> call PetscViewerASCIIOpen(MPI_COMM_WORLD,'A_semi.txt',viewer,ierr)
>
> call MatView(A_semi,viewer,ierr)
>
> call PetscViewerDestroy(viewer,ierr)
>
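> As suggested elsewhere in the thread, a binary dump avoids the 1024-row ASCII limit; a rough sketch along the same lines (the file name is illustrative):
>
> call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A_semi.bin',FILE_MODE_WRITE,viewer,ierr)
>
> call MatView(A_semi,viewer,ierr)
>
> call PetscViewerDestroy(viewer,ierr)
>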
> Is there some error in my matrix? I also noticed that the matrix created w/o using DM gives the same "1 MPI processes" output.
>
> I create it using :
>
> call DMDACreate2d(MPI_COMM_WORLD,DMDA_BOUNDARY_NONE,DMDA_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> PETSC_DECIDE,num_procs,i1,i1,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr)
>
> call DMCreateMatrix(da,MATAIJ,A_semi,ierr)
>
> Thanks!
>
>
>>
>> On Jun 25, 2012, at 4:39 PM, TAY wee-beng wrote:
>>
>>> On 25/6/2012 4:24 PM, Barry Smith wrote:
>>>> On Jun 25, 2012, at 4:18 PM, TAY wee-beng wrote:
>>>>
>>>>> On 25/6/2012 3:40 PM, Jed Brown wrote:
>>>>>> On Mon, Jun 25, 2012 at 12:25 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>>> Hi,
>>>>>>
>>>>>> I'm trying to use DMDA to solve my linear equation.
>>>>>>
>>>>>> It works fine with 1 CPU. However, problems arise when cpu > 1: I get a segmentation error.
>>>>>>
>>>>>> I tried to use MatView and PetscViewerASCIIOpen to view my matrix to check if it's correct.
>>>> It only makes sense to view the matrix in this case for small matrices. You should check the matrix for a very small problem; how are you going to compare the ASCII output to see if it is correct for a huge matrix? Always debug with very small data sets.
>>>>
>>>> Barry
>>> I just want to get a quick view of the matrix. I have the correct matrix generated without using DM, so I use a file-comparison tool to check the difference, which is rather fast. I just wonder why it takes so long to view the matrix on screen or to write it to a file. I guess it should be easy to just view the matrix and spot the error.
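>>> If both versions are written out in PETSc binary format, they could also be loaded back with MatLoad and compared directly with MatEqual instead of diffing text files; a rough, untested sketch (names are illustrative, declarations omitted):
>>>
>>> call MatCreate(MPI_COMM_WORLD,A_dm,ierr)
>>> call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A_dm.bin',FILE_MODE_READ,viewer,ierr)
>>> call MatLoad(A_dm,viewer,ierr)
>>> call PetscViewerDestroy(viewer,ierr)
>>> ! load the matrix built without DM the same way, then:
>>> call MatEqual(A_dm,A_nodm,flg,ierr)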
>>>>>> It works with 1 CPU. However, it now shows the error below when I use 2 CPUs.
>>>>>>
>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>>> [0]PETSC ERROR: Argument out of range!
>>>>>> [0]PETSC ERROR: ASCII matrix output not allowed for matrices with more than 1024 rows, use binary format instead.
>>>>>> You can override this restriction using -mat_ascii_output_large.!
>>>>>>
>>>>>> To view the matrix, my command is :
>>>>>>
>>>>>> call PetscViewerASCIIOpen(MPI_COMM_WORLD,"A_semi.txt",viewer,ierr)
>>>>>>
>>>>>> call MatView(A_semi,viewer,ierr)
>>>>>>
>>>>>> call PetscViewerDestroy(viewer,ierr)
>>>>>>
>>>>>> I tried to use -mat_ascii_output_large but then it just hangs there forever.
>>>>>>
>>>>>> It may just be very slow.
>>>>> I tried using 1 CPU and I got it in a few minutes. Should it take a long time if I use 2 CPUs? Also, I got the error:
>>>>>
>>>>> [0]PETSC ERROR: Argument out of range!
>>>>> [0]PETSC ERROR: ASCII matrix output not allowed for matrices with more than 1024 rows, use binary format instead.
>>>>>
>>>>> It doesn't happen when I use 1 CPU. Is it normal to get it when I use 2 CPUs?
>>>>>
>>>>> I also tried to use :
>>>>>
>>>>> call MatView(A_semi,PETSC_VIEWER_STDOUT_WORLD,ierr)
>>>>>
>>>>> but the same error occurs. It just hangs there when 2 CPUs are used.
>>>>>
>>>>>
>>>>>> Why do you need ASCII? Writing anything large in ASCII is hopeless. Please use binary.
>>>>>> I create the matrix using DMCreateMatrix. I don't have to use MatCreateAIJ, right?
>>>>>>
>>>>>> Also, I check the ierr value when I use MatSetValuesStencil to ensure there's no error. Btw, there's no problem with VecView.
>>>>>>
>>>>>> So how can I view the matrix?
>>>>>>
>>>>>> Thank you!
>>>>>>
>>>>>> --
>>>>>> Yours sincerely,
>>>>>>
>>>>>> TAY wee-beng
>>>>>>
>>>>>>
>>>
>
>