[petsc-users] Issue with KSPSolve when the number of CPUs are larger than four
Mohammad Gohardoust
gohardoust at gmail.com
Wed Jan 17 14:31:07 CST 2018
Hi Matt,
Thanks for your response. You were right: the right-hand side had some NaNs,
which caused the solver to return an error. Strangely, this does not happen
when the number of CPUs is less than four! I am going to search for the
source of that.
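For anyone hitting the same thing later, here is a rough sketch of the kind of
check that could be added before KSPSolve to locate NaN/Inf entries in the
assembled right-hand side on each rank (the vector name 'b' and the helper
function are hypothetical, not taken from parswms):

  #include <petscvec.h>

  /* Print the rank and global row of every NaN/Inf entry in b. */
  PetscErrorCode CheckRHSForNaN(Vec b)
  {
    const PetscScalar *a;
    PetscInt           i, n, rstart;
    PetscMPIInt        rank;
    PetscErrorCode     ierr;

    PetscFunctionBeginUser;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
    ierr = VecGetOwnershipRange(b, &rstart, NULL);CHKERRQ(ierr);
    ierr = VecGetLocalSize(b, &n);CHKERRQ(ierr);
    ierr = VecGetArrayRead(b, &a);CHKERRQ(ierr);
    for (i = 0; i < n; i++) {
      if (PetscIsInfOrNanScalar(a[i])) {
        ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
                 "[rank %d] NaN/Inf in RHS at global row %D\n",
                 rank, rstart + i);CHKERRQ(ierr);
      }
    }
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
    ierr = VecRestoreArrayRead(b, &a);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }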
Best,
Mohammad
On Wed, Jan 10, 2018 at 5:04 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Jan 10, 2018 at 4:41 PM, Mohammad R. Gohardoust <
> gohardoust at email.arizona.edu> wrote:
>
>> Hi,
>>
>> I hope you are doing well. I am Mohammad, a PhD student in environmental
>> sciences at the University of Arizona.
>>
>> I would appreciate any help that could shed some light on the issue I am
>> having: I recently added a feature to an existing parallel code called
>> 'parswms', which solves water and solute transport in soils. The code uses
>> MPI and ParMETIS for parallelization and the PETSc package
>> (KSPSolve) for solving linear systems. I have it installed on the UofA
>> HPC system. The issue is that it works well when the number of CPUs is up
>> to four, but as soon as I go beyond that, the linear solver (here
>> 'KSPCGS') stops with the error 'KSP_DIVERGED_NANORINF'.
>>
>> Would you mind please giving me some hints, suggestions or resources in
>> this regard?
>>
>
> It sounds like you might have a problem in matrix assembly. However, first
> use -ksp_type gmres instead, since
> it has nicer numerical behavior (I believe).
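> As a minimal sketch (the names 'ksp', 'b', and 'x' are placeholders for
> whatever objects parswms creates; this is not taken from its source), the
> Krylov method can be switched and the failure reported like this:
>
>   KSPConvergedReason reason;
>   ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);  /* instead of KSPCGS */
>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);     /* honors -ksp_type, -ksp_monitor, ... */
>   ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
>   ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
>   if (reason < 0) {
>     ierr = PetscPrintf(PETSC_COMM_WORLD, "KSP failed: %s\n",
>                        KSPConvergedReasons[reason]);CHKERRQ(ierr);
>   }
>
> Alternatively, once KSPSetFromOptions() is called, running with
> -ksp_type gmres -ksp_converged_reason needs no code change.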
>
> Thanks,
>
> Matt
>
>
>> Best Regards,
>> Mohammad
>>
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>