From nek5000-users at lists.mcs.anl.gov Mon Oct 3 10:47:51 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 3 Oct 2011 17:47:51 +0200 Subject: [Nek5000-users] Passive scalar equation Message-ID: Dear all, I was having a look into the code to see how the terms of the passive scalar equation are computed. As far as I understand, for iftran=true and ifcvode=true, the term "divergence*(lambda gradient(T))" is computed in the subroutine "makeq" by calling the subroutine "wlaplacian". The subroutine "wlaplacian" calls the subroutine "axhelm". It seems to me that the subroutine "axhelm" computes "lambda(laplacian T)" instead of "divergence*(lambda gradient(T))". Of course the two expressions are the same for a constant lambda, which is not the case here. Why is it like this? The subroutine "axhelm" is quite complex, therefore I am not sure that I understood it correctly. Thanks in advance for your answer. Regards, Andreas. From nek5000-users at lists.mcs.anl.gov Mon Oct 3 11:21:25 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 3 Oct 2011 18:21:25 +0200 Subject: [Nek5000-users] Passive scalar equation In-Reply-To: References: Message-ID: Hi Andreas, Unless you have a good reason (highly coupled and/or stiff scalar equation system) you don't need to use CVODE to solve a passive scalar equation with Nek. Yes, the discrete weak diffusion term (multiplied by -1) is computed within axhelm (helm1). The implementation does not assume that lambda is constant. Just work out the weak formulation of the diffusion term and you'll see what I mean. Cheers, Stefan On 10/3/11, nek5000-users at lists.mcs.anl.gov wrote: > Dear all, > > I was having a look into the code to see how the terms of the passive scalar > equation are computed. > > As far as I understand, for iftran=true and ifcvode=true, the term > "divergence*(lambda gradient(T))" is computed in the subroutine "makeq" by > calling the subroutine "wlaplacian". The subroutine "wlaplacian" calls the > subroutine "axhelm". It seems to me that the subroutine "axhelm" computes > "lambda(laplacian T)" instead of "divergence*(lambda gradient(T))". Of > course the two expressions are the same for a constant lambda, which is not > the case here. Why is it like this? > The subroutine "axhelm" is quite complex, therefore I am not sure that I > understood it correctly. > > Thanks in advance for your answer. > > Regards, > Andreas. > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Mon Oct 3 11:26:19 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 3 Oct 2011 16:26:19 +0000 Subject: [Nek5000-users] Passive scalar equation In-Reply-To: References: , Message-ID: Thanks a lot Stefan. Cheers, Andreas. ________________________________________ From: nek5000-users-bounces at lists.mcs.anl.gov [nek5000-users-bounces at lists.mcs.anl.gov] on behalf of nek5000-users at lists.mcs.anl.gov [nek5000-users at lists.mcs.anl.gov] Sent: Monday, October 03, 2011 6:21 PM To: nek5000-users at lists.mcs.anl.gov Subject: Re: [Nek5000-users] Passive scalar equation Hi Andreas, Unless you have a good reason (highly coupled and/or stiff scalar equation system) you don't need to use CVODE to solve a passive scalar equation with Nek.
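As a brief aside, spelling out the weak formulation referred to here: multiplying div(lambda grad T) by a test function v and integrating by parts gives, up to boundary terms,

    \int_\Omega v \, \nabla\cdot(\lambda \nabla T) \, d\Omega \;=\; -\int_\Omega \lambda \, \nabla v \cdot \nabla T \, d\Omega ,

so the assembled operator only ever needs lambda pointwise at the quadrature points; nothing differentiates lambda, and no constant-lambda assumption enters. The strong form lambda*(laplacian T) is recovered only in the special case of constant lambda. (This is the standard identity written out as a sketch, not a transcription of the axhelm code.)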
Yes, the discrete weak diffusion term (multiplied by -1) is computed within axhelm (helm1). The implementation does not assume that lambda is constant. Just work out the weak formulation of the diffusion term and you'll see what I mean. Cheers, Stefan On 10/3/11, nek5000-users at lists.mcs.anl.gov wrote: > Dear all, > > I was having a look into the code to see how the terms of the passive scalar > equation are computed. > > As far I understand, for iftran=true and ifcvode=true, the term > "divergence*(lambda gradient(T))" is computed into the subroutine "makeq" by > calling the subroutine "wlaplacian". The subroutine "wlaplacian" calls the > subroutine "axhelm". It seems to me that the subroutine "axhelm" computes > "lambda(laplacian T)" instead of "divergence*(lambda gradient(T))". Of > course the two expressions are the same for a constant lambda, which is not > my case. Why is it like this? > The subroutine "axhelm" is quite complex, therefore I am not sure that I > understood it correctly. > > Thanks in advance for your answer. > > Regards, > Andreas. > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 4 18:08:03 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 4 Oct 2011 18:08:03 -0500 Subject: [Nek5000-users] Modifying an initial field Message-ID: Hi neks, I am loading an initial condition from a .fld file, but I would like to modify the velocity field before using it as my initial condition. (I would like to superpose a shear onto the existing field.) Can someone tell me what part of the .usr file this should be done in? Best, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 4 18:21:29 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 4 Oct 2011 16:21:29 -0700 Subject: [Nek5000-users] Modifying an initial field Message-ID: <613558249797508601@unknownmsgid> Hi David, The best place would probably be in userchk. You could have a block like if (istep.eq.0) then C........do stuff endif If you are doing something else in userchk that would depend on your modification of the field quantities, then you would want this to come first. Just in case, the velocity fields are accessed like vx(i,j,k,e) ( for u) and likewise for vy and vz, where i,j,k go from 1 to nx1 (ny1, nz1) and e goes from 1 to nelv. Sorry if that was more information than you needed, just wanted to make sure all my bases were covered :-). Does this help? Josh Camp "All that is necessary for the triumph of evil is that good men do nothing" -- Edmund Burke ------------------------------ From: nek5000-users at lists.mcs.anl.gov Sent: 10/4/2011 6:08 PM To: nek5000-users at lists.mcs.anl.gov Subject: [Nek5000-users] Modifying an initial field Hi neks, I am loading an initial condition from a .fld file, but I would like to modify the velocity field before using it as my initial condition. (I would like to superpose a shear onto the existing field.) Can someone tell me what part of the .usr file this should be done in? Best, David -------------- next part -------------- An HTML attachment was scrubbed... 
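As a concrete (and purely illustrative) follow-up to Josh's suggestion: the sketch below assumes the usual SIZE/TOTAL includes of a .usr file and the standard mesh-coordinate array ym1; the shear profile and the amplitude amp are invented for the example and are not taken from David's actual setup.

      subroutine userchk
      include 'SIZE'
      include 'TOTAL'
      integer i,j,k,e
      real amp
      amp = 0.1                        ! hypothetical shear amplitude
      if (istep.eq.0) then             ! first call only, i.e. right after
         do e=1,nelv                   ! the restart field has been read
            do k=1,nz1
            do j=1,ny1
            do i=1,nx1
c              superpose a simple shear u -> u + amp*y on the loaded field
               vx(i,j,k,e) = vx(i,j,k,e) + amp*ym1(i,j,k,e)
            enddo
            enddo
            enddo
         enddo
      endif
      return
      end

David confirms in the next message that modifying the field in the first userchk call worked.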
From nek5000-users at lists.mcs.anl.gov Wed Oct 5 15:20:01 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Wed, 5 Oct 2011 15:20:01 -0500 Subject: [Nek5000-users] Modifying an initial field In-Reply-To: <613558249797508601@unknownmsgid> References: <613558249797508601@unknownmsgid> Message-ID: Hi Josh, Yes, changing the initial field in the first call of userchk worked, thanks. David On Tue, Oct 4, 2011 at 6:21 PM, wrote: > Hi David, > > The best place would probably be in userchk. > > You could have a block like > > if (istep.eq.0) then > > C........do stuff > > endif > > If you are doing something else in userchk that would depend on your > modification of the field quantities, then you would want this to come > first. > > Just in case, the velocity fields are accessed like > > vx(i,j,k,e) (for u) > > and likewise for vy and vz, where i,j,k go from 1 to nx1 (ny1, nz1) and e > goes from 1 to nelv. Sorry if that was more information than you needed, > just wanted to make sure all my bases were covered :-). > > Does this help? > > Josh Camp > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Thu Oct 6 13:58:29 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 06 Oct 2011 14:58:29 -0400 Subject: [Nek5000-users] Pn-Pn boundary condition Message-ID: <4E8DFA55.1020006@vt.edu> Hi all, I am wondering if there is any reference on the pressure boundary condition for the Pn-Pn case? I know Dr. Fischer has a 1997 JCP paper discussing the Pn-Pn-2 setting. Also, which subroutines/functions handle the boundary conditions for both velocity and pressure? Thank you, Zhu Wang Virginia Tech ---------------------------- Dear Fred, The pressure bcs are chosen to ensure div.U=0. For the Pn-Pn case, this involves addition of an appropriate inhomogeneity that cancels the b.l. divergence that arises from std. splitting formulations. For the Pn-Pn-2 case, there are no pressure bcs, only velocity bcs. Paul ---------------------------------- On Thu, 24 Jun 2010, nek5000-users at lists.mcs.anl.gov wrote: > Dear NEKs, > > I wanted to ask which boundary condition is used for the pressure in nek?? > are all pressure gradients normal to wall set to zero?? because in my > current simulation i have a pressure gradient normal to a wall and i am > wondering why!? > > best, fred --------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Thu Oct 6 14:45:51 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 06 Oct 2011 15:45:51 -0400 Subject: [Nek5000-users] how to implement integration like (u-x, p_x) and (p_x, p_x) in Pn-Pn-2 setting Message-ID: <4E8E056F.3030800@vt.edu> Hi NEKs, I am trying to evaluate integrations involving velocity /u/ and pressure /p/ in "userchk". But I am not sure which variables contain the right weights (mass) for integration in the Pn-Pn-2 setting. For example, for (/u_x, u_x/), I can use "bm1". However, for (/u_x, p_x/), what shall I use? And likewise for (/p_x, p_x/)? Thank you, -- *Zhu Wang* Department of Mathematics Virginia Tech Web: http://www.math.vt.edu/people/wangzhu/
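Before the replies below: one concrete way to assemble a mixed velocity-pressure integral in the Pn-Pn-2 setting, following the mappr/glsc3 suggestion that Paul and Stefan give further down, is sketched here. The scratch arrays wk1/wk2 merely stand in for the work arrays Paul mentions, and the snippet is an illustration of the idea rather than verified code (usual SIZE/TOTAL includes assumed, placed inside userchk):

      real pm1(lx1,ly1,lz1,lelv)       ! pressure mapped onto mesh 1
      real wk1(lx1,ly1,lz1,lelv)       ! scratch arrays for the mapping
      real wk2(lx1,ly1,lz1,lelv)
      real glsc3, val
      integer n
c     map pr (mesh 2) onto the velocity mesh, then form the mass-weighted
c     global inner product  val = sum_i vx_i*bm1_i*pm1_i, i.e. the integral
c     of u*p (here with the x-velocity component vx)
      call mappr(pm1,pr,wk1,wk2)
      n   = nx1*ny1*nz1*nelv
      val = glsc3(vx,bm1,pm1,n)

For a product of two pressure-mesh quantities, the first reply below simply points to bm2, presumably the mesh-2 analogue of bm1.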
URL: From nek5000-users at lists.mcs.anl.gov Thu Oct 6 14:48:06 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 6 Oct 2011 14:48:06 -0500 (CDT) Subject: [Nek5000-users] how to implement integration like (u-x, p_x) and (p_x, p_x) in Pn-Pn-2 setting In-Reply-To: <4E8E056F.3030800@vt.edu> References: <4E8E056F.3030800@vt.edu> Message-ID: bm2 :) On Thu, 6 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > > Hi NEKs, > > I am trying to evaluate integrations involving velocity /u/ and pressure /p/ > in "userchk". But I am not sure what variables containing the right weights > (mass) for integration in the Pn-Pn-2 setting. For example, for (/u_x, u_x/), > I can use "bm1". However, for (/u_x, p_x/), what I shall use? so does (/p_x, > p_x/)? > > Thank you, > -- > *Zhu Wang* > Department of Mathematics > Virginia Tech > Web: http://www.math.vt.edu/people/wangzhu/ > > > From nek5000-users at lists.mcs.anl.gov Thu Oct 6 14:52:44 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 6 Oct 2011 14:52:44 -0500 (CDT) Subject: [Nek5000-users] how to implement integration like (u-x, p_x) and (p_x, p_x) in Pn-Pn-2 setting In-Reply-To: <4E8E056F.3030800@vt.edu> References: <4E8E056F.3030800@vt.edu> Message-ID: Sorry - for u*p, you need to do something different... I suggest map pr to pm1 (pressure on mesh 1), then: n = nx1*ny1*nz1*nelv val = glsc3(vx,bm1,pm1,n) would give you what you want. In the code, I find that you can map pr to pm1 as: c Map pressure onto mesh 1 (vxx and vyy are used as work arrays) call mappr(pm1,pr,vxx,vyy) --- Paul From nek5000-users at lists.mcs.anl.gov Thu Oct 6 15:27:35 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 6 Oct 2011 15:27:35 -0500 Subject: [Nek5000-users] Pn-Pn boundary condition In-Reply-To: <4E8DFA55.1020006@vt.edu> References: <4E8DFA55.1020006@vt.edu> Message-ID: Zhu, As far as I understand it, boundary conditions for pressure aren't exactly SET, more so as they are computed so that div u = 0 (as Paul mentions in the note you included below). So, as far as I know, there is no method to set, say, an inlet pressure with Nekton. Is this correct developers? (it's something I've always wondered myself) Josh On Thu, Oct 6, 2011 at 1:58 PM, wrote: > Hi all, > > I am wondering if there is any reference on the pressure boundary condition > for Pn-Pn case? I know Dr. Fisher has a 1997 JCP paper discussing the > Pn-Pn-2 setting. > > Also what subroutines/functions handle with the boundary conditions for both > velocity and pressure? Thank you, > > Zhu Wang > Virginia Tech > > > ---------------------------- > Dear Fred, > > The pressure bcs are chosen to ensure div.U=0. > > For the Pn-Pn case, this involves addition of an appropriate inhomogeneity > that cancels the b.l. divergence that arises from std. splitting > formulations. > > For the Pn-Pn-2 case, there are no pressre bcs, only velocity bcs. > > Paul > > ---------------------------------- > On Thu, 24 Jun 2010, nek5000-users at lists.mcs.anl.gov wrote: > >> Dear NEKs, >> >> I wanted to ask which boundary condition is used for the pressure in nek?? >> are all pressure gradients normal to wall set tto zero?? because in my >> current simulation i have a pressure gradient normal to a wall and i am >> wondering why!? 
>> >> best, fred > --------------------------- > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > > -- Josh Camp "All that is necessary for the triumph of evil is that good men do nothing" -- Edmund Burke From nek5000-users at lists.mcs.anl.gov Thu Oct 6 15:30:34 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 6 Oct 2011 15:30:34 -0500 (CDT) Subject: [Nek5000-users] Pn-Pn boundary condition In-Reply-To: References: <4E8DFA55.1020006@vt.edu> Message-ID: Josh is correct. For Pn-Pn-2, you can set the outflow pressure in the particular case where there is more than one exit. (I've only done this once in my life and would have to dig in a bit to recall how to do this..., but would be happy to do so if that's of value.) Paul On Thu, 6 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Zhu, > > As far as I understand it, boundary conditions for pressure aren't > exactly SET, more so as they are computed so that div u = 0 (as Paul > mentions in the note you included below). So, as far as I know, there > is no method to set, say, an inlet pressure with Nekton. > > Is this correct developers? (it's something I've always wondered myself) > > Josh > > On Thu, Oct 6, 2011 at 1:58 PM, wrote: >> Hi all, >> >> I am wondering if there is any reference on the pressure boundary condition >> for Pn-Pn case? I know Dr. Fisher has a 1997 JCP paper discussing the >> Pn-Pn-2 setting. >> >> Also what subroutines/functions handle with the boundary conditions for both >> velocity and pressure? Thank you, >> >> Zhu Wang >> Virginia Tech >> >> >> ---------------------------- >> Dear Fred, >> >> The pressure bcs are chosen to ensure div.U=0. >> >> For the Pn-Pn case, this involves addition of an appropriate inhomogeneity >> that cancels the b.l. divergence that arises from std. splitting >> formulations. >> >> For the Pn-Pn-2 case, there are no pressre bcs, only velocity bcs. >> >> Paul >> >> ---------------------------------- >> On Thu, 24 Jun 2010, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Dear NEKs, >>> >>> I wanted to ask which boundary condition is used for the pressure in nek?? >>> are all pressure gradients normal to wall set tto zero?? because in my >>> current simulation i have a pressure gradient normal to a wall and i am >>> wondering why!? >>> >>> best, fred >> --------------------------- >> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >> > > > > -- > Josh Camp > > "All that is necessary for the triumph of evil is that good men do > nothing" -- Edmund Burke > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Sat Oct 8 03:34:56 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sat, 8 Oct 2011 10:34:56 +0200 Subject: [Nek5000-users] Pn-Pn boundary condition In-Reply-To: References: <4E8DFA55.1020006@vt.edu> Message-ID: Hi all, The Pn-Pn solver is implemented in plan4.f. Here are some useful references: S. A. Orszag, M. Israeli, and M. O. Deville. Boundary Conditions for Incompressible Flows. Journal of Scientific Computing, 1:75-111, 1986. A.G. Tomboulides, J.C.Y. Lee, and S.A. Orszag.
Numerical Simulation of Low Mach Number Reactive Flows. Journal of Scientific Computing, 12:139-167, 1997. J. L. Guermond. Nonstandard nonconforming approximation of the Stokes Problem, I: periodic boundary conditions. International Journal of Numerical Analysis and Modeling, 2:345-354, 2005. I am happy to take your questions anytime. Cheers Stefan On 10/6/11, nek5000-users at lists.mcs.anl.gov wrote: > > Josh is correct. > > For Pn-Pn-2, you can set the outflow pressure in the particular case > where there is more than one exit. (I've only done this once in my > life and would have to dig in a bit to recall out to do this..., but > would be happy to do so if that's of value.) > > Paul > > > > On Thu, 6 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Zhu, >> >> As far as I understand it, boundary conditions for pressure aren't >> exactly SET, more so as they are computed so that div u = 0 (as Paul >> mentions in the note you included below). So, as far as I know, there >> is no method to set, say, an inlet pressure with Nekton. >> >> Is this correct developers? (it's something I've always wondered myself) >> >> Josh >> >> On Thu, Oct 6, 2011 at 1:58 PM, wrote: >>> Hi all, >>> >>> I am wondering if there is any reference on the pressure boundary >>> condition >>> for Pn-Pn case? I know Dr. Fisher has a 1997 JCP paper discussing the >>> Pn-Pn-2 setting. >>> >>> Also what subroutines/functions handle with the boundary conditions for >>> both >>> velocity and pressure? Thank you, >>> >>> Zhu Wang >>> Virginia Tech >>> >>> >>> ---------------------------- >>> Dear Fred, >>> >>> The pressure bcs are chosen to ensure div.U=0. >>> >>> For the Pn-Pn case, this involves addition of an appropriate >>> inhomogeneity >>> that cancels the b.l. divergence that arises from std. splitting >>> formulations. >>> >>> For the Pn-Pn-2 case, there are no pressre bcs, only velocity bcs. >>> >>> Paul >>> >>> ---------------------------------- >>> On Thu, 24 Jun 2010, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> Dear NEKs, >>>> >>>> I wanted to ask which boundary condition is used for the pressure in >>>> nek?? >>>> are all pressure gradients normal to wall set tto zero?? because in my >>>> current simulation i have a pressure gradient normal to a wall and i am >>>> wondering why!? >>>> >>>> best, fred >>> --------------------------- >>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >>> >> >> >> >> -- >> Josh Camp >> >> "All that is necessary for the triumph of evil is that good men do >> nothing" -- Edmund Burke >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Sat Oct 8 14:38:46 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 9 Oct 2011 01:08:46 +0530 Subject: [Nek5000-users] NEK gets stuck Message-ID: Dear Nek devs, I'm having trouble with a run involving a box with dimensions 5:5:1 (length:breadth:height), for which I would like to go to a Rayleigh number of 10^8.
I generated the grid (.rea and .map) using the attached .box file, but the run gets stuck at the point shown in the attached log. What could be the reason that NEK is stuck at that point? I suspect it has something to do with the grid or the .map file. Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: high_ray.box Type: application/octet-stream Size: 4065 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: high_ray.rea Type: application/octet-stream Size: 4880 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: log Type: application/octet-stream Size: 65562 bytes Desc: not available URL: From nek5000-users at lists.mcs.anl.gov Sun Oct 9 01:09:44 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 9 Oct 2011 08:09:44 +0200 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Mani, What's your memory footprint? Any chance that you run out of memory? Did you try to run the same setup on a power of two processors? What happens if you disable the multigrid solver? Cheers, Stefan On 10/8/11, nek5000-users at lists.mcs.anl.gov wrote: > Dear Nek devs, > > I'm having trouble with a run involving a box with dimensions 5:5:1 > (length:breadth:height), for which I would like to go to a Rayleigh number > of 10^8. I generated the grid (.rea and .map) using the attached .box file, > but the run gets stuck at the point shown in the attached log. > > What could be the reason that NEK is stuck at that point? I suspect it has > something to do with the grid or the .map file. > > Thanks, > Mani > From nek5000-users at lists.mcs.anl.gov Sun Oct 9 05:48:07 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 9 Oct 2011 16:18:07 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, I'm sorry, but the machine is down at the moment, so I can't get the info you're asking for. It should be up in a couple of days, and then I'll post the info. Thanks for the reply. Regards, Mani On Sun, Oct 9, 2011 at 11:39 AM, wrote: > Hi Mani, > > What's your memory footprint? Any chance that you run out of memory? > Did you try to run the same setup on a power of two processors? What > happens if you disable the multigrid solver? > > Cheers, > Stefan > > On 10/8/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Dear Nek devs, > > > > I'm having trouble with a run involving a box with dimensions 5:5:1 > > (length:breadth:height), for which I would like to go to a Rayleigh > number > > of 10^8. I generated the grid (.rea and .map) using the attached .box > file, > > but the run gets stuck at the point shown in the attached log. > > > > What could be the reason that NEK is stuck at that point? I suspect it > has > > something to do with the grid or the .map file. > > > > Thanks, > > Mani > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nek5000-users at lists.mcs.anl.gov Thu Oct 6 12:48:54 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 06 Oct 2011 13:48:54 -0400 Subject: [Nek5000-users] how to implement integration like (u-x, p_x) and (p_x, p_x) in Pn-Pn-2 setting Message-ID: <4E8DEA06.1030109@vt.edu> Hi NEKs, I am trying to evaluate integrations involving velocity /u/ and pressure /p/ in "userchk". But I am not sure what variables containing the right weights (mass) for integration in the Pn-Pn-2 setting. For example, for (/u_x, u_x/), I can use "bm1". However, for (/u_x, p_x/), what I shall use? so does (/p_x, p_x/)? Thank you, -- *Zhu Wang* Department of Mathematics Virginia Tech Web: http://www.math.vt.edu/people/wangzhu/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 11:42:53 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 18:42:53 +0200 Subject: [Nek5000-users] how to implement integration like (u-x, p_x) and (p_x, p_x) in Pn-Pn-2 setting In-Reply-To: <4E8DEA06.1030109@vt.edu> References: <4E8DEA06.1030109@vt.edu> Message-ID: Hi, One option is to interpolate (map) the pressure onto the velocity nodes and use the mass matrix BM1. Cheers, Stefan On 10/6/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi NEKs, > > I am trying to evaluate integrations involving velocity /u/ and pressure > /p/ in "userchk". But I am not sure what variables containing the right > weights (mass) for integration in the Pn-Pn-2 setting. For example, for > (/u_x, u_x/), I can use "bm1". However, for (/u_x, p_x/), what I shall > use? so does (/p_x, p_x/)? > > Thank you, > -- > *Zhu Wang* > Department of Mathematics > Virginia Tech > Web: http://www.math.vt.edu/people/wangzhu/ > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 12:40:06 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 23:10:06 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, The machine is up now and I checked for the memory usage. The RAM usage is well within the installed memory. The same problem occurs even with power of two processors. How does one toggle the multigrid solver? Thanks, Mani > > On Sun, Oct 9, 2011 at 11:39 AM, wrote: > >> Hi Mani, >> >> What's your memory footprint? Any chance that you run out of memory? >> Did you try to run the same setup on a power of two processors? What >> happens if you disable the multigrid solver? >> >> Cheers, >> Stefan >> >> On 10/8/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Dear Nek devs, >> > >> > I'm having trouble with a run involving a box with dimensions 5:5:1 >> > (length:breadth:height), for which I would like to go to a Rayleigh >> number >> > of 10^8. I generated the grid (.rea and .map) using the attached .box >> file, >> > but the run gets stuck at the point shown in the attached log. >> > >> > What could be the reason that NEK is stuck at that point? I suspect it >> has >> > something to do with the grid or the .map file. >> > >> > Thanks, >> > Mani >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:02:45 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:02:45 +0200 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: I think you need to set p43=1. Just to be sure: What's the output of 'size nek5000' and how much memory per core do you have? On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > The machine is up now and I checked for the memory usage. The RAM usage is > well within the installed memory. The same problem occurs even with power of > two processors. > > How does one toggle the multigrid solver? > > Thanks, > Mani > > >> >> On Sun, Oct 9, 2011 at 11:39 AM, wrote: >> >>> Hi Mani, >>> >>> What's your memory footprint? Any chance that you run out of memory? >>> Did you try to run the same setup on a power of two processors? What >>> happens if you disable the multigrid solver? >>> >>> Cheers, >>> Stefan >>> >>> On 10/8/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>> > Dear Nek devs, >>> > >>> > I'm having trouble with a run involving a box with dimensions 5:5:1 >>> > (length:breadth:height), for which I would like to go to a Rayleigh >>> number >>> > of 10^8. I generated the grid (.rea and .map) using the attached .box >>> file, >>> > but the run gets stuck at the point shown in the attached log. >>> > >>> > What could be the reason that NEK is stuck at that point? I suspect it >>> has >>> > something to do with the grid or the .map file. >>> > >>> > Thanks, >>> > Mani >>> > >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:04:54 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 13:04:54 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries Message-ID: Hi neks, I would like to employ periodic boundary conditions on the sides of a slice of an annulus. I have constructed this domain by deforming the domain [0,1]x[0,1] in userdat2. Setting the initial domain up with periodic sides does not work. I'm guessing this is because nek is trying to match x- and y-velocities at the boundaries, when really I need to match velocities in the normal and parallel directions. Is the P boundary condition unable to handle matching non-parallel boundaries, and if so is there a work-around? If this makes it easier, I can settle for having only 1/4, 1/2 and 3/4 slices, so the BCs amount to mappings of +/-u,v to +/- u,v. Thanks! David -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:13:13 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:13:13 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Hi David, What you need is a cyclic, not a periodic, BC. They are similar but not the same. There is one important assumption in the current implementation: the cyclic boundary needs to be planar, with its normal vector orthogonal to the Z-axis. Then, just set IFCYCLIC to true in the logical switches of your rea-file and give it a whirl.
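As an aside, a minimal sketch of the kind of deformation described above (mapping the unit square onto an annular slice) is given below. The inner/outer radii and the opening angle are made-up values, the routine in the standard .usr template is usrdat2 (spelled userdat2 above), and xm1/ym1 are assumed to be the usual mesh-1 coordinate arrays:

      subroutine usrdat2
      include 'SIZE'
      include 'TOTAL'
      integer i,n
      real r0,r1,theta0,rr,th
      r0     = 1.0                     ! inner radius     (illustrative)
      r1     = 2.0                     ! outer radius     (illustrative)
      theta0 = 0.5*3.14159265          ! quarter annulus  (illustrative)
      n = nx1*ny1*nz1*nelt
      do i=1,n
         rr = r0 + (r1-r0)*ym1(i,1,1,1)   ! y in [0,1]  ->  radius
         th = theta0*xm1(i,1,1,1)         ! x in [0,1]  ->  angle
         xm1(i,1,1,1) = rr*cos(th)
         ym1(i,1,1,1) = rr*sin(th)
      enddo
      return
      end

As noted above, the cyclic coupling additionally needs IFCYCLIC set to true among the logical switches of the .rea file (e.g. a line of the form 'T IFCYCLIC').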
Cheers, Stefan On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi neks, > > I would like to employ periodic boundary conditions on the sides of a slice > of an annulus. I have constructed this domain by deforming the domain > [0,1]x[0,1] in userdat2. Setting the initial domain up with periodic sides > does not work. I`m guessing this is because nek is trying to mathc x- and > y-velocities at the boundaries, when really I need to match velocities in > the normal and parallel directions. > > Is the P boundary condition unable to handle matching non-parallel > boundaries, and if so is there a work-around? > > If this makes it easier, I can settle for having only 1/4, 1/2 and 3/4 > slices, so the BC`s amount to mappings of +/-u,v to +/- u,v. > > Thanks! > > David > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:17:12 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 13:17:12 -0500 (CDT) Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Yes -- I think it also needs to be Pn-Pn ? (Is this correct Stefan?) Thanks! Paul On Mon, 10 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi David, > > What you need is a cyclic not periodic BC. They are similar but not > the same. There is one important assumption in the current > implementation: The normal vector of a cyclic boundary needs to be > plane and orthogonal to the Z-axis. Then, just set IFCYCLIC to true in > the logical switches of your rea-file and give it a whirl. > > Cheers, > Stefan > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > wrote: >> Hi neks, >> >> I would like to employ periodic boundary conditions on the sides of a slice >> of an annulus. I have constructed this domain by deforming the domain >> [0,1]x[0,1] in userdat2. Setting the initial domain up with periodic sides >> does not work. I`m guessing this is because nek is trying to mathc x- and >> y-velocities at the boundaries, when really I need to match velocities in >> the normal and parallel directions. >> >> Is the P boundary condition unable to handle matching non-parallel >> boundaries, and if so is there a work-around? >> >> If this makes it easier, I can settle for having only 1/4, 1/2 and 3/4 >> slices, so the BC`s amount to mappings of +/-u,v to +/- u,v. >> >> Thanks! >> >> David >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:23:39 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 23:53:39 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, The following is the output of 'size nek5000' text data bss dec hex filename 5163006 59896 333337824 338560726 142e06d6 nek5000 In the .rea file, p43 has been set to 0. Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:23:57 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:23:57 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: >From a quick scan of the code it looks like it would work for Pn/Pn-2 too. Not sure though. Is there anybody out there not using Pn/Pn? 
Just kidding :) Cheers Stefan On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > > Yes -- I think it also needs to be Pn-Pn ? (Is this correct Stefan?) > > Thanks! > > Paul > > > On Mon, 10 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi David, >> >> What you need is a cyclic not periodic BC. They are similar but not >> the same. There is one important assumption in the current >> implementation: The normal vector of a cyclic boundary needs to be >> plane and orthogonal to the Z-axis. Then, just set IFCYCLIC to true in >> the logical switches of your rea-file and give it a whirl. >> >> Cheers, >> Stefan >> >> On 10/10/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> Hi neks, >>> >>> I would like to employ periodic boundary conditions on the sides of a >>> slice >>> of an annulus. I have constructed this domain by deforming the domain >>> [0,1]x[0,1] in userdat2. Setting the initial domain up with periodic >>> sides >>> does not work. I`m guessing this is because nek is trying to mathc x- >>> and >>> y-velocities at the boundaries, when really I need to match velocities in >>> the normal and parallel directions. >>> >>> Is the P boundary condition unable to handle matching non-parallel >>> boundaries, and if so is there a work-around? >>> >>> If this makes it easier, I can settle for having only 1/4, 1/2 and 3/4 >>> slices, so the BC`s amount to mappings of +/-u,v to +/- u,v. >>> >>> Thanks! >>> >>> David >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:26:39 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:26:39 +0200 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: What's the memory size per core? Sure p43=0 is correct if you want to use the multilevel Schwarz solver. Just as a cross check: set p43=1 and try again. On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > The following is the output of 'size nek5000' > > text data bss dec hex filename > 5163006 59896 333337824 338560726 142e06d6 nek5000 > > > In the .rea file, p43 has been set to 0. > > Mani > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:31:48 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 13:31:48 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Hi Stephan, I see, thanks. I can`t find any documentation about cyclic BC`s. What is their symbol for the box file? Also, I don`t see a logical switch for IFCYCLIC. Do I just add another line? Also, I have stress-free BC on curved boundaries. A series of error messages led me to believe that I had to use the stress formulation, and that this required me to use Pn/Pn-2. Is this not the case? Best, David -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:38:52 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:38:52 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: You are right, it's an undocumented AND experimental feature. Just flag the boundaries as periodic. Yes, just add another IFCYCLIC line. Currently, there is no stress formulation support for Pn/Pn. -Stefan On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stephan, > > I see, thanks. I can`t find any documentation about cyclic BC`s. What is > their symbol for the box file? Also, I don`t see a logical switch for > IFCYCLIC. Do I just add another line? > > Also, I have stress-free BC on curved boundaries. A series of error > messages led me to believe that I had to use the stress formulation, and > that this required me to use Pn/Pn-2. Is this not the case? > > Best, > > David > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:41:25 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 20:41:25 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: btw: I am afraid, the stress formulation doesn't support cyclic BCs at the moment! -Stefan On 10/10/11, S K wrote: > You are right, it's an undocumented AND experimental feature. Just > flag the boundaries as periodic. Yes, just add another IFCYCLIC line. > Currently, there is no stress formulation support for Pn/Pn. > > -Stefan > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > wrote: >> Hi Stephan, >> >> I see, thanks. I can`t find any documentation about cyclic BC`s. What >> is >> their symbol for the box file? Also, I don`t see a logical switch for >> IFCYCLIC. Do I just add another line? >> >> Also, I have stress-free BC on curved boundaries. A series of error >> messages led me to believe that I had to use the stress formulation, and >> that this required me to use Pn/Pn-2. Is this not the case? >> >> Best, >> >> David >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 13:57:34 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 13:57:34 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Well then it seems I am stuck. Might a work-around be possible for a 1/4 or 1/2 annulus? For instance with 1/2, I would just need to match (u,v) to (-u,-v). David On Mon, Oct 10, 2011 at 1:41 PM, wrote: > btw: I am afraid, the stress formulation doesn't support cyclic BCs at > the moment! > -Stefan > > On 10/10/11, S K wrote: > > You are right, it's an undocumented AND experimental feature. Just > > flag the boundaries as periodic. Yes, just add another IFCYCLIC line. > > Currently, there is no stress formulation support for Pn/Pn. > > > > -Stefan > > > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > > wrote: > >> Hi Stephan, > >> > >> I see, thanks. I can`t find any documentation about cyclic BC`s. What > >> is > >> their symbol for the box file? Also, I don`t see a logical switch for > >> IFCYCLIC. Do I just add another line? > >> > >> Also, I have stress-free BC on curved boundaries. A series of error > >> messages led me to believe that I had to use the stress formulation, and > >> that this required me to use Pn/Pn-2. Is this not the case? 
> >> > >> Best, > >> > >> David > >> > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 14:12:00 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 21:12:00 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Let me check what we need to change to make it happen. -Stefan On 10/10/11, nek5000-users at lists.mcs.anl.gov wrote: > Well then it seems I am stuck. Might a work-around be possible for a 1/4 or > 1/2 annulus? For instance with 1/2, I would just need to match (u,v) to > (-u,-v). > > David > > On Mon, Oct 10, 2011 at 1:41 PM, wrote: > >> btw: I am afraid, the stress formulation doesn't support cyclic BCs at >> the moment! >> -Stefan >> >> On 10/10/11, S K wrote: >> > You are right, it's an undocumented AND experimental feature. Just >> > flag the boundaries as periodic. Yes, just add another IFCYCLIC line. >> > Currently, there is no stress formulation support for Pn/Pn. >> > >> > -Stefan >> > >> > On 10/10/11, nek5000-users at lists.mcs.anl.gov >> > wrote: >> >> Hi Stephan, >> >> >> >> I see, thanks. I can`t find any documentation about cyclic BC`s. What >> >> is >> >> their symbol for the box file? Also, I don`t see a logical switch for >> >> IFCYCLIC. Do I just add another line? >> >> >> >> Also, I have stress-free BC on curved boundaries. A series of error >> >> messages led me to believe that I had to use the stress formulation, >> >> and >> >> that this required me to use Pn/Pn-2. Is this not the case? >> >> >> >> Best, >> >> >> >> David >> >> >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 14:20:02 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 21:20:02 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Hmm, just had a look at the code. It looks to me that it might work out of the box. Why don't you give it a shot. Cheers Sefan On 10/10/11, S K wrote: > Let me check what we need to change to make it happen. > -Stefan > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > wrote: >> Well then it seems I am stuck. Might a work-around be possible for a 1/4 >> or >> 1/2 annulus? For instance with 1/2, I would just need to match (u,v) to >> (-u,-v). >> >> David >> >> On Mon, Oct 10, 2011 at 1:41 PM, wrote: >> >>> btw: I am afraid, the stress formulation doesn't support cyclic BCs at >>> the moment! >>> -Stefan >>> >>> On 10/10/11, S K wrote: >>> > You are right, it's an undocumented AND experimental feature. Just >>> > flag the boundaries as periodic. Yes, just add another IFCYCLIC line. >>> > Currently, there is no stress formulation support for Pn/Pn. >>> > >>> > -Stefan >>> > >>> > On 10/10/11, nek5000-users at lists.mcs.anl.gov >>> > wrote: >>> >> Hi Stephan, >>> >> >>> >> I see, thanks. I can`t find any documentation about cyclic BC`s. >>> >> What >>> >> is >>> >> their symbol for the box file? Also, I don`t see a logical switch >>> >> for >>> >> IFCYCLIC. Do I just add another line? 
>>> >> >>> >> Also, I have stress-free BC on curved boundaries. A series of error >>> >> messages led me to believe that I had to use the stress formulation, >>> >> and >>> >> that this required me to use Pn/Pn-2. Is this not the case? >>> >> >>> >> Best, >>> >> >>> >> David >>> >> >>> > >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 21:59:11 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 21:59:11 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: <539A9E28-1BF1-4189-B57D-7EAF4ED9DB6C@gmail.com> Hi Stefan, I tried the cyclic BC with Pn/Pn-2, but it didn't work. Visualizing the results, the velocities do not seem to be properly matched, and the fields quickly blow up. To make sure I was doing the cyclic BC right, I changed to no-slip and tried it with Pn/Pn, and that worked, so I think Paul may have been right about cyclic BC no working with Pn/Pn-2. Do you think there is a way around this? I unfortunately need the stress-free boundaries or something similar. Best, David On Oct 10, 2011, at 2:20 PM, nek5000-users at lists.mcs.anl.gov wrote: > Hmm, just had a look at the code. It looks to me that it might work > out of the box. Why don't you give it a shot. > > Cheers > Sefan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Mon Oct 10 22:16:02 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 22:16:02 -0500 (CDT) Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: <539A9E28-1BF1-4189-B57D-7EAF4ED9DB6C@gmail.com> References: <539A9E28-1BF1-4189-B57D-7EAF4ED9DB6C@gmail.com> Message-ID: Hi David, How much of a circular arc do you need? Is there a compelling reason to not use a full arc? Paul On Mon, 10 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > I tried the cyclic BC with Pn/Pn-2, but it didn't work. Visualizing the results, the velocities do not seem to be properly matched, and the fields quickly blow up. To make sure I was doing the cyclic BC right, I changed to no-slip and tried it with Pn/Pn, and that worked, so I think Paul may have been right about cyclic BC no working with Pn/Pn-2. Do you think there is a way around this? I unfortunately need the stress-free boundaries or something similar. > > Best, > > David > > > > > > On Oct 10, 2011, at 2:20 PM, nek5000-users at lists.mcs.anl.gov wrote: > >> Hmm, just had a look at the code. It looks to me that it might work >> out of the box. Why don't you give it a shot. >> >> Cheers >> Sefan >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 10 22:59:40 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 10 Oct 2011 22:59:40 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: <539A9E28-1BF1-4189-B57D-7EAF4ED9DB6C@gmail.com> Message-ID: <5A9B32A3-B700-4EDC-B624-6B7C9A506ADD@gmail.com> Hi Paul, I am studying a particular sort of 2D convective flow that occurs in periodic-side boxes. It is seen most easily with free-slip velocity. I would like to find solutions of this type in a full annulus also, but I have had no luck with the parameters and IC's I've tried thus far. 
So, I want to try a piece of an annulus first, where I expect to find what I want more easily. Then I would work my way up to a full annulus. As I said before, 1/4, 1/2 and 3/4 would probably suffice if that were easier to implement. But it sounds like this may be hard to do... Best, David On Oct 10, 2011, at 10:16 PM, nek5000-users at lists.mcs.anl.gov wrote: > > Hi David, > > How much of a circular arc do you need? > > Is there a compelling reason to not use a full arc? > > Paul > > > On Mon, 10 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi Stefan, >> >> I tried the cyclic BC with Pn/Pn-2, but it didn't work. Visualizing the results, the velocities do not seem to be properly matched, and the fields quickly blow up. To make sure I was doing the cyclic BC right, I changed to no-slip and tried it with Pn/Pn, and that worked, so I think Paul may have been right about cyclic BC no working with Pn/Pn-2. Do you think there is a way around this? I unfortunately need the stress-free boundaries or something similar. >> >> Best, >> >> David >> >> >> >> >> >> On Oct 10, 2011, at 2:20 PM, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Hmm, just had a look at the code. It looks to me that it might work >>> out of the box. Why don't you give it a shot. >>> >>> Cheers >>> Sefan >>> >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 11 00:51:12 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 11:21:12 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, Each node has 64 GB of RAM. There are 16 cores in each node. Each core has the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I tried running with p43=1 and it still gets stuck. The full specifications of each core are given below: vendor_id : GenuineIntel cpu family : 6 model : 15 model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz stepping : 11 cpu MHz : 2933.445 cache size : 4096 KB physical id : 6 siblings : 4 core id : 3 cpu cores : 4 fpu : yes fpu_exception : yes cpuid level : 10 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm bogomips : 5866.92 clflush size : 64 cache_alignment : 64 address sizes : 40 bits physical, 48 bits virtual Thanks, Mani On Mon, Oct 10, 2011 at 11:56 PM, wrote: > What's the memory size per core? > > Sure p43=0 is correct if you want to use the multilevel Schwarz > solver. Just as a cross check: set p43=1 and try again. > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Hi Stefan, > > > > The following is the output of 'size nek5000' > > > > text data bss dec hex filename > > 5163006 59896 333337824 338560726 142e06d6 nek5000 > > > > > > In the .rea file, p43 has been set to 0. > > > > Mani > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 08:36:33 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 15:36:33 +0200 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Doesn't sound like a memory problem given 4GB of memory per core and a total static data size of ~350MB (according to the output of size). The size of the executable doesn't matter in this case. - Can you post your logfile again (for the case where the SEMG was disabled). - What's the lowest number of processors you can reproduce the problem (try with lx1=4) -Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > Each node has 64 GB of RAM. There are 16 cores in each node. Each core has > the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I > tried running with p43=1 and it still gets stuck. The full specifications of > each core are given below: > > vendor_id : GenuineIntel > cpu family : 6 > model : 15 > model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz > stepping : 11 > cpu MHz : 2933.445 > cache size : 4096 KB > physical id : 6 > siblings : 4 > core id : 3 > cpu cores : 4 > fpu : yes > fpu_exception : yes > cpuid level : 10 > wp : yes > flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca > cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm > constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm > bogomips : 5866.92 > clflush size : 64 > cache_alignment : 64 > address sizes : 40 bits physical, 48 bits virtual > > Thanks, > Mani > > > On Mon, Oct 10, 2011 at 11:56 PM, wrote: > >> What's the memory size per core? >> >> Sure p43=0 is correct if you want to use the multilevel Schwarz >> solver. Just as a cross check: set p43=1 and try again. >> >> On 10/10/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Hi Stefan, >> > >> > The following is the output of 'size nek5000' >> > >> > text data bss dec hex filename >> > 5163006 59896 333337824 338560726 142e06d6 nek5000 >> > >> > >> > In the .rea file, p43 has been set to 0. >> > >> > Mani >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 10:36:36 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 21:06:36 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, The problem starts to appear when I use 128 processors. It works with 64 processors. Mani On Tue, Oct 11, 2011 at 7:06 PM, wrote: > Doesn't sound like a memory problem given 4GB of memory per core and a > total static data size of ~350MB (according to the output of size). > The size of the executable doesn't matter in this case. > > - Can you post your logfile again (for the case where the SEMG was > disabled). > - What's the lowest number of processors you can reproduce the problem > (try with lx1=4) > > -Stefan > > On 10/11/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Hi Stefan, > > > > Each node has 64 GB of RAM. There are 16 cores in each node. Each core > has > > the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I > > tried running with p43=1 and it still gets stuck. 
The full specifications > of > > each core are given below: > > > > vendor_id : GenuineIntel > > cpu family : 6 > > model : 15 > > model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz > > stepping : 11 > > cpu MHz : 2933.445 > > cache size : 4096 KB > > physical id : 6 > > siblings : 4 > > core id : 3 > > cpu cores : 4 > > fpu : yes > > fpu_exception : yes > > cpuid level : 10 > > wp : yes > > flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge > mca > > cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm > > constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm > > bogomips : 5866.92 > > clflush size : 64 > > cache_alignment : 64 > > address sizes : 40 bits physical, 48 bits virtual > > > > Thanks, > > Mani > > > > > > On Mon, Oct 10, 2011 at 11:56 PM, > wrote: > > > >> What's the memory size per core? > >> > >> Sure p43=0 is correct if you want to use the multilevel Schwarz > >> solver. Just as a cross check: set p43=1 and try again. > >> > >> On 10/10/11, nek5000-users at lists.mcs.anl.gov > >> wrote: > >> > Hi Stefan, > >> > > >> > The following is the output of 'size nek5000' > >> > > >> > text data bss dec hex filename > >> > 5163006 59896 333337824 338560726 142e06d6 > nek5000 > >> > > >> > > >> > In the .rea file, p43 has been set to 0. > >> > > >> > Mani > >> > > >> _______________________________________________ > >> Nek5000-users mailing list > >> Nek5000-users at lists.mcs.anl.gov > >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >> > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 10:37:32 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 21:07:32 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: I sent the logs, but they're awaiting moderator permission to be posted on the list. On Tue, Oct 11, 2011 at 9:06 PM, Mani Chandra wrote: > Hi Stefan, > > The problem starts to appear when I use 128 processors. It works with 64 > processors. > > Mani > > On Tue, Oct 11, 2011 at 7:06 PM, wrote: > >> Doesn't sound like a memory problem given 4GB of memory per core and a >> total static data size of ~350MB (according to the output of size). >> The size of the executable doesn't matter in this case. >> >> - Can you post your logfile again (for the case where the SEMG was >> disabled). >> - What's the lowest number of processors you can reproduce the problem >> (try with lx1=4) >> >> -Stefan >> >> On 10/11/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Hi Stefan, >> > >> > Each node has 64 GB of RAM. There are 16 cores in each node. Each core >> has >> > the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I >> > tried running with p43=1 and it still gets stuck. 
The full >> specifications of >> > each core are given below: >> > >> > vendor_id : GenuineIntel >> > cpu family : 6 >> > model : 15 >> > model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz >> > stepping : 11 >> > cpu MHz : 2933.445 >> > cache size : 4096 KB >> > physical id : 6 >> > siblings : 4 >> > core id : 3 >> > cpu cores : 4 >> > fpu : yes >> > fpu_exception : yes >> > cpuid level : 10 >> > wp : yes >> > flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge >> mca >> > cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm >> > constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm >> > bogomips : 5866.92 >> > clflush size : 64 >> > cache_alignment : 64 >> > address sizes : 40 bits physical, 48 bits virtual >> > >> > Thanks, >> > Mani >> > >> > >> > On Mon, Oct 10, 2011 at 11:56 PM, >> wrote: >> > >> >> What's the memory size per core? >> >> >> >> Sure p43=0 is correct if you want to use the multilevel Schwarz >> >> solver. Just as a cross check: set p43=1 and try again. >> >> >> >> On 10/10/11, nek5000-users at lists.mcs.anl.gov >> >> wrote: >> >> > Hi Stefan, >> >> > >> >> > The following is the output of 'size nek5000' >> >> > >> >> > text data bss dec hex filename >> >> > 5163006 59896 333337824 338560726 142e06d6 >> nek5000 >> >> > >> >> > >> >> > In the .rea file, p43 has been set to 0. >> >> > >> >> > Mani >> >> > >> >> _______________________________________________ >> >> Nek5000-users mailing list >> >> Nek5000-users at lists.mcs.anl.gov >> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >> >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 12:51:31 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 12:51:31 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Hi Stefan, I`ve got another question related to the periodic/cyclic BC`s. In addition to the partial 2D annulus, I`m also trying to simulate a 3D annular cylinder (a full 2? radians for now). I made the domain as I made the full 2D annulus, by remapping a periodic-side domain in userdat2. This worked in 2D, but in 3D the periodic BC are not working properly. Is this a case where cyclic BC would be needed, rather than periodic? I`m hoping not since I have stress-free walls, so I`m using Pn/Pn-2 again. (I tried the same setup with no-slip BC in the Pn/Pn formulation, and that did work properly.) Thanks, David P.S. When I tried the 2D partial-annulus case with Pn/Pn, I accidentally had IFCYCLIC=F one time, and it still worked, so I can`t tell whether the cyclic feature is having an effect at all. On Mon, Oct 10, 2011 at 2:20 PM, wrote: > Hmm, just had a look at the code. It looks to me that it might work > out of the box. Why don't you give it a shot. > > Cheers > Sefan > > On 10/10/11, S K wrote: > > Let me check what we need to change to make it happen. > > -Stefan > > > > On 10/10/11, nek5000-users at lists.mcs.anl.gov > > wrote: > >> Well then it seems I am stuck. Might a work-around be possible for a > 1/4 > >> or > >> 1/2 annulus? For instance with 1/2, I would just need to match (u,v) to > >> (-u,-v). 
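For anyone reading along, here is a minimal sketch of the usrdat2 remapping David describes above (his actual snippet is posted later in this thread); the radii r0, r1 and the helper constants below are illustrative values, not taken from his setup:

      subroutine usrdat2
      include 'SIZE'
      include 'TOTAL'

      integer i,n
      real th,r,r0,r1,zero,one,pi2

      r0   = 0.5            ! inner radius (example value)
      r1   = 1.0            ! outer radius (example value)
      zero = 0.0
      one  = 1.0
      pi2  = 8.*atan(1.)    ! 2*pi

c     stretch the unit box: x -> theta in [0,2*pi], y -> s in [0,1]
      call rescale_x(xm1,zero,pi2)
      call rescale_x(ym1,zero,one)

c     bend (theta,s) into an annulus of inner radius r0, outer radius r1
      n = nx1*ny1*nz1*nelv
      do i=1,n
         th = xm1(i,1,1,1)
         r  = r0 + (r1-r0)*(1.-ym1(i,1,1,1))
         xm1(i,1,1,1) = r*cos(th)
         ym1(i,1,1,1) = r*sin(th)
      enddo

      return
      end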
> >> > >> David > >> > >> On Mon, Oct 10, 2011 at 1:41 PM, > wrote: > >> > >>> btw: I am afraid, the stress formulation doesn't support cyclic BCs at > >>> the moment! > >>> -Stefan > >>> > >>> On 10/10/11, S K wrote: > >>> > You are right, it's an undocumented AND experimental feature. Just > >>> > flag the boundaries as periodic. Yes, just add another IFCYCLIC line. > >>> > Currently, there is no stress formulation support for Pn/Pn. > >>> > > >>> > -Stefan > >>> > > >>> > On 10/10/11, nek5000-users at lists.mcs.anl.gov > >>> > wrote: > >>> >> Hi Stephan, > >>> >> > >>> >> I see, thanks. I can`t find any documentation about cyclic BC`s. > >>> >> What > >>> >> is > >>> >> their symbol for the box file? Also, I don`t see a logical switch > >>> >> for > >>> >> IFCYCLIC. Do I just add another line? > >>> >> > >>> >> Also, I have stress-free BC on curved boundaries. A series of error > >>> >> messages led me to believe that I had to use the stress formulation, > >>> >> and > >>> >> that this required me to use Pn/Pn-2. Is this not the case? > >>> >> > >>> >> Best, > >>> >> > >>> >> David > >>> >> > >>> > > >>> _______________________________________________ > >>> Nek5000-users mailing list > >>> Nek5000-users at lists.mcs.anl.gov > >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >>> > >> > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 12:54:06 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 13:54:06 -0400 Subject: [Nek5000-users] Error when running the vortex problem Message-ID: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> Hi, I was trying to run nek5000 with vortex, following the default user setting. There is no problem for building, however when I start to run the application, there is an error "NONSTD HDR, parse_hdr, abort". The following is the related error report. Reading checkpoint data 0 0 OPEN: r1492_n08.fld27 byte swap: T 0.000000 0.000000 140 8 8 8 5.0101000E+0250101 U P 4 NELT,NX,N ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ NONSTD HDR, parse_hdr, abort. I briefly read the code. It looks like the code is reading the checkpoint file "r1492_n08.fld27" and there is something wrong with the file header. The file header does not follow the standard format and the code cannot understand the file. How can I solve this problem? Thank you. -Dong -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 12:58:15 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 19:58:15 +0200 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: David, What do you mean by "remapping a periodic-side domain in userdat2"? Sure, even with IFCYCLIC=F you'll get a result. But is probably not the one you had in mind, simply because the BCs are different i.e. periodic vs cyclic. -Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > I`ve got another question related to the periodic/cyclic BC`s. In addition > to the partial 2D annulus, I`m also trying to simulate a 3D annular cylinder > (a full 2? radians for now). 
I made the domain as I made the full 2D > annulus, by remapping a periodic-side domain in userdat2. This worked in > 2D, but in 3D the periodic BC are not working properly. Is this a case > where cyclic BC would be needed, rather than periodic? I`m hoping not since > I have stress-free walls, so I`m using Pn/Pn-2 again. (I tried the same > setup with no-slip BC in the Pn/Pn formulation, and that did work properly.) > > Thanks, > > David > > P.S. When I tried the 2D partial-annulus case with Pn/Pn, I accidentally had > IFCYCLIC=F one time, and it still worked, so I can`t tell whether the cyclic > feature is having an effect at all. > > > On Mon, Oct 10, 2011 at 2:20 PM, wrote: > >> Hmm, just had a look at the code. It looks to me that it might work >> out of the box. Why don't you give it a shot. >> >> Cheers >> Sefan >> >> On 10/10/11, S K wrote: >> > Let me check what we need to change to make it happen. >> > -Stefan >> > >> > On 10/10/11, nek5000-users at lists.mcs.anl.gov >> > wrote: >> >> Well then it seems I am stuck. Might a work-around be possible for a >> 1/4 >> >> or >> >> 1/2 annulus? For instance with 1/2, I would just need to match (u,v) >> >> to >> >> (-u,-v). >> >> >> >> David >> >> >> >> On Mon, Oct 10, 2011 at 1:41 PM, >> wrote: >> >> >> >>> btw: I am afraid, the stress formulation doesn't support cyclic BCs at >> >>> the moment! >> >>> -Stefan >> >>> >> >>> On 10/10/11, S K wrote: >> >>> > You are right, it's an undocumented AND experimental feature. Just >> >>> > flag the boundaries as periodic. Yes, just add another IFCYCLIC >> >>> > line. >> >>> > Currently, there is no stress formulation support for Pn/Pn. >> >>> > >> >>> > -Stefan >> >>> > >> >>> > On 10/10/11, nek5000-users at lists.mcs.anl.gov >> >>> > wrote: >> >>> >> Hi Stephan, >> >>> >> >> >>> >> I see, thanks. I can`t find any documentation about cyclic BC`s. >> >>> >> What >> >>> >> is >> >>> >> their symbol for the box file? Also, I don`t see a logical switch >> >>> >> for >> >>> >> IFCYCLIC. Do I just add another line? >> >>> >> >> >>> >> Also, I have stress-free BC on curved boundaries. A series of >> >>> >> error >> >>> >> messages led me to believe that I had to use the stress >> >>> >> formulation, >> >>> >> and >> >>> >> that this required me to use Pn/Pn-2. Is this not the case? >> >>> >> >> >>> >> Best, >> >>> >> >> >>> >> David >> >>> >> >> >>> > >> >>> _______________________________________________ >> >>> Nek5000-users mailing list >> >>> Nek5000-users at lists.mcs.anl.gov >> >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >>> >> >> >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 13:08:43 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 20:08:43 +0200 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> Message-ID: Hi Dong, It works for me. Did you change anything? -Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi, > I was trying to run nek5000 with vortex, following the default user setting. > There is no problem for building, however when I start to run the > application, there is an error "NONSTD HDR, parse_hdr, abort". 
The following > is the related error report. > > Reading checkpoint data > 0 0 OPEN: r1492_n08.fld27 > byte swap: T 0.000000 0.000000 > 140 8 8 8 5.0101000E+0250101 U P 4 > NELT,NX,N > ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ > > NONSTD HDR, parse_hdr, abort. > > I briefly read the code. It looks like the code is reading the checkpoint > file "r1492_n08.fld27" and there is something wrong with the file header. > The file header does not follow the standard format and the code cannot > understand the file. > > How can I solve this problem? > > Thank you. > > -Dong > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 13:14:48 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 20:14:48 +0200 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> Message-ID: Please post a full log file. Note the file size is limited to 100KB. -Stefan On 10/11/11, S K wrote: > Hi Dong, > > It works for me. Did you change anything? > > -Stefan > > On 10/11/11, nek5000-users at lists.mcs.anl.gov > wrote: >> Hi, >> I was trying to run nek5000 with vortex, following the default user >> setting. >> There is no problem for building, however when I start to run the >> application, there is an error "NONSTD HDR, parse_hdr, abort". The >> following >> is the related error report. >> >> Reading checkpoint data >> 0 0 OPEN: r1492_n08.fld27 >> byte swap: T 0.000000 0.000000 >> 140 8 8 8 5.0101000E+0250101 U P 4 >> NELT,NX,N >> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >> >> NONSTD HDR, parse_hdr, abort. >> >> I briefly read the code. It looks like the code is reading the checkpoint >> file "r1492_n08.fld27" and there is something wrong with the file header. >> The file header does not follow the standard format and the code cannot >> understand the file. >> >> How can I solve this problem? >> >> Thank you. >> >> -Dong >> > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 13:28:00 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:28:00 -0400 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> Message-ID: <977E8EBAA2E5E946A9311B0F429730FA6F59C838FB@EXCHMBB.ornl.gov> Hi, Stefan. Thanks for the reply. I double-checked the code and made sure I didn't change anything. I found the error happens in the file "nek/ic.f" (line 2223). Please see the following. if (indx2(hdr,132,'#std',4).eq.1) then call parse_std_hdr(hdr) else if (nid.eq.0) write(6,80) hdr if (nid.eq.0) write(6,80) 'NONSTD HDR, parse_hdr, abort.' 80 format(a132) call exitt endif I looks like the code cannot find the keyword "#std" from the header. I read the file header shown in the following. 140 8 8 8 5.0101000E+0250101 U P 4 NELT,NX,N The header does not contain the keyword... I guess that is the error reason. -Dong -----Original Message----- From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov Sent: Tuesday, October 11, 2011 2:09 PM To: nek5000-users at lists.mcs.anl.gov Subject: Re: [Nek5000-users] Error when running the vortex problem Hi Dong, It works for me. Did you change anything? 
-Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi, > I was trying to run nek5000 with vortex, following the default user setting. > There is no problem for building, however when I start to run the > application, there is an error "NONSTD HDR, parse_hdr, abort". The following > is the related error report. > > Reading checkpoint data > 0 0 OPEN: r1492_n08.fld27 > byte swap: T 0.000000 0.000000 > 140 8 8 8 5.0101000E+0250101 U P 4 > NELT,NX,N > ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ > > NONSTD HDR, parse_hdr, abort. > > I briefly read the code. It looks like the code is reading the checkpoint > file "r1492_n08.fld27" and there is something wrong with the file header. > The file header does not follow the standard format and the code cannot > understand the file. > > How can I solve this problem? > > Thank you. > > -Dong > _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 11 13:32:52 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:32:52 -0400 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> Message-ID: <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> Hi, Stefan. Sure. Attached is the log file. -Dong -----Original Message----- From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov Sent: Tuesday, October 11, 2011 2:15 PM To: nek5000-users at lists.mcs.anl.gov Subject: Re: [Nek5000-users] Error when running the vortex problem Please post a full log file. Note the file size is limited to 100KB. -Stefan On 10/11/11, S K wrote: > Hi Dong, > > It works for me. Did you change anything? > > -Stefan > > On 10/11/11, nek5000-users at lists.mcs.anl.gov > wrote: >> Hi, >> I was trying to run nek5000 with vortex, following the default user >> setting. >> There is no problem for building, however when I start to run the >> application, there is an error "NONSTD HDR, parse_hdr, abort". The >> following >> is the related error report. >> >> Reading checkpoint data >> 0 0 OPEN: r1492_n08.fld27 >> byte swap: T 0.000000 0.000000 >> 140 8 8 8 5.0101000E+0250101 U P 4 >> NELT,NX,N >> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >> >> NONSTD HDR, parse_hdr, abort. >> >> I briefly read the code. It looks like the code is reading the checkpoint >> file "r1492_n08.fld27" and there is something wrong with the file header. >> The file header does not follow the standard format and the code cannot >> understand the file. >> >> How can I solve this problem? >> >> Thank you. >> >> -Dong >> > _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users -------------- next part -------------- A non-text attachment was scrubbed... 
Name: logfile Type: application/octet-stream Size: 8574 bytes Desc: logfile URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 13:42:16 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 13:42:16 -0500 (CDT) Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> Message-ID: Hi Dong, What kind of platform are you running on? Can you do an svn update in the example directory, just in case the file was contaminated on transfer? Paul On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi, Stefan. > Sure. Attached is the log file. > > -Dong > > -----Original Message----- > From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 11, 2011 2:15 PM > To: nek5000-users at lists.mcs.anl.gov > Subject: Re: [Nek5000-users] Error when running the vortex problem > > Please post a full log file. Note the file size is limited to 100KB. > -Stefan > > On 10/11/11, S K wrote: >> Hi Dong, >> >> It works for me. Did you change anything? >> >> -Stefan >> >> On 10/11/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> Hi, >>> I was trying to run nek5000 with vortex, following the default user >>> setting. >>> There is no problem for building, however when I start to run the >>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>> following >>> is the related error report. >>> >>> Reading checkpoint data >>> 0 0 OPEN: r1492_n08.fld27 >>> byte swap: T 0.000000 0.000000 >>> 140 8 8 8 5.0101000E+0250101 U P 4 >>> NELT,NX,N >>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>> >>> NONSTD HDR, parse_hdr, abort. >>> >>> I briefly read the code. It looks like the code is reading the checkpoint >>> file "r1492_n08.fld27" and there is something wrong with the file header. >>> The file header does not follow the standard format and the code cannot >>> understand the file. >>> >>> How can I solve this problem? >>> >>> Thank you. >>> >>> -Dong >>> >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:03:01 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:03:01 -0500 (CDT) Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> Message-ID: Hi Dong, For some reason, your run is traversing the .f00000 file format branch, rather than the .fld input branch. This traversal is generally controlled by setting param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. I'm wondering if you are compiling with r1854.usr provided in the examples/vortex directory? Paul On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > > Hi Dong, > > What kind of platform are you running on? > > Can you do an svn update in the example directory, just in case > the file was contaminated on transfer? 
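For reference, the restart-format selection Paul describes above is a single line in the .usr file; a minimal sketch (the vortex example's own .usr file reportedly already contains the equivalent setting):

      subroutine usrdat2
      include 'SIZE'
      include 'TOTAL'

c     param(66)=4 selects the .fld input branch for restart files
c     (rather than the .f00000 branch); per Paul, usrdat also works
      param(66) = 4.

      return
      end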
> > Paul > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi, Stefan. >> Sure. Attached is the log file. >> >> -Dong >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.mcs.anl.gov >> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >> nek5000-users at lists.mcs.anl.gov >> Sent: Tuesday, October 11, 2011 2:15 PM >> To: nek5000-users at lists.mcs.anl.gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> Please post a full log file. Note the file size is limited to 100KB. >> -Stefan >> >> On 10/11/11, S K wrote: >>> Hi Dong, >>> >>> It works for me. Did you change anything? >>> >>> -Stefan >>> >>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>>> Hi, >>>> I was trying to run nek5000 with vortex, following the default user >>>> setting. >>>> There is no problem for building, however when I start to run the >>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>> following >>>> is the related error report. >>>> >>>> Reading checkpoint data >>>> 0 0 OPEN: r1492_n08.fld27 >>>> byte swap: T 0.000000 0.000000 >>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>> NELT,NX,N >>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>> >>>> NONSTD HDR, parse_hdr, abort. >>>> >>>> I briefly read the code. It looks like the code is reading the checkpoint >>>> file "r1492_n08.fld27" and there is something wrong with the file header. >>>> The file header does not follow the standard format and the code cannot >>>> understand the file. >>>> >>>> How can I solve this problem? >>>> >>>> Thank you. >>>> >>>> -Dong >>>> >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi, Stefan. > Sure. Attached is the log file. > > -Dong > > -----Original Message----- > From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 11, 2011 2:15 PM > To: nek5000-users at lists.mcs.anl.gov > Subject: Re: [Nek5000-users] Error when running the vortex problem > > Please post a full log file. Note the file size is limited to 100KB. > -Stefan > > On 10/11/11, S K wrote: >> Hi Dong, >> >> It works for me. Did you change anything? >> >> -Stefan >> >> On 10/11/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> Hi, >>> I was trying to run nek5000 with vortex, following the default user >>> setting. >>> There is no problem for building, however when I start to run the >>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>> following >>> is the related error report. >>> >>> Reading checkpoint data >>> 0 0 OPEN: r1492_n08.fld27 >>> byte swap: T 0.000000 0.000000 >>> 140 8 8 8 5.0101000E+0250101 U P 4 >>> NELT,NX,N >>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>> >>> NONSTD HDR, parse_hdr, abort. >>> >>> I briefly read the code. It looks like the code is reading the checkpoint >>> file "r1492_n08.fld27" and there is something wrong with the file header. >>> The file header does not follow the standard format and the code cannot >>> understand the file. >>> >>> How can I solve this problem? >>> >>> Thank you. 
>>> >>> -Dong >>> >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:28:01 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 15:28:01 -0400 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> Message-ID: <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> Hi, Paul. Thanks for the reply. Yeah. I was compiling the code in the examples/vortex directory. The difference is that it is the file "r1854a.usr" included, not "r1854.usr". But, even in the file "r1854a.usr", the "param(66)=4" is already well set. I am still trying to figure out why my run follows a different file format. Your further suggestion is very welcome. BTW, my platform is Linux/gnu. Thanks. -Dong -----Original Message----- From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov Sent: Tuesday, October 11, 2011 3:03 PM To: nek5000-users at lists.mcs.anl.gov Subject: Re: [Nek5000-users] Error when running the vortex problem Hi Dong, For some reason, your run is traversing the .f00000 file format branch, rather than the .fld input branch. This traversal is generally controlled by setting param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. I'm wondering if you are compiling with r1854.usr provided in the examples/vortex directory? Paul On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > > Hi Dong, > > What kind of platform are you running on? > > Can you do an svn update in the example directory, just in case > the file was contaminated on transfer? > > Paul > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi, Stefan. >> Sure. Attached is the log file. >> >> -Dong >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.mcs.anl.gov >> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >> nek5000-users at lists.mcs.anl.gov >> Sent: Tuesday, October 11, 2011 2:15 PM >> To: nek5000-users at lists.mcs.anl.gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> Please post a full log file. Note the file size is limited to 100KB. >> -Stefan >> >> On 10/11/11, S K wrote: >>> Hi Dong, >>> >>> It works for me. Did you change anything? >>> >>> -Stefan >>> >>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>>> Hi, >>>> I was trying to run nek5000 with vortex, following the default user >>>> setting. >>>> There is no problem for building, however when I start to run the >>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>> following >>>> is the related error report. >>>> >>>> Reading checkpoint data >>>> 0 0 OPEN: r1492_n08.fld27 >>>> byte swap: T 0.000000 0.000000 >>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>> NELT,NX,N >>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>> >>>> NONSTD HDR, parse_hdr, abort. >>>> >>>> I briefly read the code. It looks like the code is reading the checkpoint >>>> file "r1492_n08.fld27" and there is something wrong with the file header. 
>>>> The file header does not follow the standard format and the code cannot >>>> understand the file. >>>> >>>> How can I solve this problem? >>>> >>>> Thank you. >>>> >>>> -Dong >>>> >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi, Stefan. > Sure. Attached is the log file. > > -Dong > > -----Original Message----- > From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 11, 2011 2:15 PM > To: nek5000-users at lists.mcs.anl.gov > Subject: Re: [Nek5000-users] Error when running the vortex problem > > Please post a full log file. Note the file size is limited to 100KB. > -Stefan > > On 10/11/11, S K wrote: >> Hi Dong, >> >> It works for me. Did you change anything? >> >> -Stefan >> >> On 10/11/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> Hi, >>> I was trying to run nek5000 with vortex, following the default user >>> setting. >>> There is no problem for building, however when I start to run the >>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>> following >>> is the related error report. >>> >>> Reading checkpoint data >>> 0 0 OPEN: r1492_n08.fld27 >>> byte swap: T 0.000000 0.000000 >>> 140 8 8 8 5.0101000E+0250101 U P 4 >>> NELT,NX,N >>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>> >>> NONSTD HDR, parse_hdr, abort. >>> >>> I briefly read the code. It looks like the code is reading the checkpoint >>> file "r1492_n08.fld27" and there is something wrong with the file header. >>> The file header does not follow the standard format and the code cannot >>> understand the file. >>> >>> How can I solve this problem? >>> >>> Thank you. >>> >>> -Dong >>> >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:34:46 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:34:46 -0500 (CDT) Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> Message-ID: OK... well, this is curious... I'll start thinking. Thanks for taking the time to check and sorry that it's not working straight away. Paul On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi, Paul. > Thanks for the reply. > Yeah. I was compiling the code in the examples/vortex directory. The difference is that it is the file "r1854a.usr" included, not "r1854.usr". But, even in the file "r1854a.usr", the "param(66)=4" is already well set. > I am still trying to figure out why my run follows a different file format. Your further suggestion is very welcome. > > BTW, my platform is Linux/gnu. > > Thanks. 
> -Dong > > > > -----Original Message----- > From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 11, 2011 3:03 PM > To: nek5000-users at lists.mcs.anl.gov > Subject: Re: [Nek5000-users] Error when running the vortex problem > > > > Hi Dong, > > For some reason, your run is traversing the .f00000 file format > branch, rather than the .fld input branch. > > This traversal is generally controlled by setting > param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. > > I'm wondering if you are compiling with r1854.usr provided in > the examples/vortex directory? > > Paul > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> >> Hi Dong, >> >> What kind of platform are you running on? >> >> Can you do an svn update in the example directory, just in case >> the file was contaminated on transfer? >> >> Paul >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Hi, Stefan. >>> Sure. Attached is the log file. >>> >>> -Dong >>> >>> -----Original Message----- >>> From: nek5000-users-bounces at lists.mcs.anl.gov >>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>> nek5000-users at lists.mcs.anl.gov >>> Sent: Tuesday, October 11, 2011 2:15 PM >>> To: nek5000-users at lists.mcs.anl.gov >>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>> >>> Please post a full log file. Note the file size is limited to 100KB. >>> -Stefan >>> >>> On 10/11/11, S K wrote: >>>> Hi Dong, >>>> >>>> It works for me. Did you change anything? >>>> >>>> -Stefan >>>> >>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>> wrote: >>>>> Hi, >>>>> I was trying to run nek5000 with vortex, following the default user >>>>> setting. >>>>> There is no problem for building, however when I start to run the >>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>> following >>>>> is the related error report. >>>>> >>>>> Reading checkpoint data >>>>> 0 0 OPEN: r1492_n08.fld27 >>>>> byte swap: T 0.000000 0.000000 >>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>> NELT,NX,N >>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>> >>>>> NONSTD HDR, parse_hdr, abort. >>>>> >>>>> I briefly read the code. It looks like the code is reading the checkpoint >>>>> file "r1492_n08.fld27" and there is something wrong with the file header. >>>>> The file header does not follow the standard format and the code cannot >>>>> understand the file. >>>>> >>>>> How can I solve this problem? >>>>> >>>>> Thank you. >>>>> >>>>> -Dong >>>>> >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi, Stefan. >> Sure. Attached is the log file. >> >> -Dong >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov >> Sent: Tuesday, October 11, 2011 2:15 PM >> To: nek5000-users at lists.mcs.anl.gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> Please post a full log file. Note the file size is limited to 100KB. >> -Stefan >> >> On 10/11/11, S K wrote: >>> Hi Dong, >>> >>> It works for me. Did you change anything? 
>>> >>> -Stefan >>> >>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>>> Hi, >>>> I was trying to run nek5000 with vortex, following the default user >>>> setting. >>>> There is no problem for building, however when I start to run the >>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>> following >>>> is the related error report. >>>> >>>> Reading checkpoint data >>>> 0 0 OPEN: r1492_n08.fld27 >>>> byte swap: T 0.000000 0.000000 >>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>> NELT,NX,N >>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>> >>>> NONSTD HDR, parse_hdr, abort. >>>> >>>> I briefly read the code. It looks like the code is reading the checkpoint >>>> file "r1492_n08.fld27" and there is something wrong with the file header. >>>> The file header does not follow the standard format and the code cannot >>>> understand the file. >>>> >>>> How can I solve this problem? >>>> >>>> Thank you. >>>> >>>> -Dong >>>> >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:36:40 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:36:40 -0500 (CDT) Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> Message-ID: PS -- It should look like gs_setup: 0 unique labels shared done :: setup h1 coarse grid 6.9365978240966797E-002 sec call usrdat3 done :: usrdat3 set initial conditions Checking restart options: r1492_n08.fld27 time=0. Reading checkpoint data byte swap: F 6.543210 -2.9312772E+35 Read mode: 4.000000000000000 neltr,nxr,nyr,nzr: 140 8 8 8 Restarting from file r1492_n08.fld27 Columns for restart data U,V,W,P,T,S,N: 0 2 3 4 0 0 4 Reading 1 Successfully read data from dump number 1. On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi, Paul. > Thanks for the reply. > Yeah. I was compiling the code in the examples/vortex directory. The difference is that it is the file "r1854a.usr" included, not "r1854.usr". But, even in the file "r1854a.usr", the "param(66)=4" is already well set. > I am still trying to figure out why my run follows a different file format. Your further suggestion is very welcome. > > BTW, my platform is Linux/gnu. > > Thanks. > -Dong > > > > -----Original Message----- > From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 11, 2011 3:03 PM > To: nek5000-users at lists.mcs.anl.gov > Subject: Re: [Nek5000-users] Error when running the vortex problem > > > > Hi Dong, > > For some reason, your run is traversing the .f00000 file format > branch, rather than the .fld input branch. > > This traversal is generally controlled by setting > param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. 
> > I'm wondering if you are compiling with r1854.usr provided in > the examples/vortex directory? > > Paul > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> >> Hi Dong, >> >> What kind of platform are you running on? >> >> Can you do an svn update in the example directory, just in case >> the file was contaminated on transfer? >> >> Paul >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Hi, Stefan. >>> Sure. Attached is the log file. >>> >>> -Dong >>> >>> -----Original Message----- >>> From: nek5000-users-bounces at lists.mcs.anl.gov >>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>> nek5000-users at lists.mcs.anl.gov >>> Sent: Tuesday, October 11, 2011 2:15 PM >>> To: nek5000-users at lists.mcs.anl.gov >>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>> >>> Please post a full log file. Note the file size is limited to 100KB. >>> -Stefan >>> >>> On 10/11/11, S K wrote: >>>> Hi Dong, >>>> >>>> It works for me. Did you change anything? >>>> >>>> -Stefan >>>> >>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>> wrote: >>>>> Hi, >>>>> I was trying to run nek5000 with vortex, following the default user >>>>> setting. >>>>> There is no problem for building, however when I start to run the >>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>> following >>>>> is the related error report. >>>>> >>>>> Reading checkpoint data >>>>> 0 0 OPEN: r1492_n08.fld27 >>>>> byte swap: T 0.000000 0.000000 >>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>> NELT,NX,N >>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>> >>>>> NONSTD HDR, parse_hdr, abort. >>>>> >>>>> I briefly read the code. It looks like the code is reading the checkpoint >>>>> file "r1492_n08.fld27" and there is something wrong with the file header. >>>>> The file header does not follow the standard format and the code cannot >>>>> understand the file. >>>>> >>>>> How can I solve this problem? >>>>> >>>>> Thank you. >>>>> >>>>> -Dong >>>>> >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi, Stefan. >> Sure. Attached is the log file. >> >> -Dong >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.mcs.anl.gov [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of nek5000-users at lists.mcs.anl.gov >> Sent: Tuesday, October 11, 2011 2:15 PM >> To: nek5000-users at lists.mcs.anl.gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> Please post a full log file. Note the file size is limited to 100KB. >> -Stefan >> >> On 10/11/11, S K wrote: >>> Hi Dong, >>> >>> It works for me. Did you change anything? >>> >>> -Stefan >>> >>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>>> Hi, >>>> I was trying to run nek5000 with vortex, following the default user >>>> setting. >>>> There is no problem for building, however when I start to run the >>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>> following >>>> is the related error report. 
>>>> >>>> Reading checkpoint data >>>> 0 0 OPEN: r1492_n08.fld27 >>>> byte swap: T 0.000000 0.000000 >>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>> NELT,NX,N >>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>> >>>> NONSTD HDR, parse_hdr, abort. >>>> >>>> I briefly read the code. It looks like the code is reading the checkpoint >>>> file "r1492_n08.fld27" and there is something wrong with the file header. >>>> The file header does not follow the standard format and the code cannot >>>> understand the file. >>>> >>>> How can I solve this problem? >>>> >>>> Thank you. >>>> >>>> -Dong >>>> >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:50:17 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 14:50:17 -0500 (CDT) Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> Message-ID: Hi Dong, which fortran compiler are you using ? (also, which C compiler?) Paul On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > > PS -- > > It should look like > > > > gs_setup: 0 unique labels shared > done :: setup h1 coarse grid 6.9365978240966797E-002 sec > > call usrdat3 > done :: usrdat3 > > set initial conditions > Checking restart options: r1492_n08.fld27 time=0. > Reading checkpoint data > byte swap: F 6.543210 -2.9312772E+35 > Read mode: 4.000000000000000 neltr,nxr,nyr,nzr: 140 8 8 8 > > Restarting from file r1492_n08.fld27 > Columns for restart data U,V,W,P,T,S,N: 0 2 3 4 0 0 4 > Reading 1 > Successfully read data from dump number 1. > > > > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Hi, Paul. >> Thanks for the reply. >> Yeah. I was compiling the code in the examples/vortex directory. The >> difference is that it is the file "r1854a.usr" included, not "r1854.usr". >> But, even in the file "r1854a.usr", the "param(66)=4" is already well set. >> I am still trying to figure out why my run follows a different file format. >> Your further suggestion is very welcome. >> >> BTW, my platform is Linux/gnu. >> >> Thanks. >> -Dong >> >> >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.mcs.anl.gov >> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >> nek5000-users at lists.mcs.anl.gov >> Sent: Tuesday, October 11, 2011 3:03 PM >> To: nek5000-users at lists.mcs.anl.gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> >> >> Hi Dong, >> >> For some reason, your run is traversing the .f00000 file format >> branch, rather than the .fld input branch. >> >> This traversal is generally controlled by setting >> param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. >> >> I'm wondering if you are compiling with r1854.usr provided in >> the examples/vortex directory? >> >> Paul >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> >>> Hi Dong, >>> >>> What kind of platform are you running on? 
>>> >>> Can you do an svn update in the example directory, just in case >>> the file was contaminated on transfer? >>> >>> Paul >>> >>> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> Hi, Stefan. >>>> Sure. Attached is the log file. >>>> >>>> -Dong >>>> >>>> -----Original Message----- >>>> From: nek5000-users-bounces at lists.mcs.anl.gov >>>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>>> nek5000-users at lists.mcs.anl.gov >>>> Sent: Tuesday, October 11, 2011 2:15 PM >>>> To: nek5000-users at lists.mcs.anl.gov >>>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>>> >>>> Please post a full log file. Note the file size is limited to 100KB. >>>> -Stefan >>>> >>>> On 10/11/11, S K wrote: >>>>> Hi Dong, >>>>> >>>>> It works for me. Did you change anything? >>>>> >>>>> -Stefan >>>>> >>>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>>> wrote: >>>>>> Hi, >>>>>> I was trying to run nek5000 with vortex, following the default user >>>>>> setting. >>>>>> There is no problem for building, however when I start to run the >>>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>>> following >>>>>> is the related error report. >>>>>> >>>>>> Reading checkpoint data >>>>>> 0 0 OPEN: r1492_n08.fld27 >>>>>> byte swap: T 0.000000 0.000000 >>>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>>> NELT,NX,N >>>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>>> >>>>>> NONSTD HDR, parse_hdr, abort. >>>>>> >>>>>> I briefly read the code. It looks like the code is reading the >>>>>> checkpoint >>>>>> file "r1492_n08.fld27" and there is something wrong with the file >>>>>> header. >>>>>> The file header does not follow the standard format and the code cannot >>>>>> understand the file. >>>>>> >>>>>> How can I solve this problem? >>>>>> >>>>>> Thank you. >>>>>> >>>>>> -Dong >>>>>> >>>>> >>>> _______________________________________________ >>>> Nek5000-users mailing list >>>> Nek5000-users at lists.mcs.anl.gov >>>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Hi, Stefan. >>> Sure. Attached is the log file. >>> >>> -Dong >>> >>> -----Original Message----- >>> From: nek5000-users-bounces at lists.mcs.anl.gov >>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>> nek5000-users at lists.mcs.anl.gov >>> Sent: Tuesday, October 11, 2011 2:15 PM >>> To: nek5000-users at lists.mcs.anl.gov >>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>> >>> Please post a full log file. Note the file size is limited to 100KB. >>> -Stefan >>> >>> On 10/11/11, S K wrote: >>>> Hi Dong, >>>> >>>> It works for me. Did you change anything? >>>> >>>> -Stefan >>>> >>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>> wrote: >>>>> Hi, >>>>> I was trying to run nek5000 with vortex, following the default user >>>>> setting. >>>>> There is no problem for building, however when I start to run the >>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>> following >>>>> is the related error report. >>>>> >>>>> Reading checkpoint data >>>>> 0 0 OPEN: r1492_n08.fld27 >>>>> byte swap: T 0.000000 0.000000 >>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>> NELT,NX,N >>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>> >>>>> NONSTD HDR, parse_hdr, abort. >>>>> >>>>> I briefly read the code. 
It looks like the code is reading the >>>>> checkpoint >>>>> file "r1492_n08.fld27" and there is something wrong with the file >>>>> header. >>>>> The file header does not follow the standard format and the code cannot >>>>> understand the file. >>>>> >>>>> How can I solve this problem? >>>>> >>>>> Thank you. >>>>> >>>>> -Dong >>>>> >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 14:58:29 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 21:58:29 +0200 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C83942@EXCHMBB.ornl.gov> Message-ID: Can you post your r1854a.usr . -Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > > Hi Dong, > > which fortran compiler are you using ? (also, which C compiler?) > > Paul > > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> >> PS -- >> >> It should look like >> >> >> >> gs_setup: 0 unique labels shared >> done :: setup h1 coarse grid 6.9365978240966797E-002 sec >> >> call usrdat3 >> done :: usrdat3 >> >> set initial conditions >> Checking restart options: r1492_n08.fld27 time=0. >> Reading checkpoint data >> byte swap: F 6.543210 -2.9312772E+35 >> Read mode: 4.000000000000000 neltr,nxr,nyr,nzr: 140 8 8 8 >> >> Restarting from file r1492_n08.fld27 >> Columns for restart data U,V,W,P,T,S,N: 0 2 3 4 0 0 4 >> Reading 1 >> Successfully read data from dump number 1. >> >> >> >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Hi, Paul. >>> Thanks for the reply. >>> Yeah. I was compiling the code in the examples/vortex directory. The >>> difference is that it is the file "r1854a.usr" included, not "r1854.usr". >>> >>> But, even in the file "r1854a.usr", the "param(66)=4" is already well >>> set. >>> I am still trying to figure out why my run follows a different file >>> format. >>> Your further suggestion is very welcome. >>> >>> BTW, my platform is Linux/gnu. >>> >>> Thanks. >>> -Dong >>> >>> >>> >>> -----Original Message----- >>> From: nek5000-users-bounces at lists.mcs.anl.gov >>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>> nek5000-users at lists.mcs.anl.gov >>> Sent: Tuesday, October 11, 2011 3:03 PM >>> To: nek5000-users at lists.mcs.anl.gov >>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>> >>> >>> >>> Hi Dong, >>> >>> For some reason, your run is traversing the .f00000 file format >>> branch, rather than the .fld input branch. >>> >>> This traversal is generally controlled by setting >>> param(66)=4 (for .fld) in usrdat2 (or usrdat) in the .usr file. >>> >>> I'm wondering if you are compiling with r1854.usr provided in >>> the examples/vortex directory? >>> >>> Paul >>> >>> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> >>>> Hi Dong, >>>> >>>> What kind of platform are you running on? >>>> >>>> Can you do an svn update in the example directory, just in case >>>> the file was contaminated on transfer? 
>>>> >>>> Paul >>>> >>>> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>>> >>>>> Hi, Stefan. >>>>> Sure. Attached is the log file. >>>>> >>>>> -Dong >>>>> >>>>> -----Original Message----- >>>>> From: nek5000-users-bounces at lists.mcs.anl.gov >>>>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>>>> nek5000-users at lists.mcs.anl.gov >>>>> Sent: Tuesday, October 11, 2011 2:15 PM >>>>> To: nek5000-users at lists.mcs.anl.gov >>>>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>>>> >>>>> Please post a full log file. Note the file size is limited to 100KB. >>>>> -Stefan >>>>> >>>>> On 10/11/11, S K wrote: >>>>>> Hi Dong, >>>>>> >>>>>> It works for me. Did you change anything? >>>>>> >>>>>> -Stefan >>>>>> >>>>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>>>> wrote: >>>>>>> Hi, >>>>>>> I was trying to run nek5000 with vortex, following the default user >>>>>>> setting. >>>>>>> There is no problem for building, however when I start to run the >>>>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>>>> following >>>>>>> is the related error report. >>>>>>> >>>>>>> Reading checkpoint data >>>>>>> 0 0 OPEN: r1492_n08.fld27 >>>>>>> byte swap: T 0.000000 0.000000 >>>>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>>>> NELT,NX,N >>>>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>>>> >>>>>>> NONSTD HDR, parse_hdr, abort. >>>>>>> >>>>>>> I briefly read the code. It looks like the code is reading the >>>>>>> checkpoint >>>>>>> file "r1492_n08.fld27" and there is something wrong with the file >>>>>>> header. >>>>>>> The file header does not follow the standard format and the code >>>>>>> cannot >>>>>>> understand the file. >>>>>>> >>>>>>> How can I solve this problem? >>>>>>> >>>>>>> Thank you. >>>>>>> >>>>>>> -Dong >>>>>>> >>>>>> >>>>> _______________________________________________ >>>>> Nek5000-users mailing list >>>>> Nek5000-users at lists.mcs.anl.gov >>>>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>>> >>> >>> >>> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> Hi, Stefan. >>>> Sure. Attached is the log file. >>>> >>>> -Dong >>>> >>>> -----Original Message----- >>>> From: nek5000-users-bounces at lists.mcs.anl.gov >>>> [mailto:nek5000-users-bounces at lists.mcs.anl.gov] On Behalf Of >>>> nek5000-users at lists.mcs.anl.gov >>>> Sent: Tuesday, October 11, 2011 2:15 PM >>>> To: nek5000-users at lists.mcs.anl.gov >>>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>>> >>>> Please post a full log file. Note the file size is limited to 100KB. >>>> -Stefan >>>> >>>> On 10/11/11, S K wrote: >>>>> Hi Dong, >>>>> >>>>> It works for me. Did you change anything? >>>>> >>>>> -Stefan >>>>> >>>>> On 10/11/11, nek5000-users at lists.mcs.anl.gov >>>>> wrote: >>>>>> Hi, >>>>>> I was trying to run nek5000 with vortex, following the default user >>>>>> setting. >>>>>> There is no problem for building, however when I start to run the >>>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>>> following >>>>>> is the related error report. >>>>>> >>>>>> Reading checkpoint data >>>>>> 0 0 OPEN: r1492_n08.fld27 >>>>>> byte swap: T 0.000000 0.000000 >>>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>>> NELT,NX,N >>>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ >>>>>> >>>>>> NONSTD HDR, parse_hdr, abort. >>>>>> >>>>>> I briefly read the code. 
It looks like the code is reading the >>>>>> checkpoint >>>>>> file "r1492_n08.fld27" and there is something wrong with the file >>>>>> header. >>>>>> The file header does not follow the standard format and the code >>>>>> cannot >>>>>> understand the file. >>>>>> >>>>>> How can I solve this problem? >>>>>> >>>>>> Thank you. >>>>>> >>>>>> -Dong >>>>>> >>>>> >>>> _______________________________________________ >>>> Nek5000-users mailing list >>>> Nek5000-users at lists.mcs.anl.gov >>>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> From nek5000-users at lists.mcs.anl.gov Tue Oct 11 15:03:16 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 15:03:16 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: References: Message-ID: Hi Stefan, What do you mean by "remapping a periodic-side domain in userdat2"? > I build a mesh using genbox with dimensions [0,1]^3. The BC are periodic in x. Then in userdat 2 I remap the (x,y) cross-section from a square to an annulus: call rescale_x(xm1,zero,pi2) call rescale_x(ym1,zero,one) n = nx1*ny1*nz1*nelv ! Turn box into annulus do i=1,n th = xm1(i,1,1,1) r = r0 + (r1-r0)*(1.-ym1(i,1,1,1)) xm1(i,1,1,1) = r*cos(th) ym1(i,1,1,1) = r*sin(th) enddo This gives me an annulus centered on the z-axis with interior radius r0 and exterior radius r1. (I tried centering it on the y-axis instead with the periodic boundary lying in the xy-plane, but the result was the same.) My velocity BC in what are initially the y and z directions are all SYM, so I need to use the stress formulation. This procedure worked in 2D for no-slip or stress-free BC, but in 3D it is not working for stress-free BC. I can visualize the 3D geometry, and the annular cylinder appears to be properly constructed, but the fluid solution blows up. I understand now that my fractional-annulus cases blow up because non-cyclic periodic BC at the theta-ends are incompatible with stress-free walls, but this should not be an issue for the full 2? annulus, even in 3D. Can you think of a reason that non-cyclic periodic BC would work in 2D but not in 3D? Sure, even with IFCYCLIC=F you'll get a result. But is probably not > the one you had in mind, simply because the BCs are different i.e. > periodic vs cyclic. > Ah yes, you`re right that the results are actually different. Thanks a lot, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 15:09:41 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Wed, 12 Oct 2011 01:39:41 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, I've uploaded the log files on the following links: 64 processors with SEMG disabled: https://gist.github.com/1279232 128 processors with SEMG disabled: https://gist.github.com/1279256 256 processors with SEMG disabled: https://gist.github.com/1279237 216 processors with SEMG enabled: https://gist.github.com/1279239 It works with 64 processors and then gets stuck from 128 onwards. Mani On Tue, Oct 11, 2011 at 7:06 PM, wrote: > Doesn't sound like a memory problem given 4GB of memory per core and a > total static data size of ~350MB (according to the output of size). 
> The size of the executable doesn't matter in this case. > > - Can you post your logfile again (for the case where the SEMG was > disabled). > - What's the lowest number of processors you can reproduce the problem > (try with lx1=4) > > -Stefan > > On 10/11/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Hi Stefan, > > > > Each node has 64 GB of RAM. There are 16 cores in each node. Each core > has > > the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I > > tried running with p43=1 and it still gets stuck. The full specifications > of > > each core are given below: > > > > vendor_id : GenuineIntel > > cpu family : 6 > > model : 15 > > model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz > > stepping : 11 > > cpu MHz : 2933.445 > > cache size : 4096 KB > > physical id : 6 > > siblings : 4 > > core id : 3 > > cpu cores : 4 > > fpu : yes > > fpu_exception : yes > > cpuid level : 10 > > wp : yes > > flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge > mca > > cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm > > constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm > > bogomips : 5866.92 > > clflush size : 64 > > cache_alignment : 64 > > address sizes : 40 bits physical, 48 bits virtual > > > > Thanks, > > Mani > > > > > > On Mon, Oct 10, 2011 at 11:56 PM, > wrote: > > > >> What's the memory size per core? > >> > >> Sure p43=0 is correct if you want to use the multilevel Schwarz > >> solver. Just as a cross check: set p43=1 and try again. > >> > >> On 10/10/11, nek5000-users at lists.mcs.anl.gov > >> wrote: > >> > Hi Stefan, > >> > > >> > The following is the output of 'size nek5000' > >> > > >> > text data bss dec hex filename > >> > 5163006 59896 333337824 338560726 142e06d6 > nek5000 > >> > > >> > > >> > In the .rea file, p43 has been set to 0. > >> > > >> > Mani > >> > > >> _______________________________________________ > >> Nek5000-users mailing list > >> Nek5000-users at lists.mcs.anl.gov > >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >> > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 15:09:56 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 15:09:56 -0500 (CDT) Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: Message-ID: <873927761.87275.1318363796498.JavaMail.root@zimbra.anl.gov> Hi David, I was using a similar transformation for the annulus except for the order of r & theta, i.e. I mapped [r,theta,z] -> [x,y,z] preserving right-handed coordinate system... Aleks ----- Original Message ----- From: nek5000-users at lists.mcs.anl.gov To: nek5000-users at lists.mcs.anl.gov Sent: Tuesday, October 11, 2011 3:03:16 PM Subject: Re: [Nek5000-users] Periodic BC on non-parallel boundaries Hi Stefan, What do you mean by "remapping a periodic-side domain in userdat2"? I build a mesh using genbox with dimensions [0,1]^3. The BC are periodic in x. Then in userdat 2 I remap the (x,y) cross-section from a square to an annulus: call rescale_x(xm1,zero,pi2) call rescale_x(ym1,zero,one) n = nx1*ny1*nz1*nelv ! 
Turn box into annulus do i=1,n th = xm1(i,1,1,1) r = r0 + (r1-r0)*(1.-ym1(i,1,1,1)) xm1(i,1,1,1) = r*cos(th) ym1(i,1,1,1) = r*sin(th) enddo This gives me an annulus centered on the z-axis with interior radius r0 and exterior radius r1. (I tried centering it on the y-axis instead with the periodic boundary lying in the xy-plane, but the result was the same.) My velocity BC in what are initially the y and z directions are all SYM, so I need to use the stress formulation. This procedure worked in 2D for no-slip or stress-free BC, but in 3D it is not working for stress-free BC. I can visualize the 3D geometry, and the annular cylinder appears to be properly constructed, but the fluid solution blows up. I understand now that my fractional-annulus cases blow up because non-cyclic periodic BC at the theta-ends are incompatible with stress-free walls, but this should not be an issue for the full 2? annulus, even in 3D. Can you think of a reason that non-cyclic periodic BC would work in 2D but not in 3D? Sure, even with IFCYCLIC=F you'll get a result. But is probably not the one you had in mind, simply because the BCs are different i.e. periodic vs cyclic. Ah yes, you`re right that the results are actually different. Thanks a lot, David _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 11 15:23:18 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 22:23:18 +0200 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Thanks Mani. Let me check them and I'll come back to you. We may need to do some heavy debugging sessions. Any chance to get access to that machine? Please contact me off-list to arrange the details. -Stefan On 10/11/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi Stefan, > > I've uploaded the log files on the following links: > > 64 processors with SEMG disabled: https://gist.github.com/1279232 > 128 processors with SEMG disabled: https://gist.github.com/1279256 > 256 processors with SEMG disabled: https://gist.github.com/1279237 > 216 processors with SEMG enabled: https://gist.github.com/1279239 > > It works with 64 processors and then gets stuck from 128 onwards. > > Mani > > On Tue, Oct 11, 2011 at 7:06 PM, wrote: > >> Doesn't sound like a memory problem given 4GB of memory per core and a >> total static data size of ~350MB (according to the output of size). >> The size of the executable doesn't matter in this case. >> >> - Can you post your logfile again (for the case where the SEMG was >> disabled). >> - What's the lowest number of processors you can reproduce the problem >> (try with lx1=4) >> >> -Stefan >> >> On 10/11/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Hi Stefan, >> > >> > Each node has 64 GB of RAM. There are 16 cores in each node. Each core >> has >> > the 4096 KB of cache. The size of the executable 'nek5000' is 5.6 MB. I >> > tried running with p43=1 and it still gets stuck. 
The full >> > specifications >> of >> > each core are given below: >> > >> > vendor_id : GenuineIntel >> > cpu family : 6 >> > model : 15 >> > model name : Intel(R) Xeon(R) CPU X7350 @ 2.93GHz >> > stepping : 11 >> > cpu MHz : 2933.445 >> > cache size : 4096 KB >> > physical id : 6 >> > siblings : 4 >> > core id : 3 >> > cpu cores : 4 >> > fpu : yes >> > fpu_exception : yes >> > cpuid level : 10 >> > wp : yes >> > flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge >> mca >> > cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall lm >> > constant_tsc pni monitor ds_cpl vmx est tm2 cx16 xtpr lahf_lm >> > bogomips : 5866.92 >> > clflush size : 64 >> > cache_alignment : 64 >> > address sizes : 40 bits physical, 48 bits virtual >> > >> > Thanks, >> > Mani >> > >> > >> > On Mon, Oct 10, 2011 at 11:56 PM, >> wrote: >> > >> >> What's the memory size per core? >> >> >> >> Sure p43=0 is correct if you want to use the multilevel Schwarz >> >> solver. Just as a cross check: set p43=1 and try again. >> >> >> >> On 10/10/11, nek5000-users at lists.mcs.anl.gov >> >> wrote: >> >> > Hi Stefan, >> >> > >> >> > The following is the output of 'size nek5000' >> >> > >> >> > text data bss dec hex filename >> >> > 5163006 59896 333337824 338560726 142e06d6 >> nek5000 >> >> > >> >> > >> >> > In the .rea file, p43 has been set to 0. >> >> > >> >> > Mani >> >> > >> >> _______________________________________________ >> >> Nek5000-users mailing list >> >> Nek5000-users at lists.mcs.anl.gov >> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >> >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > From nek5000-users at lists.mcs.anl.gov Tue Oct 11 15:42:53 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Wed, 12 Oct 2011 02:12:53 +0530 Subject: [Nek5000-users] NEK gets stuck In-Reply-To: References: Message-ID: Hi Stefan, Thank you for taking time to look into this problem. I tried to mail you on your ANL address but it failed with the following error: Delivery to the following recipient failed permanently: stefanke at mcs.anl.gov Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. Is there an alternate email address that I can contact you on? Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 16:47:56 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 16:47:56 -0500 Subject: [Nek5000-users] Error when running the vortex problem In-Reply-To: References: <977E8EBAA2E5E946A9311B0F429730FA6F59C838DC@EXCHMBB.ornl.gov> <977E8EBAA2E5E946A9311B0F429730FA6F59C838FE@EXCHMBB.ornl.gov> Message-ID: Shouldn't it be param(67)? Justin On Tue, Oct 11, 2011 at 2:03 PM, wrote: > > > Hi Dong, > > For some reason, your run is traversing the .f00000 file format > branch, rather than the .fld input branch. > > This traversal is generally controlled by setting param(66)=4 (for .fld) in > usrdat2 (or usrdat) in the .usr file. > > I'm wondering if you are compiling with r1854.usr provided in > the examples/vortex directory? 
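For anyone else hitting the NONSTD HDR abort, the setting being discussed lives in the case's .usr file. A minimal sketch is given below; since the thread leaves open whether the read format is governed by param(66) or param(67) (Justin's question above), the sketch simply sets both to the legacy .fld value, so take the parameter numbers from the discussion rather than from the sketch. The include of TOTAL (to get the param array) is the usual .usr boilerplate assumed here.

      subroutine usrdat2        ! sketch only: goes in the case .usr file
      include 'SIZE'
      include 'TOTAL'

c     4 selects the legacy .fld format; param(66) is the number quoted
c     above for the I/O branch, param(67) is the one Justin suggests
c     for the restart read -- setting both covers either convention.
      param(66) = 4.
      param(67) = 4.

      return
      end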
> > > Paul > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.**govwrote: > > >> Hi Dong, >> >> What kind of platform are you running on? >> >> Can you do an svn update in the example directory, just in case >> the file was contaminated on transfer? >> >> Paul >> >> On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.**govwrote: >> >> Hi, Stefan. >>> Sure. Attached is the log file. >>> >>> -Dong >>> >>> -----Original Message----- >>> From: nek5000-users-bounces at lists.**mcs.anl.gov[mailto: >>> nek5000-users-bounces@**lists.mcs.anl.gov] >>> On Behalf Of nek5000-users at lists.mcs.anl.**gov >>> Sent: Tuesday, October 11, 2011 2:15 PM >>> To: nek5000-users at lists.mcs.anl.**gov >>> Subject: Re: [Nek5000-users] Error when running the vortex problem >>> >>> Please post a full log file. Note the file size is limited to 100KB. >>> -Stefan >>> >>> On 10/11/11, S K wrote: >>> >>>> Hi Dong, >>>> >>>> It works for me. Did you change anything? >>>> >>>> -Stefan >>>> >>>> On 10/11/11, nek5000-users at lists.mcs.anl.**gov >>>> > >>>> wrote: >>>> >>>>> Hi, >>>>> I was trying to run nek5000 with vortex, following the default user >>>>> setting. >>>>> There is no problem for building, however when I start to run the >>>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>>> following >>>>> is the related error report. >>>>> >>>>> Reading checkpoint data >>>>> 0 0 OPEN: r1492_n08.fld27 >>>>> byte swap: T 0.000000 0.000000 >>>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>>> NELT,NX,N >>>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^**@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^** >>>>> @^@^@^@^@@^@^@^@^@^@^@^@^@^@^@**^@^@^@^@^@ >>>>> >>>>> NONSTD HDR, parse_hdr, abort. >>>>> >>>>> I briefly read the code. It looks like the code is reading the >>>>> checkpoint >>>>> file "r1492_n08.fld27" and there is something wrong with the file >>>>> header. >>>>> The file header does not follow the standard format and the code cannot >>>>> understand the file. >>>>> >>>>> How can I solve this problem? >>>>> >>>>> Thank you. >>>>> >>>>> -Dong >>>>> >>>>> >>>> ______________________________**_________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.**gov >>> https://lists.mcs.anl.gov/**mailman/listinfo/nek5000-users >>> >> >> > > On Tue, 11 Oct 2011, nek5000-users at lists.mcs.anl.**govwrote: > > Hi, Stefan. >> Sure. Attached is the log file. >> >> -Dong >> >> -----Original Message----- >> From: nek5000-users-bounces at lists.**mcs.anl.gov[mailto: >> nek5000-users-bounces@**lists.mcs.anl.gov] >> On Behalf Of nek5000-users at lists.mcs.anl.**gov >> Sent: Tuesday, October 11, 2011 2:15 PM >> To: nek5000-users at lists.mcs.anl.**gov >> Subject: Re: [Nek5000-users] Error when running the vortex problem >> >> Please post a full log file. Note the file size is limited to 100KB. >> -Stefan >> >> On 10/11/11, S K wrote: >> >>> Hi Dong, >>> >>> It works for me. Did you change anything? >>> >>> -Stefan >>> >>> On 10/11/11, nek5000-users at lists.mcs.anl.**gov >>> > >>> wrote: >>> >>>> Hi, >>>> I was trying to run nek5000 with vortex, following the default user >>>> setting. >>>> There is no problem for building, however when I start to run the >>>> application, there is an error "NONSTD HDR, parse_hdr, abort". The >>>> following >>>> is the related error report. 
>>>> >>>> Reading checkpoint data >>>> 0 0 OPEN: r1492_n08.fld27 >>>> byte swap: T 0.000000 0.000000 >>>> 140 8 8 8 5.0101000E+0250101 U P 4 >>>> NELT,NX,N >>>> ??a?@^@^@^@^@^@^@^@^@^@^@^@^@^**@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^** >>>> @^@^@^@^@@^@^@^@^@^@^@^@^@^@^@**^@^@^@^@^@ >>>> >>>> NONSTD HDR, parse_hdr, abort. >>>> >>>> I briefly read the code. It looks like the code is reading the >>>> checkpoint >>>> file "r1492_n08.fld27" and there is something wrong with the file >>>> header. >>>> The file header does not follow the standard format and the code cannot >>>> understand the file. >>>> >>>> How can I solve this problem? >>>> >>>> Thank you. >>>> >>>> -Dong >>>> >>>> >>> ______________________________**_________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.**gov >> https://lists.mcs.anl.gov/**mailman/listinfo/nek5000-users >> > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 11 20:43:18 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 11 Oct 2011 20:43:18 -0500 Subject: [Nek5000-users] Periodic BC on non-parallel boundaries In-Reply-To: <873927761.87275.1318363796498.JavaMail.root@zimbra.anl.gov> References: <873927761.87275.1318363796498.JavaMail.root@zimbra.anl.gov> Message-ID: Hi Aleks, Thanks for the tip. Unfortunately, fixing the handedness did not make the free-slip case work for me. When you were simulating an annular cylinder, did you ever have the SYM BC's on both the sides and the ends? I think I've determined that neither the Pn/Pn-2 method nor the stress formulation is the problem with my 3D annular cylinder. I conclude this because the simulation runs properly with IFSTRS=T when the sides are SYM and ends are W, or when the sides are W and the ends are SYM. It is only when both sides and ends are SYM that the simulation doesn't work. Any thoughts? Best, David On Tue, Oct 11, 2011 at 3:09 PM, wrote: > Hi David, > > I was using a similar transformation for the annulus except for the order > of r & theta, i.e. I mapped [r,theta,z] -> [x,y,z] preserving right-handed > coordinate system... > > Aleks -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Sun Oct 16 08:18:34 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 16 Oct 2011 13:18:34 +0000 Subject: [Nek5000-users] Problem with CHT: fluxes at the interface do not match Message-ID: Dear all, I was running a simple case with conjugate heat transfer. I managed to validate it against some analytic data. I checked whether the heat fluxes at the interface between solid and fluid match or not. For any ratio of solid conductivity over fluid conductivity they match very well especially if the grid is very refined at the interface. The problem is that they match only if vtrans of solid is the same as vtrans of the fluid, else the ratio of the two fluxes is proportional to the ratio of vtrans, which is wrong. I had a look into the code even though I am not very expert. I am using cvode. In fcvfun there is a call to makeq. In makeq, after the call to wlaplacian and add2, the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, there is a call to dssum over ydot. 
I am not sure but I think that dssum mixes the properties of fluid and solid, therefore the transient equation which is solved for the temperature at the interface has a flux which is divided by vtrans of the fluid and another flux which is divided by vtrans of the solid, therefore the two fluxes can not match at the interface unless solid and fluid have the same vtrans. If this is really the problem, I think it could be solved by dividing bq at the interface by vtrans of the solid only, before dssum is performed. I'll check again my case to be sure that what I observed was not due to an error. Thanks in advance for your help. Regards, Andrea. From nek5000-users at lists.mcs.anl.gov Sun Oct 16 10:21:42 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 16 Oct 2011 10:21:42 -0500 (CDT) Subject: [Nek5000-users] conj. ht. transfer In-Reply-To: References: Message-ID: Dear Andrea, Thank you for your careful investigation of this issue. We'll dig into it in the next 36 hours and hopefully we will get the problem resolved quickly. Best regards, Paul On Sun, 16 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Dear all, > > I was running a simple case with conjugate heat transfer. I managed to > validate it against some analytic data. I checked whether the heat fluxes at > the interface between solid and fluid match or not. For any ratio of solid > conductivity over fluid conductivity they match very well especially if the > grid is very refined at the interface. The problem is that they match only if > vtrans of solid is the same as vtrans of the fluid, else the ratio of the two > fluxes is proportional to the ratio of vtrans, which is wrong. I had a look > into the code even though I am not very expert. I am using cvode. In fcvfun > there is a call to makeq. In makeq, after the call to wlaplacian and add2, > the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, there > is a call to dssum over ydot. I am not sure but I think that dssum mixes the > properties of fluid and solid, therefore the transient equation which is > solved for the temperature at the interface has a flux which is divided by > vtrans of the fluid and another flux which is divided by vtrans of the solid, > therefore the two fluxes can not match at the interface unless solid and > fluid have the same vtrans. If this is really the problem, I think it could > be solved by dividing bq at the interface by vtrans of the solid only, before > dssum is performed. I'll check again my case to be sure that what I observed > was not due to an error. > > Thanks in advance for your help. > > Regards, Andrea. _______________________________________________ > Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Sun Oct 16 10:30:46 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 16 Oct 2011 17:30:46 +0200 Subject: [Nek5000-users] conj. ht. transfer In-Reply-To: References: Message-ID: Hmm, that puts some pressure on me. Investigation starts now :) Cheers, Stefan On 10/16/11, nek5000-users at lists.mcs.anl.gov wrote: > > Dear Andrea, > > Thank you for your careful investigation of this issue. > We'll dig into it in the next 36 hours and hopefully we > will get the problem resolved quickly. 
> > Best regards, > > Paul > > > > > On Sun, 16 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > >> Dear all, >> >> I was running a simple case with conjugate heat transfer. I managed to >> validate it against some analytic data. I checked whether the heat fluxes >> at >> the interface between solid and fluid match or not. For any ratio of solid >> conductivity over fluid conductivity they match very well especially if >> the >> grid is very refined at the interface. The problem is that they match only >> if >> vtrans of solid is the same as vtrans of the fluid, else the ratio of the >> two >> fluxes is proportional to the ratio of vtrans, which is wrong. I had a >> look >> into the code even though I am not very expert. I am using cvode. In >> fcvfun >> there is a call to makeq. In makeq, after the call to wlaplacian and add2, >> the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, >> there >> is a call to dssum over ydot. I am not sure but I think that dssum mixes >> the >> properties of fluid and solid, therefore the transient equation which is >> solved for the temperature at the interface has a flux which is divided by >> vtrans of the fluid and another flux which is divided by vtrans of the >> solid, >> therefore the two fluxes can not match at the interface unless solid and >> fluid have the same vtrans. If this is really the problem, I think it >> could >> be solved by dividing bq at the interface by vtrans of the solid only, >> before >> dssum is performed. I'll check again my case to be sure that what I >> observed >> was not due to an error. >> >> Thanks in advance for your help. >> >> Regards, Andrea. _______________________________________________ >> Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Sun Oct 16 10:50:09 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 16 Oct 2011 17:50:09 +0200 Subject: [Nek5000-users] conj. ht. transfer In-Reply-To: References: Message-ID: Andrea, Let's start with an important note: The implement SEM is locally not conservative. Thus, the fluxes do not need to match. However, it would be suspicious if the flux conservation at the interface is proportional to something! I'll have a look at it. Stefan On 10/16/11, S K wrote: > Hmm, that puts some pressure on me. Investigation starts now :) > Cheers, > Stefan > > On 10/16/11, nek5000-users at lists.mcs.anl.gov > wrote: >> >> Dear Andrea, >> >> Thank you for your careful investigation of this issue. >> We'll dig into it in the next 36 hours and hopefully we >> will get the problem resolved quickly. >> >> Best regards, >> >> Paul >> >> >> >> >> On Sun, 16 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >> >>> Dear all, >>> >>> I was running a simple case with conjugate heat transfer. I managed to >>> validate it against some analytic data. I checked whether the heat >>> fluxes >>> at >>> the interface between solid and fluid match or not. For any ratio of >>> solid >>> conductivity over fluid conductivity they match very well especially if >>> the >>> grid is very refined at the interface. 
The problem is that they match >>> only >>> if >>> vtrans of solid is the same as vtrans of the fluid, else the ratio of >>> the >>> two >>> fluxes is proportional to the ratio of vtrans, which is wrong. I had a >>> look >>> into the code even though I am not very expert. I am using cvode. In >>> fcvfun >>> there is a call to makeq. In makeq, after the call to wlaplacian and >>> add2, >>> the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, >>> there >>> is a call to dssum over ydot. I am not sure but I think that dssum mixes >>> the >>> properties of fluid and solid, therefore the transient equation which is >>> solved for the temperature at the interface has a flux which is divided >>> by >>> vtrans of the fluid and another flux which is divided by vtrans of the >>> solid, >>> therefore the two fluxes can not match at the interface unless solid and >>> fluid have the same vtrans. If this is really the problem, I think it >>> could >>> be solved by dividing bq at the interface by vtrans of the solid only, >>> before >>> dssum is performed. I'll check again my case to be sure that what I >>> observed >>> was not due to an error. >>> >>> Thanks in advance for your help. >>> >>> Regards, Andrea. _______________________________________________ >>> Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > From nek5000-users at lists.mcs.anl.gov Sun Oct 16 11:01:10 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Sun, 16 Oct 2011 18:01:10 +0200 Subject: [Nek5000-users] conj. ht. transfer In-Reply-To: References: Message-ID: Can you post the code you have used to validate the interface condition? Stefan On 10/16/11, S K wrote: > Andrea, > > Let's start with an important note: The implement SEM is locally not > conservative. Thus, the fluxes do not need to match. However, it would > be suspicious if the flux conservation at the interface is > proportional to something! > > I'll have a look at it. > Stefan > > On 10/16/11, S K wrote: >> Hmm, that puts some pressure on me. Investigation starts now :) >> Cheers, >> Stefan >> >> On 10/16/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> >>> Dear Andrea, >>> >>> Thank you for your careful investigation of this issue. >>> We'll dig into it in the next 36 hours and hopefully we >>> will get the problem resolved quickly. >>> >>> Best regards, >>> >>> Paul >>> >>> >>> >>> >>> On Sun, 16 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> Dear all, >>>> >>>> I was running a simple case with conjugate heat transfer. I managed to >>>> validate it against some analytic data. I checked whether the heat >>>> fluxes >>>> at >>>> the interface between solid and fluid match or not. For any ratio of >>>> solid >>>> conductivity over fluid conductivity they match very well especially if >>>> the >>>> grid is very refined at the interface. The problem is that they match >>>> only >>>> if >>>> vtrans of solid is the same as vtrans of the fluid, else the ratio of >>>> the >>>> two >>>> fluxes is proportional to the ratio of vtrans, which is wrong. I had a >>>> look >>>> into the code even though I am not very expert. I am using cvode. In >>>> fcvfun >>>> there is a call to makeq. 
In makeq, after the call to wlaplacian and >>>> add2, >>>> the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, >>>> there >>>> is a call to dssum over ydot. I am not sure but I think that dssum >>>> mixes >>>> the >>>> properties of fluid and solid, therefore the transient equation which >>>> is >>>> solved for the temperature at the interface has a flux which is divided >>>> by >>>> vtrans of the fluid and another flux which is divided by vtrans of the >>>> solid, >>>> therefore the two fluxes can not match at the interface unless solid >>>> and >>>> fluid have the same vtrans. If this is really the problem, I think it >>>> could >>>> be solved by dividing bq at the interface by vtrans of the solid only, >>>> before >>>> dssum is performed. I'll check again my case to be sure that what I >>>> observed >>>> was not due to an error. >>>> >>>> Thanks in advance for your help. >>>> >>>> Regards, Andrea. _______________________________________________ >>>> Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov >>>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> > From nek5000-users at lists.mcs.anl.gov Mon Oct 17 08:05:18 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Mon, 17 Oct 2011 13:05:18 +0000 Subject: [Nek5000-users] conj. ht. transfer In-Reply-To: References: , Message-ID: Dear Paul and Stefan, thanks a lot for your help. Sorry for the late reply. I checked many times and I have always obtained the same result. The heat fluxes match very well for equal vtrans but they don't for different vtrans. I think that from a physical point of view the fluxes have to match always. Stefan, I'll send you the case with an e-mail. Thanks again. Regards, Andrea. ________________________________________ From: nek5000-users-bounces at lists.mcs.anl.gov [nek5000-users-bounces at lists.mcs.anl.gov] on behalf of nek5000-users at lists.mcs.anl.gov [nek5000-users at lists.mcs.anl.gov] Sent: Sunday, October 16, 2011 6:01 PM To: nek5000-users at lists.mcs.anl.gov Subject: Re: [Nek5000-users] conj. ht. transfer Can you post the code you have used to validate the interface condition? Stefan On 10/16/11, S K wrote: > Andrea, > > Let's start with an important note: The implement SEM is locally not > conservative. Thus, the fluxes do not need to match. However, it would > be suspicious if the flux conservation at the interface is > proportional to something! > > I'll have a look at it. > Stefan > > On 10/16/11, S K wrote: >> Hmm, that puts some pressure on me. Investigation starts now :) >> Cheers, >> Stefan >> >> On 10/16/11, nek5000-users at lists.mcs.anl.gov >> wrote: >>> >>> Dear Andrea, >>> >>> Thank you for your careful investigation of this issue. >>> We'll dig into it in the next 36 hours and hopefully we >>> will get the problem resolved quickly. >>> >>> Best regards, >>> >>> Paul >>> >>> >>> >>> >>> On Sun, 16 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: >>> >>>> Dear all, >>>> >>>> I was running a simple case with conjugate heat transfer. I managed to >>>> validate it against some analytic data. I checked whether the heat >>>> fluxes >>>> at >>>> the interface between solid and fluid match or not. 
For any ratio of >>>> solid >>>> conductivity over fluid conductivity they match very well especially if >>>> the >>>> grid is very refined at the interface. The problem is that they match >>>> only >>>> if >>>> vtrans of solid is the same as vtrans of the fluid, else the ratio of >>>> the >>>> two >>>> fluxes is proportional to the ratio of vtrans, which is wrong. I had a >>>> look >>>> into the code even though I am not very expert. I am using cvode. In >>>> fcvfun >>>> there is a call to makeq. In makeq, after the call to wlaplacian and >>>> add2, >>>> the rhs, which is bq, is divided by vtrans. When we are back to fcvfun, >>>> there >>>> is a call to dssum over ydot. I am not sure but I think that dssum >>>> mixes >>>> the >>>> properties of fluid and solid, therefore the transient equation which >>>> is >>>> solved for the temperature at the interface has a flux which is divided >>>> by >>>> vtrans of the fluid and another flux which is divided by vtrans of the >>>> solid, >>>> therefore the two fluxes can not match at the interface unless solid >>>> and >>>> fluid have the same vtrans. If this is really the problem, I think it >>>> could >>>> be solved by dividing bq at the interface by vtrans of the solid only, >>>> before >>>> dssum is performed. I'll check again my case to be sure that what I >>>> observed >>>> was not due to an error. >>>> >>>> Thanks in advance for your help. >>>> >>>> Regards, Andrea. _______________________________________________ >>>> Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov >>>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>>> >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> > _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 18 11:05:33 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 18:05:33 +0200 Subject: [Nek5000-users] Running Simulations in Parallel with nekbmpi In-Reply-To: References: Message-ID: Hi, Sorry for this very late response, it completely got out of my mind. 
Here are the different outputs I get for these commands: jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a core file size (blocks, -c) 0 data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 31533 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 1024 pipe size (512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 8192 cpu time (seconds, -t) unlimited max user processes (-u) 31533 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 text data bss dec hex filename 2232834 3484 185835044 188071362 b35bdc2 nek5000 jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE C Dimension file to be included C C HCUBE array dimensions C parameter (ldim=2) parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) parameter (lxd=17,lyd=lxd,lzd=1) parameter (lelx=20,lely=60,lelz=1) c parameter (lzl=3 + 2*(ldim-3)) c parameter (lx2=lx1-2) parameter (ly2=ly1-2) parameter (lz2=1) parameter (lx3=lx2) parameter (ly3=ly2) parameter (lz3=lz2) c c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) c parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes parameter (lpx2=1,lpy2=1,lpz2=1) c c c parameter (lbelv=lelv,lbelt=lelt) ! MHD c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) c parameter (lbelv=1,lbelt=1) ! MHD parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes parameter (lbx2=1,lby2=1,lbz2=1) c C LX1M=LX1 when there are moving meshes; =1 otherwise parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) parameter (ldimt= 4) ! 3 passive scalars + T parameter (ldimt1=ldimt+1) parameter (ldimt3=ldimt+3) parameter (lp = 8) parameter (lelg = 300) c c Note: In the new code, LELGEC should be about sqrt(LELG) c PARAMETER (LELGEC = 1) PARAMETER (LXYZ2 = 1) PARAMETER (LXZ21 = 1) c PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) PARAMETER (LXZ=LX1*LZ1) PARAMETER (LORDER=3) PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) C C Common Block Dimensions C PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) C C The parameter LVEC controls whether an additional 42 field arrays C are required for Steady State Solutions. If you are not using C Steady State, it is recommended that LVEC=1. C PARAMETER (LVEC=1) C C Uzawa projection array dimensions C parameter (mxprev = 80) parameter (lgmres = 40) C C Split projection array dimensions C parameter(lmvec = 1) parameter(lsvec = 1) parameter(lstore=lmvec*lsvec) c c NONCONFORMING STUFF c parameter (maxmor = lelt) C C Array dimensions C COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID $,NXD,NYD,NZD c automatically added by makenek parameter(lxo = lx1) ! max output grid size (lxo>=lx1) c automatically added by makenek parameter(lpart = 1 ) ! max number of particles c automatically added by makenek integer ax1,ay1,az1,ax2,ay2,az2 parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! running averages c automatically added by makenek parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure Preconditioner c automatically added by makenek parameter (lfdm=0) ! 
== 1 for fast diagonalization method Concerning Nek, yes I do get an output on the screen when lauching the calculation. Attached is the logfile for one example (2D cavity, only with 100 elements though but the problem is the same as I increase the number of elements). Cheers, On 18 October 2011 18:03, Jean-Christophe Loiseau wrote: > Hi, > > Sorry for this very late response, it completely got out of my mind. Here > are the different outputs I get for these commands: > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > core file size (blocks, -c) 0 > data seg size (kbytes, -d) unlimited > scheduling priority (-e) 0 > file size (blocks, -f) unlimited > pending signals (-i) 31533 > max locked memory (kbytes, -l) 64 > max memory size (kbytes, -m) unlimited > open files (-n) 1024 > pipe size (512 bytes, -p) 8 > POSIX message queues (bytes, -q) 819200 > real-time priority (-r) 0 > stack size (kbytes, -s) 8192 > cpu time (seconds, -t) unlimited > max user processes (-u) 31533 > virtual memory (kbytes, -v) unlimited > file locks (-x) unlimited > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > text data bss dec hex filename > 2232834 3484 185835044 188071362 b35bdc2 nek5000 > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > C Dimension file to be included > C > C HCUBE array dimensions > C > parameter (ldim=2) > parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > parameter (lxd=17,lyd=lxd,lzd=1) > parameter (lelx=20,lely=60,lelz=1) > c > parameter (lzl=3 + 2*(ldim-3)) > c > parameter (lx2=lx1-2) > parameter (ly2=ly1-2) > parameter (lz2=1) > parameter (lx3=lx2) > parameter (ly3=ly2) > parameter (lz3=lz2) > c > c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > c > parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes > parameter (lpx2=1,lpy2=1,lpz2=1) > c > c > c parameter (lbelv=lelv,lbelt=lelt) ! MHD > c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > c > parameter (lbelv=1,lbelt=1) ! MHD > parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > parameter (lbx2=1,lby2=1,lbz2=1) > c > C LX1M=LX1 when there are moving meshes; =1 otherwise > parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > parameter (ldimt= 4) ! 3 passive scalars + T > parameter (ldimt1=ldimt+1) > parameter (ldimt3=ldimt+3) > parameter (lp = 8) > parameter (lelg = 300) > c > c Note: In the new code, LELGEC should be about sqrt(LELG) > c > PARAMETER (LELGEC = 1) > PARAMETER (LXYZ2 = 1) > PARAMETER (LXZ21 = 1) > c > PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > PARAMETER (LXZ=LX1*LZ1) > PARAMETER (LORDER=3) > PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > C > C Common Block Dimensions > C > PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > C > C The parameter LVEC controls whether an additional 42 field arrays > C are required for Steady State Solutions. If you are not using > C Steady State, it is recommended that LVEC=1. 
> C > PARAMETER (LVEC=1) > C > C Uzawa projection array dimensions > C > parameter (mxprev = 80) > parameter (lgmres = 40) > C > C Split projection array dimensions > C > parameter(lmvec = 1) > parameter(lsvec = 1) > parameter(lstore=lmvec*lsvec) > c > c NONCONFORMING STUFF > c > parameter (maxmor = lelt) > C > C Array dimensions > C > COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > $,NXD,NYD,NZD > > c automatically added by makenek > parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > > c automatically added by makenek > parameter(lpart = 1 ) ! max number of particles > > c automatically added by makenek > integer ax1,ay1,az1,ax2,ay2,az2 > parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! running > averages > > c automatically added by makenek > parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > Preconditioner > > c automatically added by makenek > parameter (lfdm=0) ! == 1 for fast diagonalization method > > Concerning Nek, yes I do get an output on the screen when lauching the > calculation. Attached is the logfile for one example (2D cavity, only with > 100 elements though but the problem is the same as I increase the number of > elements). > > Cheers, > > > On 27 September 2011 17:40, wrote: > >> Hi, >> >> Do you get any output printed to the screen when you launch Nek? >> Please provide the output of the following command: >> ulimit -a (in case you are using bash) >> size nek5000 >> cat SIZE >> >> Can you post the command you are using to launch Nek in parallel? >> >> Cheers, >> Stefan >> >> >> >> On 9/27/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Hi Neks, >> > >> > I do get the same error when I try to run nek on my laptop. Did you find >> any >> > way to solve the problem John? >> > >> > Regards, >> > >> > On 11 May 2011 21:25, wrote: >> > >> >> Hi, >> >> >> >> I have lp=512 in SIZE, and I did compile with IFMPI=true. I recompiled >> >> yesterday because I thought this may be the case, but I was still >> getting >> >> the same error. When I first start a simulation and run top, it shows >> >> four >> >> processors running nek5000, but this changes to one processor after a >> few >> >> seconds. >> >> >> >> Thanks, >> >> >> >> John >> >> >> >> _______________________________________________ >> >> Nek5000-users mailing list >> >> Nek5000-users at lists.mcs.anl.gov >> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >> >> >> >> > >> > >> > -- >> > Jean-Christophe >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > > > -- > Jean-Christophe > -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 18 11:09:33 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 18:09:33 +0200 Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. 
Message-ID: Hi Nek's, I have been trying to use the perturbation mode this morning, and I get the following error: /----------------------------------------------------------\\ | _ __ ______ __ __ ______ ____ ____ ____ | | / | / // ____// //_/ / ____/ / __ \\ / __ \\ / __ \\ | | / |/ // __/ / ,< /___ \\ / / / // / / // / / / | | / /| // /___ / /| | ____/ / / /_/ // /_/ // /_/ / | | /_/ |_//_____//_/ |_|/_____/ \\____/ \\____/ \\____/ | | | |----------------------------------------------------------| | | | NEK5000: Open Source Spectral Element Solver | | COPYRIGHT (c) 2008-2010 UCHICAGO ARGONNE, LLC | | Version: 1.0rc1 / SVN r714 | | Web: http://nek5000.mcs.anl.gov | | | \\----------------------------------------------------------/ Number of processors: 1 REAL wdsize : 8 INTEGER wdsize : 4 Beginning session: /home/jean-christophe/Calculs/Cavite2D/Stability/cav.rea timer accuracy: 0.0000000E+00 sec read .rea file nelgt/nelgv/lelt: 100 100 100 lx1 /lx2 /lx3 : 11 9 9 mapping elements to processors 0 100 100 100 100 NELV RANK 0 IEG 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 element load imbalance: 0 100 100 done :: mapping elements to processors ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. call exitt: dying ... backtrace(): obtained 3 stack frames. ./nek5000() [0x821eb0d] ./nek5000() [0x833f6c6] [0xc] total elapsed time : 0.00000E+00 sec total solver time incl. I/O : 0.00000E+00 sec time/timestep : 0.00000E+00 sec CPU seconds/timestep/gridpt : 0.00000E+00 sec Not sure though where it comes from since it's been the very first time I encounter it. My nek version is 714 (the last one I guess). Any hint? Regards, -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 18 11:14:40 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 11:14:40 -0500 (CDT) Subject: [Nek5000-users] =?utf-8?q?ERROR_READING_INITIAL_CONDITION/DRIVE_F?= =?utf-8?q?ORCE_DATA=09ABORTING_IN_ROUTINE_RDICDF=2E?= In-Reply-To: References: Message-ID: Jean-Christophe, Can you send your .rea / .usr / and SIZE file to me off-list? I'll take a look. 
Paul On Tue, 18 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi Nek's, > > I have been trying to use the perturbation mode this morning, and I get the > following error: > > /----------------------------------------------------------\\ > | _ __ ______ __ __ ______ ____ ____ ____ | > | / | / // ____// //_/ / ____/ / __ \\ / __ \\ / __ \\ | > | / |/ // __/ / ,< /___ \\ / / / // / / // / / / | > | / /| // /___ / /| | ____/ / / /_/ // /_/ // /_/ / | > | /_/ |_//_____//_/ |_|/_____/ \\____/ \\____/ \\____/ | > | | > |----------------------------------------------------------| > | | > | NEK5000: Open Source Spectral Element Solver | > | COPYRIGHT (c) 2008-2010 UCHICAGO ARGONNE, LLC | > | Version: 1.0rc1 / SVN r714 | > | Web: http://nek5000.mcs.anl.gov | > | | > \\----------------------------------------------------------/ > > > Number of processors: 1 > REAL wdsize : 8 > INTEGER wdsize : 4 > > > Beginning session: > > /home/jean-christophe/Calculs/Cavite2D/Stability/cav.rea > > > > timer accuracy: 0.0000000E+00 sec > > read .rea file > nelgt/nelgv/lelt: 100 100 100 > lx1 /lx2 /lx3 : 11 9 9 > > mapping elements to processors > 0 100 100 100 100 NELV > RANK 0 IEG 1 2 3 4 5 6 > 7 8 > 9 10 11 12 13 14 15 > 16 > 17 18 19 20 21 22 23 > 24 > 25 26 27 28 29 30 31 > 32 > 33 34 35 36 37 38 39 > 40 > 41 42 43 44 45 46 47 > 48 > 49 50 51 52 53 54 55 > 56 > 57 58 59 60 61 62 63 > 64 > 65 66 67 68 69 70 71 > 72 > 73 74 75 76 77 78 79 > 80 > 81 82 83 84 85 86 87 > 88 > 89 90 91 92 93 94 95 > 96 > 97 98 99 100 > element load imbalance: 0 100 100 > done :: mapping elements to processors > > ERROR READING INITIAL CONDITION/DRIVE FORCE DATA > ABORTING IN ROUTINE RDICDF. > > call exitt: dying ... > > backtrace(): obtained 3 stack frames. > ./nek5000() [0x821eb0d] > ./nek5000() [0x833f6c6] > [0xc] > > total elapsed time : 0.00000E+00 sec > total solver time incl. I/O : 0.00000E+00 sec > time/timestep : 0.00000E+00 sec > CPU seconds/timestep/gridpt : 0.00000E+00 sec > > Not sure though where it comes from since it's been the very first time I > encounter it. My nek version is 714 (the last one I guess). > Any hint? > > Regards, > > -- > Jean-Christophe > From nek5000-users at lists.mcs.anl.gov Tue Oct 18 11:17:36 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 11:17:36 -0500 (CDT) Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. In-Reply-To: Message-ID: <1162758646.106023.1318954656125.JavaMail.root@zimbra.anl.gov> Hi Jean-Christophe, Usually this error means that something wrong with the format in .rea file, e.g., certain lines are missing, like BCs, Restart, etc. The most common error with the same message I make is setting 0 PRESOLVE/RESTART OPTIONS ***** and giving a filename for the restart. Best, Aleks ----- Original Message ----- From: nek5000-users at lists.mcs.anl.gov To: "Nek 5000" Sent: Tuesday, October 18, 2011 11:09:33 AM Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. 
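To make Aleks's point concrete: the restart block he is describing is a count followed by that many restart lines, so a consistent version of it looks roughly like the following (file name purely illustrative, column layout approximate):

 1 PRESOLVE/RESTART OPTIONS  *****
 mycase.fld

The error above is what you typically get when the count is left at 0 while a restart file name still follows it.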
Hi Nek's, I have been trying to use the perturbation mode this morning, and I get the following error: /----------------------------------------------------------\\ | _ __ ______ __ __ ______ ____ ____ ____ | | / | / // ____// //_/ / ____/ / __ \\ / __ \\ / __ \\ | | / |/ // __/ / ,< /___ \\ / / / // / / // / / / | | / /| // /___ / /| | ____/ / / /_/ // /_/ // /_/ / | | /_/ |_//_____//_/ |_|/_____/ \\____/ \\____/ \\____/ | | | |----------------------------------------------------------| | | | NEK5000: Open Source Spectral Element Solver | | COPYRIGHT (c) 2008-2010 UCHICAGO ARGONNE, LLC | | Version: 1.0rc1 / SVN r714 | | Web: http://nek5000.mcs.anl.gov | | | \\----------------------------------------------------------/ Number of processors: 1 REAL wdsize : 8 INTEGER wdsize : 4 Beginning session: /home/jean-christophe/Calculs/Cavite2D/Stability/cav.rea timer accuracy: 0.0000000E+00 sec read .rea file nelgt/nelgv/lelt: 100 100 100 lx1 /lx2 /lx3 : 11 9 9 mapping elements to processors 0 100 100 100 100 NELV RANK 0 IEG 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 element load imbalance: 0 100 100 done :: mapping elements to processors ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. call exitt: dying ... backtrace(): obtained 3 stack frames. ./nek5000() [0x821eb0d] ./nek5000() [0x833f6c6] [0xc] total elapsed time : 0.00000E+00 sec total solver time incl. I/O : 0.00000E+00 sec time/timestep : 0.00000E+00 sec CPU seconds/timestep/gridpt : 0.00000E+00 sec Not sure though where it comes from since it's been the very first time I encounter it. My nek version is 714 (the last one I guess). Any hint? Regards, -- Jean-Christophe _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 18 11:19:44 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 18:19:44 +0200 Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. In-Reply-To: <1162758646.106023.1318954656125.JavaMail.root@zimbra.anl.gov> References: <1162758646.106023.1318954656125.JavaMail.root@zimbra.anl.gov> Message-ID: Oops... I really do feel like an idiot right now :) Thanks a lot On 18 October 2011 18:17, wrote: > Hi Jean-Christophe, > > Usually this error means that something wrong with the format in .rea file, > e.g., certain lines are missing, like BCs, Restart, etc. > > The most common error with the same message I make is setting > > 0 PRESOLVE/RESTART OPTIONS ***** > > and giving a filename for the restart. > > Best, > Aleks > > > > > > ----- Original Message ----- > From: nek5000-users at lists.mcs.anl.gov > To: "Nek 5000" > Sent: Tuesday, October 18, 2011 11:09:33 AM > Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA > ABORTING IN ROUTINE RDICDF. 
> > > Hi Nek's, > > I have been trying to use the perturbation mode this morning, and I get the > following error: > > > /----------------------------------------------------------\\ > | _ __ ______ __ __ ______ ____ ____ ____ | > | / | / // ____// //_/ / ____/ / __ \\ / __ \\ / __ \\ | > | / |/ // __/ / ,< /___ \\ / / / // / / // / / / | > | / /| // /___ / /| | ____/ / / /_/ // /_/ // /_/ / | > | /_/ |_//_____//_/ |_|/_____/ \\____/ \\____/ \\____/ | > | | > |----------------------------------------------------------| > | | > | NEK5000: Open Source Spectral Element Solver | > | COPYRIGHT (c) 2008-2010 UCHICAGO ARGONNE, LLC | > | Version: 1.0rc1 / SVN r714 | > | Web: http://nek5000.mcs.anl.gov | > | | > \\----------------------------------------------------------/ > > > Number of processors: 1 > REAL wdsize : 8 > INTEGER wdsize : 4 > > > Beginning session: > /home/jean-christophe/Calculs/Cavite2D/Stability/cav.rea > > > timer accuracy: 0.0000000E+00 sec > > read .rea file > nelgt/nelgv/lelt: 100 100 100 > lx1 /lx2 /lx3 : 11 9 9 > > mapping elements to processors > 0 100 100 100 100 NELV > RANK 0 IEG 1 2 3 4 5 6 7 8 > 9 10 11 12 13 14 15 16 > 17 18 19 20 21 22 23 24 > 25 26 27 28 29 30 31 32 > 33 34 35 36 37 38 39 40 > 41 42 43 44 45 46 47 48 > 49 50 51 52 53 54 55 56 > 57 58 59 60 61 62 63 64 > 65 66 67 68 69 70 71 72 > 73 74 75 76 77 78 79 80 > 81 82 83 84 85 86 87 88 > 89 90 91 92 93 94 95 96 > 97 98 99 100 > element load imbalance: 0 100 100 > done :: mapping elements to processors > > ERROR READING INITIAL CONDITION/DRIVE FORCE DATA > ABORTING IN ROUTINE RDICDF. > > call exitt: dying ... > > backtrace(): obtained 3 stack frames. > ./nek5000() [0x821eb0d] > ./nek5000() [0x833f6c6] > [0xc] > > total elapsed time : 0.00000E+00 sec > total solver time incl. I/O : 0.00000E+00 sec > time/timestep : 0.00000E+00 sec > CPU seconds/timestep/gridpt : 0.00000E+00 sec > > Not sure though where it comes from since it's been the very first time I > encounter it. My nek version is 714 (the last one I guess). > Any hint? > > Regards, > > -- > Jean-Christophe > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 18 12:13:14 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 19:13:14 +0200 Subject: [Nek5000-users] Running Simulations in Parallel with nekbmpi In-Reply-To: References: Message-ID: Please post your makenek. -Stefan On 10/18/11, nek5000-users at lists.mcs.anl.gov wrote: > Hi, > > Sorry for this very late response, it completely got out of my mind. 
Here > are the different outputs I get for these commands: > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > core file size (blocks, -c) 0 > data seg size (kbytes, -d) unlimited > scheduling priority (-e) 0 > file size (blocks, -f) unlimited > pending signals (-i) 31533 > max locked memory (kbytes, -l) 64 > max memory size (kbytes, -m) unlimited > open files (-n) 1024 > pipe size (512 bytes, -p) 8 > POSIX message queues (bytes, -q) 819200 > real-time priority (-r) 0 > stack size (kbytes, -s) 8192 > cpu time (seconds, -t) unlimited > max user processes (-u) 31533 > virtual memory (kbytes, -v) unlimited > file locks (-x) unlimited > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > text data bss dec hex filename > 2232834 3484 185835044 188071362 b35bdc2 nek5000 > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > C Dimension file to be included > C > C HCUBE array dimensions > C > parameter (ldim=2) > parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > parameter (lxd=17,lyd=lxd,lzd=1) > parameter (lelx=20,lely=60,lelz=1) > c > parameter (lzl=3 + 2*(ldim-3)) > c > parameter (lx2=lx1-2) > parameter (ly2=ly1-2) > parameter (lz2=1) > parameter (lx3=lx2) > parameter (ly3=ly2) > parameter (lz3=lz2) > c > c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > c > parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes > parameter (lpx2=1,lpy2=1,lpz2=1) > c > c > c parameter (lbelv=lelv,lbelt=lelt) ! MHD > c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > c > parameter (lbelv=1,lbelt=1) ! MHD > parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > parameter (lbx2=1,lby2=1,lbz2=1) > c > C LX1M=LX1 when there are moving meshes; =1 otherwise > parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > parameter (ldimt= 4) ! 3 passive scalars + T > parameter (ldimt1=ldimt+1) > parameter (ldimt3=ldimt+3) > parameter (lp = 8) > parameter (lelg = 300) > c > c Note: In the new code, LELGEC should be about sqrt(LELG) > c > PARAMETER (LELGEC = 1) > PARAMETER (LXYZ2 = 1) > PARAMETER (LXZ21 = 1) > c > PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > PARAMETER (LXZ=LX1*LZ1) > PARAMETER (LORDER=3) > PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > C > C Common Block Dimensions > C > PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > C > C The parameter LVEC controls whether an additional 42 field arrays > C are required for Steady State Solutions. If you are not using > C Steady State, it is recommended that LVEC=1. > C > PARAMETER (LVEC=1) > C > C Uzawa projection array dimensions > C > parameter (mxprev = 80) > parameter (lgmres = 40) > C > C Split projection array dimensions > C > parameter(lmvec = 1) > parameter(lsvec = 1) > parameter(lstore=lmvec*lsvec) > c > c NONCONFORMING STUFF > c > parameter (maxmor = lelt) > C > C Array dimensions > C > COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > $,NXD,NYD,NZD > > c automatically added by makenek > parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > > c automatically added by makenek > parameter(lpart = 1 ) ! max number of particles > > c automatically added by makenek > integer ax1,ay1,az1,ax2,ay2,az2 > parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! 
running > averages > > c automatically added by makenek > parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > Preconditioner > > c automatically added by makenek > parameter (lfdm=0) ! == 1 for fast diagonalization method > > Concerning Nek, yes I do get an output on the screen when lauching the > calculation. Attached is the logfile for one example (2D cavity, only with > 100 elements though but the problem is the same as I increase the number of > elements). > > Cheers, > > On 18 October 2011 18:03, Jean-Christophe Loiseau > wrote: > >> Hi, >> >> Sorry for this very late response, it completely got out of my mind. Here >> are the different outputs I get for these commands: >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a >> core file size (blocks, -c) 0 >> data seg size (kbytes, -d) unlimited >> scheduling priority (-e) 0 >> file size (blocks, -f) unlimited >> pending signals (-i) 31533 >> max locked memory (kbytes, -l) 64 >> max memory size (kbytes, -m) unlimited >> open files (-n) 1024 >> pipe size (512 bytes, -p) 8 >> POSIX message queues (bytes, -q) 819200 >> real-time priority (-r) 0 >> stack size (kbytes, -s) 8192 >> cpu time (seconds, -t) unlimited >> max user processes (-u) 31533 >> virtual memory (kbytes, -v) unlimited >> file locks (-x) unlimited >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 >> text data bss dec hex filename >> 2232834 3484 185835044 188071362 b35bdc2 nek5000 >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE >> C Dimension file to be included >> C >> C HCUBE array dimensions >> C >> parameter (ldim=2) >> parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) >> parameter (lxd=17,lyd=lxd,lzd=1) >> parameter (lelx=20,lely=60,lelz=1) >> c >> parameter (lzl=3 + 2*(ldim-3)) >> c >> parameter (lx2=lx1-2) >> parameter (ly2=ly1-2) >> parameter (lz2=1) >> parameter (lx3=lx2) >> parameter (ly3=ly2) >> parameter (lz3=lz2) >> c >> c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation >> c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes >> c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) >> c >> parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation >> parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes >> parameter (lpx2=1,lpy2=1,lpz2=1) >> c >> c >> c parameter (lbelv=lelv,lbelt=lelt) ! MHD >> c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes >> c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) >> c >> parameter (lbelv=1,lbelt=1) ! MHD >> parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes >> parameter (lbx2=1,lby2=1,lbz2=1) >> c >> C LX1M=LX1 when there are moving meshes; =1 otherwise >> parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) >> parameter (ldimt= 4) ! 3 passive scalars + T >> parameter (ldimt1=ldimt+1) >> parameter (ldimt3=ldimt+3) >> parameter (lp = 8) >> parameter (lelg = 300) >> c >> c Note: In the new code, LELGEC should be about sqrt(LELG) >> c >> PARAMETER (LELGEC = 1) >> PARAMETER (LXYZ2 = 1) >> PARAMETER (LXZ21 = 1) >> c >> PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) >> PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) >> PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) >> PARAMETER (LXZ=LX1*LZ1) >> PARAMETER (LORDER=3) >> PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) >> C >> C Common Block Dimensions >> C >> PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) >> PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) >> C >> C The parameter LVEC controls whether an additional 42 field arrays >> C are required for Steady State Solutions. If you are not using >> C Steady State, it is recommended that LVEC=1. 
>> C >> PARAMETER (LVEC=1) >> C >> C Uzawa projection array dimensions >> C >> parameter (mxprev = 80) >> parameter (lgmres = 40) >> C >> C Split projection array dimensions >> C >> parameter(lmvec = 1) >> parameter(lsvec = 1) >> parameter(lstore=lmvec*lsvec) >> c >> c NONCONFORMING STUFF >> c >> parameter (maxmor = lelt) >> C >> C Array dimensions >> C >> COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 >> $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID >> $,NXD,NYD,NZD >> >> c automatically added by makenek >> parameter(lxo = lx1) ! max output grid size (lxo>=lx1) >> >> c automatically added by makenek >> parameter(lpart = 1 ) ! max number of particles >> >> c automatically added by makenek >> integer ax1,ay1,az1,ax2,ay2,az2 >> parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! >> running >> averages >> >> c automatically added by makenek >> parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure >> Preconditioner >> >> c automatically added by makenek >> parameter (lfdm=0) ! == 1 for fast diagonalization method >> >> Concerning Nek, yes I do get an output on the screen when lauching the >> calculation. Attached is the logfile for one example (2D cavity, only with >> 100 elements though but the problem is the same as I increase the number >> of >> elements). >> >> Cheers, >> >> >> On 27 September 2011 17:40, wrote: >> >>> Hi, >>> >>> Do you get any output printed to the screen when you launch Nek? >>> Please provide the output of the following command: >>> ulimit -a (in case you are using bash) >>> size nek5000 >>> cat SIZE >>> >>> Can you post the command you are using to launch Nek in parallel? >>> >>> Cheers, >>> Stefan >>> >>> >>> >>> On 9/27/11, nek5000-users at lists.mcs.anl.gov >>> wrote: >>> > Hi Neks, >>> > >>> > I do get the same error when I try to run nek on my laptop. Did you >>> > find >>> any >>> > way to solve the problem John? >>> > >>> > Regards, >>> > >>> > On 11 May 2011 21:25, wrote: >>> > >>> >> Hi, >>> >> >>> >> I have lp=512 in SIZE, and I did compile with IFMPI=true. I >>> >> recompiled >>> >> yesterday because I thought this may be the case, but I was still >>> getting >>> >> the same error. When I first start a simulation and run top, it shows >>> >> four >>> >> processors running nek5000, but this changes to one processor after a >>> few >>> >> seconds. >>> >> >>> >> Thanks, >>> >> >>> >> John >>> >> >>> >> _______________________________________________ >>> >> Nek5000-users mailing list >>> >> Nek5000-users at lists.mcs.anl.gov >>> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> >>> >> >>> > >>> > >>> > -- >>> > Jean-Christophe >>> > >>> _______________________________________________ >>> Nek5000-users mailing list >>> Nek5000-users at lists.mcs.anl.gov >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >>> >> >> >> >> -- >> Jean-Christophe >> > > > > -- > Jean-Christophe > From nek5000-users at lists.mcs.anl.gov Tue Oct 18 12:17:48 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 19:17:48 +0200 Subject: [Nek5000-users] Running Simulations in Parallel with nekbmpi In-Reply-To: References: Message-ID: Here it is: #!/bin/bash # Nek5000 build config file # (c) 2008,2009,2010 UCHICAGO ARGONNE, LLC # source path SOURCE_ROOT="$HOME/Codes/nek5_svn/trunk/nek" # Fortran compiler F77="mpif77" # C compiler CC="mpicc" # pre-processor symbol list # (set PPLIST=? to get a list of available symbols) #PPLIST="?" 
# plug-in list PLUGIN_LIST="" # OPTIONAL SETTINGS # ----------------- # enable MPI (default true) #IFMPI="false" # auxilliary files to compile # NOTE: source files have to located in the same directory as makenek # a makefile_usr.inc has to be provided containing the build rules #USR="foo.o" # linking flags #USR_LFLAGS="-L/usr/lib -lfoo" # generic compiler flags #G="-g" # optimization flags #OPT_FLAGS_STD="" #OPT_FLAGS_MAG="" # enable AMG coarse grid solver (default XXT) #IFAMG="true" #IFAMG_DUMP="true" # CVODE path #CVODE_DIR=$HOME/cvode/lib # MOAB/iMESH path #MOAB_DIR="$HOME/moab" ############################################################################### # DONT'T TOUCH WHAT FOLLOWS !!! ############################################################################### # assign version tag mver=1 # overwrite source path with optional 2nd argument if [ -d $2 ] && [ $# -eq 2 ]; then SOURCE_ROOT="$2" echo "change source code directory to: ", $SOURCE_ROOT fi # do some checks and create makefile source $SOURCE_ROOT/makenek.inc # compile make -j4 -f makefile 2>&1 | tee compiler.out exit 0 On 18 October 2011 19:13, wrote: > Please post your makenek. > -Stefan > > On 10/18/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Hi, > > > > Sorry for this very late response, it completely got out of my mind. Here > > are the different outputs I get for these commands: > > > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > > core file size (blocks, -c) 0 > > data seg size (kbytes, -d) unlimited > > scheduling priority (-e) 0 > > file size (blocks, -f) unlimited > > pending signals (-i) 31533 > > max locked memory (kbytes, -l) 64 > > max memory size (kbytes, -m) unlimited > > open files (-n) 1024 > > pipe size (512 bytes, -p) 8 > > POSIX message queues (bytes, -q) 819200 > > real-time priority (-r) 0 > > stack size (kbytes, -s) 8192 > > cpu time (seconds, -t) unlimited > > max user processes (-u) 31533 > > virtual memory (kbytes, -v) unlimited > > file locks (-x) unlimited > > > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > > text data bss dec hex filename > > 2232834 3484 185835044 188071362 b35bdc2 nek5000 > > > > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > > C Dimension file to be included > > C > > C HCUBE array dimensions > > C > > parameter (ldim=2) > > parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > > parameter (lxd=17,lyd=lxd,lzd=1) > > parameter (lelx=20,lely=60,lelz=1) > > c > > parameter (lzl=3 + 2*(ldim-3)) > > c > > parameter (lx2=lx1-2) > > parameter (ly2=ly1-2) > > parameter (lz2=1) > > parameter (lx3=lx2) > > parameter (ly3=ly2) > > parameter (lz3=lz2) > > c > > c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > > c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > > c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > > c > > parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > > parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes > > parameter (lpx2=1,lpy2=1,lpz2=1) > > c > > c > > c parameter (lbelv=lelv,lbelt=lelt) ! MHD > > c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > > c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > > c > > parameter (lbelv=1,lbelt=1) ! MHD > > parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > > parameter (lbx2=1,lby2=1,lbz2=1) > > c > > C LX1M=LX1 when there are moving meshes; =1 otherwise > > parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > > parameter (ldimt= 4) ! 
3 passive scalars + T > > parameter (ldimt1=ldimt+1) > > parameter (ldimt3=ldimt+3) > > parameter (lp = 8) > > parameter (lelg = 300) > > c > > c Note: In the new code, LELGEC should be about sqrt(LELG) > > c > > PARAMETER (LELGEC = 1) > > PARAMETER (LXYZ2 = 1) > > PARAMETER (LXZ21 = 1) > > c > > PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > > PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > > PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > > PARAMETER (LXZ=LX1*LZ1) > > PARAMETER (LORDER=3) > > PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > > C > > C Common Block Dimensions > > C > > PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > > PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > > C > > C The parameter LVEC controls whether an additional 42 field arrays > > C are required for Steady State Solutions. If you are not using > > C Steady State, it is recommended that LVEC=1. > > C > > PARAMETER (LVEC=1) > > C > > C Uzawa projection array dimensions > > C > > parameter (mxprev = 80) > > parameter (lgmres = 40) > > C > > C Split projection array dimensions > > C > > parameter(lmvec = 1) > > parameter(lsvec = 1) > > parameter(lstore=lmvec*lsvec) > > c > > c NONCONFORMING STUFF > > c > > parameter (maxmor = lelt) > > C > > C Array dimensions > > C > > COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > > $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > > $,NXD,NYD,NZD > > > > c automatically added by makenek > > parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > > > > c automatically added by makenek > > parameter(lpart = 1 ) ! max number of particles > > > > c automatically added by makenek > > integer ax1,ay1,az1,ax2,ay2,az2 > > parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! > running > > averages > > > > c automatically added by makenek > > parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > > Preconditioner > > > > c automatically added by makenek > > parameter (lfdm=0) ! == 1 for fast diagonalization method > > > > Concerning Nek, yes I do get an output on the screen when lauching the > > calculation. Attached is the logfile for one example (2D cavity, only > with > > 100 elements though but the problem is the same as I increase the number > of > > elements). > > > > Cheers, > > > > On 18 October 2011 18:03, Jean-Christophe Loiseau > > wrote: > > > >> Hi, > >> > >> Sorry for this very late response, it completely got out of my mind. 
> Here > >> are the different outputs I get for these commands: > >> > >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > >> core file size (blocks, -c) 0 > >> data seg size (kbytes, -d) unlimited > >> scheduling priority (-e) 0 > >> file size (blocks, -f) unlimited > >> pending signals (-i) 31533 > >> max locked memory (kbytes, -l) 64 > >> max memory size (kbytes, -m) unlimited > >> open files (-n) 1024 > >> pipe size (512 bytes, -p) 8 > >> POSIX message queues (bytes, -q) 819200 > >> real-time priority (-r) 0 > >> stack size (kbytes, -s) 8192 > >> cpu time (seconds, -t) unlimited > >> max user processes (-u) 31533 > >> virtual memory (kbytes, -v) unlimited > >> file locks (-x) unlimited > >> > >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > >> text data bss dec hex filename > >> 2232834 3484 185835044 188071362 b35bdc2 nek5000 > >> > >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > >> C Dimension file to be included > >> C > >> C HCUBE array dimensions > >> C > >> parameter (ldim=2) > >> parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > >> parameter (lxd=17,lyd=lxd,lzd=1) > >> parameter (lelx=20,lely=60,lelz=1) > >> c > >> parameter (lzl=3 + 2*(ldim-3)) > >> c > >> parameter (lx2=lx1-2) > >> parameter (ly2=ly1-2) > >> parameter (lz2=1) > >> parameter (lx3=lx2) > >> parameter (ly3=ly2) > >> parameter (lz3=lz2) > >> c > >> c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > >> c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > >> c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > >> c > >> parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > >> parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes > >> parameter (lpx2=1,lpy2=1,lpz2=1) > >> c > >> c > >> c parameter (lbelv=lelv,lbelt=lelt) ! MHD > >> c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > >> c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > >> c > >> parameter (lbelv=1,lbelt=1) ! MHD > >> parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > >> parameter (lbx2=1,lby2=1,lbz2=1) > >> c > >> C LX1M=LX1 when there are moving meshes; =1 otherwise > >> parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > >> parameter (ldimt= 4) ! 3 passive scalars + T > >> parameter (ldimt1=ldimt+1) > >> parameter (ldimt3=ldimt+3) > >> parameter (lp = 8) > >> parameter (lelg = 300) > >> c > >> c Note: In the new code, LELGEC should be about sqrt(LELG) > >> c > >> PARAMETER (LELGEC = 1) > >> PARAMETER (LXYZ2 = 1) > >> PARAMETER (LXZ21 = 1) > >> c > >> PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > >> PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > >> PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > >> PARAMETER (LXZ=LX1*LZ1) > >> PARAMETER (LORDER=3) > >> PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > >> C > >> C Common Block Dimensions > >> C > >> PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > >> PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > >> C > >> C The parameter LVEC controls whether an additional 42 field arrays > >> C are required for Steady State Solutions. If you are not using > >> C Steady State, it is recommended that LVEC=1. 
> >> C > >> PARAMETER (LVEC=1) > >> C > >> C Uzawa projection array dimensions > >> C > >> parameter (mxprev = 80) > >> parameter (lgmres = 40) > >> C > >> C Split projection array dimensions > >> C > >> parameter(lmvec = 1) > >> parameter(lsvec = 1) > >> parameter(lstore=lmvec*lsvec) > >> c > >> c NONCONFORMING STUFF > >> c > >> parameter (maxmor = lelt) > >> C > >> C Array dimensions > >> C > >> COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > >> $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > >> $,NXD,NYD,NZD > >> > >> c automatically added by makenek > >> parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > >> > >> c automatically added by makenek > >> parameter(lpart = 1 ) ! max number of particles > >> > >> c automatically added by makenek > >> integer ax1,ay1,az1,ax2,ay2,az2 > >> parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! > >> running > >> averages > >> > >> c automatically added by makenek > >> parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > >> Preconditioner > >> > >> c automatically added by makenek > >> parameter (lfdm=0) ! == 1 for fast diagonalization method > >> > >> Concerning Nek, yes I do get an output on the screen when lauching the > >> calculation. Attached is the logfile for one example (2D cavity, only > with > >> 100 elements though but the problem is the same as I increase the number > >> of > >> elements). > >> > >> Cheers, > >> > >> > >> On 27 September 2011 17:40, wrote: > >> > >>> Hi, > >>> > >>> Do you get any output printed to the screen when you launch Nek? > >>> Please provide the output of the following command: > >>> ulimit -a (in case you are using bash) > >>> size nek5000 > >>> cat SIZE > >>> > >>> Can you post the command you are using to launch Nek in parallel? > >>> > >>> Cheers, > >>> Stefan > >>> > >>> > >>> > >>> On 9/27/11, nek5000-users at lists.mcs.anl.gov > >>> wrote: > >>> > Hi Neks, > >>> > > >>> > I do get the same error when I try to run nek on my laptop. Did you > >>> > find > >>> any > >>> > way to solve the problem John? > >>> > > >>> > Regards, > >>> > > >>> > On 11 May 2011 21:25, wrote: > >>> > > >>> >> Hi, > >>> >> > >>> >> I have lp=512 in SIZE, and I did compile with IFMPI=true. I > >>> >> recompiled > >>> >> yesterday because I thought this may be the case, but I was still > >>> getting > >>> >> the same error. When I first start a simulation and run top, it > shows > >>> >> four > >>> >> processors running nek5000, but this changes to one processor after > a > >>> few > >>> >> seconds. > >>> >> > >>> >> Thanks, > >>> >> > >>> >> John > >>> >> > >>> >> _______________________________________________ > >>> >> Nek5000-users mailing list > >>> >> Nek5000-users at lists.mcs.anl.gov > >>> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >>> >> > >>> >> > >>> > > >>> > > >>> > -- > >>> > Jean-Christophe > >>> > > >>> _______________________________________________ > >>> Nek5000-users mailing list > >>> Nek5000-users at lists.mcs.anl.gov > >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >>> > >> > >> > >> > >> -- > >> Jean-Christophe > >> > > > > > > > > -- > > Jean-Christophe > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nek5000-users at lists.mcs.anl.gov Tue Oct 18 12:22:16 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 19:22:16 +0200 Subject: [Nek5000-users] Running Simulations in Parallel with nekbmpi In-Reply-To: References: Message-ID: How do you launch Nek in parallel? Can you try to run Nek using mpiexec or mpirun w/o calling a script. -Stefan On 10/18/11, nek5000-users at lists.mcs.anl.gov wrote: > Here it is: > > #!/bin/bash > # Nek5000 build config file > # (c) 2008,2009,2010 UCHICAGO ARGONNE, LLC > > # source path > SOURCE_ROOT="$HOME/Codes/nek5_svn/trunk/nek" > > # Fortran compiler > F77="mpif77" > > # C compiler > CC="mpicc" > > # pre-processor symbol list > # (set PPLIST=? to get a list of available symbols) > #PPLIST="?" > > # plug-in list > PLUGIN_LIST="" > > > # OPTIONAL SETTINGS > # ----------------- > > # enable MPI (default true) > #IFMPI="false" > > # auxilliary files to compile > # NOTE: source files have to located in the same directory as makenek > # a makefile_usr.inc has to be provided containing the build rules > #USR="foo.o" > > # linking flags > #USR_LFLAGS="-L/usr/lib -lfoo" > > # generic compiler flags > #G="-g" > > # optimization flags > #OPT_FLAGS_STD="" > #OPT_FLAGS_MAG="" > > # enable AMG coarse grid solver (default XXT) > #IFAMG="true" > #IFAMG_DUMP="true" > > # CVODE path > #CVODE_DIR=$HOME/cvode/lib > > # MOAB/iMESH path > #MOAB_DIR="$HOME/moab" > > > ############################################################################### > # DONT'T TOUCH WHAT FOLLOWS !!! > ############################################################################### > # assign version tag > mver=1 > # overwrite source path with optional 2nd argument > if [ -d $2 ] && [ $# -eq 2 ]; then > SOURCE_ROOT="$2" > echo "change source code directory to: ", $SOURCE_ROOT > fi > # do some checks and create makefile > source $SOURCE_ROOT/makenek.inc > # compile > make -j4 -f makefile 2>&1 | tee compiler.out > exit 0 > > > On 18 October 2011 19:13, wrote: > >> Please post your makenek. >> -Stefan >> >> On 10/18/11, nek5000-users at lists.mcs.anl.gov >> wrote: >> > Hi, >> > >> > Sorry for this very late response, it completely got out of my mind. 
>> > Here >> > are the different outputs I get for these commands: >> > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a >> > core file size (blocks, -c) 0 >> > data seg size (kbytes, -d) unlimited >> > scheduling priority (-e) 0 >> > file size (blocks, -f) unlimited >> > pending signals (-i) 31533 >> > max locked memory (kbytes, -l) 64 >> > max memory size (kbytes, -m) unlimited >> > open files (-n) 1024 >> > pipe size (512 bytes, -p) 8 >> > POSIX message queues (bytes, -q) 819200 >> > real-time priority (-r) 0 >> > stack size (kbytes, -s) 8192 >> > cpu time (seconds, -t) unlimited >> > max user processes (-u) 31533 >> > virtual memory (kbytes, -v) unlimited >> > file locks (-x) unlimited >> > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 >> > text data bss dec hex filename >> > 2232834 3484 185835044 188071362 b35bdc2 nek5000 >> > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE >> > C Dimension file to be included >> > C >> > C HCUBE array dimensions >> > C >> > parameter (ldim=2) >> > parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) >> > parameter (lxd=17,lyd=lxd,lzd=1) >> > parameter (lelx=20,lely=60,lelz=1) >> > c >> > parameter (lzl=3 + 2*(ldim-3)) >> > c >> > parameter (lx2=lx1-2) >> > parameter (ly2=ly1-2) >> > parameter (lz2=1) >> > parameter (lx3=lx2) >> > parameter (ly3=ly2) >> > parameter (lz3=lz2) >> > c >> > c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation >> > c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes >> > c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) >> > c >> > parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation >> > parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes >> > parameter (lpx2=1,lpy2=1,lpz2=1) >> > c >> > c >> > c parameter (lbelv=lelv,lbelt=lelt) ! MHD >> > c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes >> > c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) >> > c >> > parameter (lbelv=1,lbelt=1) ! MHD >> > parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes >> > parameter (lbx2=1,lby2=1,lbz2=1) >> > c >> > C LX1M=LX1 when there are moving meshes; =1 otherwise >> > parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) >> > parameter (ldimt= 4) ! 3 passive scalars + T >> > parameter (ldimt1=ldimt+1) >> > parameter (ldimt3=ldimt+3) >> > parameter (lp = 8) >> > parameter (lelg = 300) >> > c >> > c Note: In the new code, LELGEC should be about sqrt(LELG) >> > c >> > PARAMETER (LELGEC = 1) >> > PARAMETER (LXYZ2 = 1) >> > PARAMETER (LXZ21 = 1) >> > c >> > PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) >> > PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) >> > PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) >> > PARAMETER (LXZ=LX1*LZ1) >> > PARAMETER (LORDER=3) >> > PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) >> > C >> > C Common Block Dimensions >> > C >> > PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) >> > PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) >> > C >> > C The parameter LVEC controls whether an additional 42 field arrays >> > C are required for Steady State Solutions. If you are not using >> > C Steady State, it is recommended that LVEC=1. 
>> > C >> > PARAMETER (LVEC=1) >> > C >> > C Uzawa projection array dimensions >> > C >> > parameter (mxprev = 80) >> > parameter (lgmres = 40) >> > C >> > C Split projection array dimensions >> > C >> > parameter(lmvec = 1) >> > parameter(lsvec = 1) >> > parameter(lstore=lmvec*lsvec) >> > c >> > c NONCONFORMING STUFF >> > c >> > parameter (maxmor = lelt) >> > C >> > C Array dimensions >> > C >> > COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 >> > $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID >> > $,NXD,NYD,NZD >> > >> > c automatically added by makenek >> > parameter(lxo = lx1) ! max output grid size (lxo>=lx1) >> > >> > c automatically added by makenek >> > parameter(lpart = 1 ) ! max number of particles >> > >> > c automatically added by makenek >> > integer ax1,ay1,az1,ax2,ay2,az2 >> > parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! >> running >> > averages >> > >> > c automatically added by makenek >> > parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure >> > Preconditioner >> > >> > c automatically added by makenek >> > parameter (lfdm=0) ! == 1 for fast diagonalization method >> > >> > Concerning Nek, yes I do get an output on the screen when lauching the >> > calculation. Attached is the logfile for one example (2D cavity, only >> with >> > 100 elements though but the problem is the same as I increase the number >> of >> > elements). >> > >> > Cheers, >> > >> > On 18 October 2011 18:03, Jean-Christophe Loiseau >> > wrote: >> > >> >> Hi, >> >> >> >> Sorry for this very late response, it completely got out of my mind. >> Here >> >> are the different outputs I get for these commands: >> >> >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a >> >> core file size (blocks, -c) 0 >> >> data seg size (kbytes, -d) unlimited >> >> scheduling priority (-e) 0 >> >> file size (blocks, -f) unlimited >> >> pending signals (-i) 31533 >> >> max locked memory (kbytes, -l) 64 >> >> max memory size (kbytes, -m) unlimited >> >> open files (-n) 1024 >> >> pipe size (512 bytes, -p) 8 >> >> POSIX message queues (bytes, -q) 819200 >> >> real-time priority (-r) 0 >> >> stack size (kbytes, -s) 8192 >> >> cpu time (seconds, -t) unlimited >> >> max user processes (-u) 31533 >> >> virtual memory (kbytes, -v) unlimited >> >> file locks (-x) unlimited >> >> >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 >> >> text data bss dec hex filename >> >> 2232834 3484 185835044 188071362 b35bdc2 nek5000 >> >> >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE >> >> C Dimension file to be included >> >> C >> >> C HCUBE array dimensions >> >> C >> >> parameter (ldim=2) >> >> parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) >> >> parameter (lxd=17,lyd=lxd,lzd=1) >> >> parameter (lelx=20,lely=60,lelz=1) >> >> c >> >> parameter (lzl=3 + 2*(ldim-3)) >> >> c >> >> parameter (lx2=lx1-2) >> >> parameter (ly2=ly1-2) >> >> parameter (lz2=1) >> >> parameter (lx3=lx2) >> >> parameter (ly3=ly2) >> >> parameter (lz3=lz2) >> >> c >> >> c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation >> >> c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes >> >> c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) >> >> c >> >> parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation >> >> parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes >> >> parameter (lpx2=1,lpy2=1,lpz2=1) >> >> c >> >> c >> >> c parameter (lbelv=lelv,lbelt=lelt) ! MHD >> >> c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! 
array sizes >> >> c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) >> >> c >> >> parameter (lbelv=1,lbelt=1) ! MHD >> >> parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes >> >> parameter (lbx2=1,lby2=1,lbz2=1) >> >> c >> >> C LX1M=LX1 when there are moving meshes; =1 otherwise >> >> parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) >> >> parameter (ldimt= 4) ! 3 passive scalars + >> >> T >> >> parameter (ldimt1=ldimt+1) >> >> parameter (ldimt3=ldimt+3) >> >> parameter (lp = 8) >> >> parameter (lelg = 300) >> >> c >> >> c Note: In the new code, LELGEC should be about sqrt(LELG) >> >> c >> >> PARAMETER (LELGEC = 1) >> >> PARAMETER (LXYZ2 = 1) >> >> PARAMETER (LXZ21 = 1) >> >> c >> >> PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) >> >> PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) >> >> PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) >> >> PARAMETER (LXZ=LX1*LZ1) >> >> PARAMETER (LORDER=3) >> >> PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) >> >> C >> >> C Common Block Dimensions >> >> C >> >> PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) >> >> PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) >> >> C >> >> C The parameter LVEC controls whether an additional 42 field arrays >> >> C are required for Steady State Solutions. If you are not using >> >> C Steady State, it is recommended that LVEC=1. >> >> C >> >> PARAMETER (LVEC=1) >> >> C >> >> C Uzawa projection array dimensions >> >> C >> >> parameter (mxprev = 80) >> >> parameter (lgmres = 40) >> >> C >> >> C Split projection array dimensions >> >> C >> >> parameter(lmvec = 1) >> >> parameter(lsvec = 1) >> >> parameter(lstore=lmvec*lsvec) >> >> c >> >> c NONCONFORMING STUFF >> >> c >> >> parameter (maxmor = lelt) >> >> C >> >> C Array dimensions >> >> C >> >> COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 >> >> $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID >> >> $,NXD,NYD,NZD >> >> >> >> c automatically added by makenek >> >> parameter(lxo = lx1) ! max output grid size (lxo>=lx1) >> >> >> >> c automatically added by makenek >> >> parameter(lpart = 1 ) ! max number of particles >> >> >> >> c automatically added by makenek >> >> integer ax1,ay1,az1,ax2,ay2,az2 >> >> parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! >> >> running >> >> averages >> >> >> >> c automatically added by makenek >> >> parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure >> >> Preconditioner >> >> >> >> c automatically added by makenek >> >> parameter (lfdm=0) ! == 1 for fast diagonalization method >> >> >> >> Concerning Nek, yes I do get an output on the screen when lauching the >> >> calculation. Attached is the logfile for one example (2D cavity, only >> with >> >> 100 elements though but the problem is the same as I increase the >> >> number >> >> of >> >> elements). >> >> >> >> Cheers, >> >> >> >> >> >> On 27 September 2011 17:40, wrote: >> >> >> >>> Hi, >> >>> >> >>> Do you get any output printed to the screen when you launch Nek? >> >>> Please provide the output of the following command: >> >>> ulimit -a (in case you are using bash) >> >>> size nek5000 >> >>> cat SIZE >> >>> >> >>> Can you post the command you are using to launch Nek in parallel? >> >>> >> >>> Cheers, >> >>> Stefan >> >>> >> >>> >> >>> >> >>> On 9/27/11, nek5000-users at lists.mcs.anl.gov >> >>> wrote: >> >>> > Hi Neks, >> >>> > >> >>> > I do get the same error when I try to run nek on my laptop. Did you >> >>> > find >> >>> any >> >>> > way to solve the problem John? >> >>> > >> >>> > Regards, >> >>> > >> >>> > On 11 May 2011 21:25, wrote: >> >>> > >> >>> >> Hi, >> >>> >> >> >>> >> I have lp=512 in SIZE, and I did compile with IFMPI=true. 
I >> >>> >> recompiled >> >>> >> yesterday because I thought this may be the case, but I was still >> >>> getting >> >>> >> the same error. When I first start a simulation and run top, it >> shows >> >>> >> four >> >>> >> processors running nek5000, but this changes to one processor after >> a >> >>> few >> >>> >> seconds. >> >>> >> >> >>> >> Thanks, >> >>> >> >> >>> >> John >> >>> >> >> >>> >> _______________________________________________ >> >>> >> Nek5000-users mailing list >> >>> >> Nek5000-users at lists.mcs.anl.gov >> >>> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >>> >> >> >>> >> >> >>> > >> >>> > >> >>> > -- >> >>> > Jean-Christophe >> >>> > >> >>> _______________________________________________ >> >>> Nek5000-users mailing list >> >>> Nek5000-users at lists.mcs.anl.gov >> >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> >>> >> >> >> >> >> >> >> >> -- >> >> Jean-Christophe >> >> >> > >> > >> > >> > -- >> > Jean-Christophe >> > >> _______________________________________________ >> Nek5000-users mailing list >> Nek5000-users at lists.mcs.anl.gov >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users >> > > > > -- > Jean-Christophe > From nek5000-users at lists.mcs.anl.gov Tue Oct 18 13:19:05 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 13:19:05 -0500 (CDT) Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. In-Reply-To: Message-ID: <2146351622.106673.1318961945723.JavaMail.root@zimbra.anl.gov> No problem -- it is great that you found what was wrong. Best. Aleks ----- Original Message ----- From: nek5000-users at lists.mcs.anl.gov To: nek5000-users at lists.mcs.anl.gov Sent: Tuesday, October 18, 2011 11:19:44 AM Subject: Re: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. Oops... I really do feel like an idiot right now :) Thanks a lot On 18 October 2011 18:17, < nek5000-users at lists.mcs.anl.gov > wrote: Hi Jean-Christophe, Usually this error means that something wrong with the format in .rea file, e.g., certain lines are missing, like BCs, Restart, etc. The most common error with the same message I make is setting 0 PRESOLVE/RESTART OPTIONS ***** and giving a filename for the restart. Best, Aleks ----- Original Message ----- From: nek5000-users at lists.mcs.anl.gov To: "Nek 5000" < nek5000-users at lists.mcs.anl.gov > Sent: Tuesday, October 18, 2011 11:09:33 AM Subject: [Nek5000-users] ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. 
Hi Nek's, I have been trying to use the perturbation mode this morning, and I get the following error: /----------------------------------------------------------\\ | _ __ ______ __ __ ______ ____ ____ ____ | | / | / // ____// //_/ / ____/ / __ \\ / __ \\ / __ \\ | | / |/ // __/ / ,< /___ \\ / / / // / / // / / / | | / /| // /___ / /| | ____/ / / /_/ // /_/ // /_/ / | | /_/ |_//_____//_/ |_|/_____/ \\____/ \\____/ \\____/ | | | |----------------------------------------------------------| | | | NEK5000: Open Source Spectral Element Solver | | COPYRIGHT (c) 2008-2010 UCHICAGO ARGONNE, LLC | | Version: 1.0rc1 / SVN r714 | | Web: http://nek5000.mcs.anl.gov | | | \\----------------------------------------------------------/ Number of processors: 1 REAL wdsize : 8 INTEGER wdsize : 4 Beginning session: /home/jean-christophe/Calculs/Cavite2D/Stability/cav.rea timer accuracy: 0.0000000E+00 sec read .rea file nelgt/nelgv/lelt: 100 100 100 lx1 /lx2 /lx3 : 11 9 9 mapping elements to processors 0 100 100 100 100 NELV RANK 0 IEG 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 element load imbalance: 0 100 100 done :: mapping elements to processors ERROR READING INITIAL CONDITION/DRIVE FORCE DATA ABORTING IN ROUTINE RDICDF. call exitt: dying ... backtrace(): obtained 3 stack frames. ./nek5000() [0x821eb0d] ./nek5000() [0x833f6c6] [0xc] total elapsed time : 0.00000E+00 sec total solver time incl. I/O : 0.00000E+00 sec time/timestep : 0.00000E+00 sec CPU seconds/timestep/gridpt : 0.00000E+00 sec Not sure though where it comes from since it's been the very first time I encounter it. My nek version is 714 (the last one I guess). Any hint? Regards, -- Jean-Christophe _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users -- Jean-Christophe _______________________________________________ Nek5000-users mailing list Nek5000-users at lists.mcs.anl.gov https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users From nek5000-users at lists.mcs.anl.gov Tue Oct 18 14:30:04 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Tue, 18 Oct 2011 21:30:04 +0200 Subject: [Nek5000-users] Running Simulations in Parallel with nekbmpi In-Reply-To: References: Message-ID: I tried: 1. mpiexec -n 2 ./nek5000 2. ./nekbmpi blah 2 and I get the same problem for both. There is no big deal if it doesn't work on parallel since it is on my laptop, but it would be cool if it worked :) I'll continue to look for the problem. It might perhaps come from my mpich implementation. I'll try to run some other parallel codes and get back to you after. On 18 October 2011 19:22, wrote: > How do you launch Nek in parallel? Can you try to run Nek using > mpiexec or mpirun w/o calling a script. 
> -Stefan > > On 10/18/11, nek5000-users at lists.mcs.anl.gov > wrote: > > Here it is: > > > > #!/bin/bash > > # Nek5000 build config file > > # (c) 2008,2009,2010 UCHICAGO ARGONNE, LLC > > > > # source path > > SOURCE_ROOT="$HOME/Codes/nek5_svn/trunk/nek" > > > > # Fortran compiler > > F77="mpif77" > > > > # C compiler > > CC="mpicc" > > > > # pre-processor symbol list > > # (set PPLIST=? to get a list of available symbols) > > #PPLIST="?" > > > > # plug-in list > > PLUGIN_LIST="" > > > > > > # OPTIONAL SETTINGS > > # ----------------- > > > > # enable MPI (default true) > > #IFMPI="false" > > > > # auxilliary files to compile > > # NOTE: source files have to located in the same directory as makenek > > # a makefile_usr.inc has to be provided containing the build rules > > #USR="foo.o" > > > > # linking flags > > #USR_LFLAGS="-L/usr/lib -lfoo" > > > > # generic compiler flags > > #G="-g" > > > > # optimization flags > > #OPT_FLAGS_STD="" > > #OPT_FLAGS_MAG="" > > > > # enable AMG coarse grid solver (default XXT) > > #IFAMG="true" > > #IFAMG_DUMP="true" > > > > # CVODE path > > #CVODE_DIR=$HOME/cvode/lib > > > > # MOAB/iMESH path > > #MOAB_DIR="$HOME/moab" > > > > > > > ############################################################################### > > # DONT'T TOUCH WHAT FOLLOWS !!! > > > ############################################################################### > > # assign version tag > > mver=1 > > # overwrite source path with optional 2nd argument > > if [ -d $2 ] && [ $# -eq 2 ]; then > > SOURCE_ROOT="$2" > > echo "change source code directory to: ", $SOURCE_ROOT > > fi > > # do some checks and create makefile > > source $SOURCE_ROOT/makenek.inc > > # compile > > make -j4 -f makefile 2>&1 | tee compiler.out > > exit 0 > > > > > > On 18 October 2011 19:13, wrote: > > > >> Please post your makenek. > >> -Stefan > >> > >> On 10/18/11, nek5000-users at lists.mcs.anl.gov > >> wrote: > >> > Hi, > >> > > >> > Sorry for this very late response, it completely got out of my mind. 
> >> > Here > >> > are the different outputs I get for these commands: > >> > > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > >> > core file size (blocks, -c) 0 > >> > data seg size (kbytes, -d) unlimited > >> > scheduling priority (-e) 0 > >> > file size (blocks, -f) unlimited > >> > pending signals (-i) 31533 > >> > max locked memory (kbytes, -l) 64 > >> > max memory size (kbytes, -m) unlimited > >> > open files (-n) 1024 > >> > pipe size (512 bytes, -p) 8 > >> > POSIX message queues (bytes, -q) 819200 > >> > real-time priority (-r) 0 > >> > stack size (kbytes, -s) 8192 > >> > cpu time (seconds, -t) unlimited > >> > max user processes (-u) 31533 > >> > virtual memory (kbytes, -v) unlimited > >> > file locks (-x) unlimited > >> > > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > >> > text data bss dec hex filename > >> > 2232834 3484 185835044 188071362 b35bdc2 nek5000 > >> > > >> > jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > >> > C Dimension file to be included > >> > C > >> > C HCUBE array dimensions > >> > C > >> > parameter (ldim=2) > >> > parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > >> > parameter (lxd=17,lyd=lxd,lzd=1) > >> > parameter (lelx=20,lely=60,lelz=1) > >> > c > >> > parameter (lzl=3 + 2*(ldim-3)) > >> > c > >> > parameter (lx2=lx1-2) > >> > parameter (ly2=ly1-2) > >> > parameter (lz2=1) > >> > parameter (lx3=lx2) > >> > parameter (ly3=ly2) > >> > parameter (lz3=lz2) > >> > c > >> > c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > >> > c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > >> > c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > >> > c > >> > parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > >> > parameter (lpx1=1,lpy1=1,lpz1=1) ! array sizes > >> > parameter (lpx2=1,lpy2=1,lpz2=1) > >> > c > >> > c > >> > c parameter (lbelv=lelv,lbelt=lelt) ! MHD > >> > c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > >> > c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > >> > c > >> > parameter (lbelv=1,lbelt=1) ! MHD > >> > parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > >> > parameter (lbx2=1,lby2=1,lbz2=1) > >> > c > >> > C LX1M=LX1 when there are moving meshes; =1 otherwise > >> > parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > >> > parameter (ldimt= 4) ! 3 passive scalars + > T > >> > parameter (ldimt1=ldimt+1) > >> > parameter (ldimt3=ldimt+3) > >> > parameter (lp = 8) > >> > parameter (lelg = 300) > >> > c > >> > c Note: In the new code, LELGEC should be about sqrt(LELG) > >> > c > >> > PARAMETER (LELGEC = 1) > >> > PARAMETER (LXYZ2 = 1) > >> > PARAMETER (LXZ21 = 1) > >> > c > >> > PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > >> > PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > >> > PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > >> > PARAMETER (LXZ=LX1*LZ1) > >> > PARAMETER (LORDER=3) > >> > PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > >> > C > >> > C Common Block Dimensions > >> > C > >> > PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > >> > PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > >> > C > >> > C The parameter LVEC controls whether an additional 42 field > arrays > >> > C are required for Steady State Solutions. If you are not using > >> > C Steady State, it is recommended that LVEC=1. 
> >> > C > >> > PARAMETER (LVEC=1) > >> > C > >> > C Uzawa projection array dimensions > >> > C > >> > parameter (mxprev = 80) > >> > parameter (lgmres = 40) > >> > C > >> > C Split projection array dimensions > >> > C > >> > parameter(lmvec = 1) > >> > parameter(lsvec = 1) > >> > parameter(lstore=lmvec*lsvec) > >> > c > >> > c NONCONFORMING STUFF > >> > c > >> > parameter (maxmor = lelt) > >> > C > >> > C Array dimensions > >> > C > >> > COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > >> > $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > >> > $,NXD,NYD,NZD > >> > > >> > c automatically added by makenek > >> > parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > >> > > >> > c automatically added by makenek > >> > parameter(lpart = 1 ) ! max number of particles > >> > > >> > c automatically added by makenek > >> > integer ax1,ay1,az1,ax2,ay2,az2 > >> > parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! > >> running > >> > averages > >> > > >> > c automatically added by makenek > >> > parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > >> > Preconditioner > >> > > >> > c automatically added by makenek > >> > parameter (lfdm=0) ! == 1 for fast diagonalization method > >> > > >> > Concerning Nek, yes I do get an output on the screen when lauching the > >> > calculation. Attached is the logfile for one example (2D cavity, only > >> with > >> > 100 elements though but the problem is the same as I increase the > number > >> of > >> > elements). > >> > > >> > Cheers, > >> > > >> > On 18 October 2011 18:03, Jean-Christophe Loiseau > >> > wrote: > >> > > >> >> Hi, > >> >> > >> >> Sorry for this very late response, it completely got out of my mind. > >> Here > >> >> are the different outputs I get for these commands: > >> >> > >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ ulimit -a > >> >> core file size (blocks, -c) 0 > >> >> data seg size (kbytes, -d) unlimited > >> >> scheduling priority (-e) 0 > >> >> file size (blocks, -f) unlimited > >> >> pending signals (-i) 31533 > >> >> max locked memory (kbytes, -l) 64 > >> >> max memory size (kbytes, -m) unlimited > >> >> open files (-n) 1024 > >> >> pipe size (512 bytes, -p) 8 > >> >> POSIX message queues (bytes, -q) 819200 > >> >> real-time priority (-r) 0 > >> >> stack size (kbytes, -s) 8192 > >> >> cpu time (seconds, -t) unlimited > >> >> max user processes (-u) 31533 > >> >> virtual memory (kbytes, -v) unlimited > >> >> file locks (-x) unlimited > >> >> > >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ size nek5000 > >> >> text data bss dec hex filename > >> >> 2232834 3484 185835044 188071362 b35bdc2 nek5000 > >> >> > >> >> jean-christophe at dyf-corto:~/Calculs/Cavite2D/BaseFlow$ cat SIZE > >> >> C Dimension file to be included > >> >> C > >> >> C HCUBE array dimensions > >> >> C > >> >> parameter (ldim=2) > >> >> parameter (lx1=11,ly1=lx1,lz1=1,lelt=225,lelv=lelt) > >> >> parameter (lxd=17,lyd=lxd,lzd=1) > >> >> parameter (lelx=20,lely=60,lelz=1) > >> >> c > >> >> parameter (lzl=3 + 2*(ldim-3)) > >> >> c > >> >> parameter (lx2=lx1-2) > >> >> parameter (ly2=ly1-2) > >> >> parameter (lz2=1) > >> >> parameter (lx3=lx2) > >> >> parameter (ly3=ly2) > >> >> parameter (lz3=lz2) > >> >> c > >> >> c parameter (lpelv=lelv,lpelt=lelt,lpert=3) ! perturbation > >> >> c parameter (lpx1=lx1,lpy1=ly1,lpz1=lz1) ! array sizes > >> >> c parameter (lpx2=lx2,lpy2=ly2,lpz2=lz2) > >> >> c > >> >> parameter (lpelv=1,lpelt=1,lpert=1) ! perturbation > >> >> parameter (lpx1=1,lpy1=1,lpz1=1) ! 
array sizes > >> >> parameter (lpx2=1,lpy2=1,lpz2=1) > >> >> c > >> >> c > >> >> c parameter (lbelv=lelv,lbelt=lelt) ! MHD > >> >> c parameter (lbx1=lx1,lby1=ly1,lbz1=lz1) ! array sizes > >> >> c parameter (lbx2=lx2,lby2=ly2,lbz2=lz2) > >> >> c > >> >> parameter (lbelv=1,lbelt=1) ! MHD > >> >> parameter (lbx1=1,lby1=1,lbz1=1) ! array sizes > >> >> parameter (lbx2=1,lby2=1,lbz2=1) > >> >> c > >> >> C LX1M=LX1 when there are moving meshes; =1 otherwise > >> >> parameter (lx1m=lx1,ly1m=ly1,lz1m=lz1) > >> >> parameter (ldimt= 4) ! 3 passive scalars > + > >> >> T > >> >> parameter (ldimt1=ldimt+1) > >> >> parameter (ldimt3=ldimt+3) > >> >> parameter (lp = 8) > >> >> parameter (lelg = 300) > >> >> c > >> >> c Note: In the new code, LELGEC should be about sqrt(LELG) > >> >> c > >> >> PARAMETER (LELGEC = 1) > >> >> PARAMETER (LXYZ2 = 1) > >> >> PARAMETER (LXZ21 = 1) > >> >> c > >> >> PARAMETER (LMAXV=LX1*LY1*LZ1*LELV) > >> >> PARAMETER (LMAXT=LX1*LY1*LZ1*LELT) > >> >> PARAMETER (LMAXP=LX2*LY2*LZ2*LELV) > >> >> PARAMETER (LXZ=LX1*LZ1) > >> >> PARAMETER (LORDER=3) > >> >> PARAMETER (MAXOBJ=4,MAXMBR=LELT*6,lhis=100) > >> >> C > >> >> C Common Block Dimensions > >> >> C > >> >> PARAMETER (LCTMP0 =2*LX1*LY1*LZ1*LELT) > >> >> PARAMETER (LCTMP1 =4*LX1*LY1*LZ1*LELT) > >> >> C > >> >> C The parameter LVEC controls whether an additional 42 field > arrays > >> >> C are required for Steady State Solutions. If you are not using > >> >> C Steady State, it is recommended that LVEC=1. > >> >> C > >> >> PARAMETER (LVEC=1) > >> >> C > >> >> C Uzawa projection array dimensions > >> >> C > >> >> parameter (mxprev = 80) > >> >> parameter (lgmres = 40) > >> >> C > >> >> C Split projection array dimensions > >> >> C > >> >> parameter(lmvec = 1) > >> >> parameter(lsvec = 1) > >> >> parameter(lstore=lmvec*lsvec) > >> >> c > >> >> c NONCONFORMING STUFF > >> >> c > >> >> parameter (maxmor = lelt) > >> >> C > >> >> C Array dimensions > >> >> C > >> >> COMMON/DIMN/NELV,NELT,NX1,NY1,NZ1,NX2,NY2,NZ2 > >> >> $,NX3,NY3,NZ3,NDIM,NFIELD,NPERT,NID > >> >> $,NXD,NYD,NZD > >> >> > >> >> c automatically added by makenek > >> >> parameter(lxo = lx1) ! max output grid size (lxo>=lx1) > >> >> > >> >> c automatically added by makenek > >> >> parameter(lpart = 1 ) ! max number of particles > >> >> > >> >> c automatically added by makenek > >> >> integer ax1,ay1,az1,ax2,ay2,az2 > >> >> parameter (ax1=lx1,ay1=ly1,az1=lz1,ax2=lx2,ay2=ly2,az2=lz2) ! > >> >> running > >> >> averages > >> >> > >> >> c automatically added by makenek > >> >> parameter (lxs=1,lys=lxs,lzs=(lxs-1)*(ldim-2)+1) !New Pressure > >> >> Preconditioner > >> >> > >> >> c automatically added by makenek > >> >> parameter (lfdm=0) ! == 1 for fast diagonalization method > >> >> > >> >> Concerning Nek, yes I do get an output on the screen when lauching > the > >> >> calculation. Attached is the logfile for one example (2D cavity, only > >> with > >> >> 100 elements though but the problem is the same as I increase the > >> >> number > >> >> of > >> >> elements). > >> >> > >> >> Cheers, > >> >> > >> >> > >> >> On 27 September 2011 17:40, wrote: > >> >> > >> >>> Hi, > >> >>> > >> >>> Do you get any output printed to the screen when you launch Nek? > >> >>> Please provide the output of the following command: > >> >>> ulimit -a (in case you are using bash) > >> >>> size nek5000 > >> >>> cat SIZE > >> >>> > >> >>> Can you post the command you are using to launch Nek in parallel? 
> >> >>> > >> >>> Cheers, > >> >>> Stefan > >> >>> > >> >>> > >> >>> > >> >>> On 9/27/11, nek5000-users at lists.mcs.anl.gov > >> >>> wrote: > >> >>> > Hi Neks, > >> >>> > > >> >>> > I do get the same error when I try to run nek on my laptop. Did > you > >> >>> > find > >> >>> any > >> >>> > way to solve the problem John? > >> >>> > > >> >>> > Regards, > >> >>> > > >> >>> > On 11 May 2011 21:25, wrote: > >> >>> > > >> >>> >> Hi, > >> >>> >> > >> >>> >> I have lp=512 in SIZE, and I did compile with IFMPI=true. I > >> >>> >> recompiled > >> >>> >> yesterday because I thought this may be the case, but I was still > >> >>> getting > >> >>> >> the same error. When I first start a simulation and run top, it > >> shows > >> >>> >> four > >> >>> >> processors running nek5000, but this changes to one processor > after > >> a > >> >>> few > >> >>> >> seconds. > >> >>> >> > >> >>> >> Thanks, > >> >>> >> > >> >>> >> John > >> >>> >> > >> >>> >> _______________________________________________ > >> >>> >> Nek5000-users mailing list > >> >>> >> Nek5000-users at lists.mcs.anl.gov > >> >>> >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >> >>> >> > >> >>> >> > >> >>> > > >> >>> > > >> >>> > -- > >> >>> > Jean-Christophe > >> >>> > > >> >>> _______________________________________________ > >> >>> Nek5000-users mailing list > >> >>> Nek5000-users at lists.mcs.anl.gov > >> >>> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >> >>> > >> >> > >> >> > >> >> > >> >> -- > >> >> Jean-Christophe > >> >> > >> > > >> > > >> > > >> > -- > >> > Jean-Christophe > >> > > >> _______________________________________________ > >> Nek5000-users mailing list > >> Nek5000-users at lists.mcs.anl.gov > >> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > >> > > > > > > > > -- > > Jean-Christophe > > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Wed Oct 19 15:22:18 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Wed, 19 Oct 2011 16:22:18 -0400 Subject: [Nek5000-users] Boundary conditions for pressure Message-ID: <1319055738.15579.36.camel@menorca.uottawa.ca> Dear Nek users, according to the information provided, for the Pn-Pn case using a splitting temporal discretization, the boundary conditions for the pressure are computed chosen to ensure div.U=0. This involves addition of an appropriate inhomogeneity that cancels the line integral of the divergence. How is this inhomogeneity computed?. Could you provide more details?. Thanks. -- Alexandre Fabregat, PhD. Mail: afabrega at uottawa.ca Phone: 630 562-5800 (ext. 6293) Office: B207 Mechanical Engineering Department University of Ottawa 161 Louis Pasteur, K1N6N5, Ottawa ON From nek5000-users at lists.mcs.anl.gov Thu Oct 20 05:04:14 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 20 Oct 2011 12:04:14 +0200 Subject: [Nek5000-users] Pressure Normalization Message-ID: <1319105054.31769.17.camel@damavand.mech.kth.se> Dear Neks; I would like to know how the pressure normalization is done in the code. Since I'm running a turbulent pipe flow with periodic B.C.s using Pn_Pn-2 scheme, I would like to obtain P_rms values; however, the pressure fluctuations does not look alright. 
I think in case of inflow/outflow, the conditions are to set the pressure to be zero at these boundaries; obviously this is not possible for periodic cases. Many thanks; Azad From nek5000-users at lists.mcs.anl.gov Thu Oct 20 05:25:45 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 20 Oct 2011 05:25:45 -0500 (CDT) Subject: [Nek5000-users] Pressure Normalization In-Reply-To: <1319105054.31769.17.camel@damavand.mech.kth.se> References: <1319105054.31769.17.camel@damavand.mech.kth.se> Message-ID: Hi Azad, Any constant can be added to p in the case you describe. Consequently, we choose the constant such that the spatial average of p is zero at each timestep. Does this answer your question? Paul On Thu, 20 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Dear Neks; > > I would like to know how the pressure normalization is done in the > code. Since I'm running a turbulent pipe flow with periodic B.C.s using > Pn_Pn-2 scheme, I would like to obtain P_rms values; however, the > pressure fluctuations does not look alright. I think in case of > inflow/outflow, the conditions are to set the pressure to be zero > at these boundaries; obviously this is not possible for periodic cases. > > Many thanks; > Azad > > _______________________________________________ > Nek5000-users mailing list > Nek5000-users at lists.mcs.anl.gov > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > From nek5000-users at lists.mcs.anl.gov Thu Oct 20 13:32:49 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Thu, 20 Oct 2011 20:32:49 +0200 Subject: [Nek5000-users] Pressure Normalization In-Reply-To: References: Message-ID: <1319135569.10151.23.camel@damavand.mech.kth.se> Hi Paul; Yes, that's the answer I was looking for. Though I appreciate if you point me out to the right direction, since I'd like to know: 1. Where the pressure constant is computed in the code (i.e. the volumetric average). 2. whether there is an easy way to change it to the wall average... As for these typical turbulent problems having a constant mean wall pressure (integ{p}dA @ wall = 0) is more preferable than volumetric average(integ{p}dv = 0). Many thanks, Azad On Thu, 2011-10-20 at 12:00 -0500, nek5000-users-request at lists.mcs.anl.gov wrote: > Send Nek5000-users mailing list submissions to > nek5000-users at lists.mcs.anl.gov > > To subscribe or unsubscribe via the World Wide Web, visit > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users > or, via email, send a message with subject or body 'help' to > nek5000-users-request at lists.mcs.anl.gov > > You can reach the person managing the list at > nek5000-users-owner at lists.mcs.anl.gov > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Nek5000-users digest..." > > > Today's Topics: > > 1. Boundary conditions for pressure (nek5000-users at lists.mcs.anl.gov) > 2. Pressure Normalization (nek5000-users at lists.mcs.anl.gov) > 3. 
Re: Pressure Normalization (nek5000-users at lists.mcs.anl.gov) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 19 Oct 2011 16:22:18 -0400 > From: nek5000-users at lists.mcs.anl.gov > Subject: [Nek5000-users] Boundary conditions for pressure > To: nek5000-users at lists.mcs.anl.gov > Message-ID: <1319055738.15579.36.camel at menorca.uottawa.ca> > Content-Type: text/plain; charset="UTF-8" > > Dear Nek users, > > according to the information provided, for the Pn-Pn case using a splitting temporal discretization, > the boundary conditions for the pressure are computed chosen to ensure div.U=0. > This involves addition of an appropriate inhomogeneity > that cancels the line integral of the divergence. > > How is this inhomogeneity computed?. Could you provide more details?. > > Thanks. > From nek5000-users at lists.mcs.anl.gov Fri Oct 21 04:54:09 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Fri, 21 Oct 2011 11:54:09 +0200 Subject: [Nek5000-users] Linearized Nek: Compute only the right-hand side Message-ID: Hi Nek's, Given a random initial vector u and the Jacobian matrix of the linearized Navier-Stokes, I would like to calculate the matrix-vector product J * u. In Nek, that would consists in calculating only the right hand side of the equations, that is: RHS = ( -U.grad(u) - u.grad(U) - grad(p) + 1/Re Lap(u) ; div(u) ) I have a few questions on how to compute those different terms in userchk (Nek is used in post-process mode): - I would tend to use convop() for the convection terms, however I'm not sure whereas it would output the strong or weak form (and I need the strong one obviously)? - Same question regarding grad(p) and div(u) if I use opgradt() and opdiv() to calculate them? Last but not least, for the Laplacian I've read a previous mail where it is said that it can be computed using a sequence of wgradm1() and vec_dssum() so I guess I kown how to compute at the strong form for it. Sincerely, -- Jean-Christophe -------------- next part -------------- An HTML attachment was scrubbed... URL: From nek5000-users at lists.mcs.anl.gov Fri Oct 21 05:37:16 2011 From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov) Date: Fri, 21 Oct 2011 05:37:16 -0500 (CDT) Subject: [Nek5000-users] Linearized Nek: Compute only the right-hand side In-Reply-To: References: Message-ID: Jean-Christophe, Have you already looked at the perturbation code in nek? It already does these things. Also, it's not clear that you want everything in strong form, particularly the 2nd-order spatial terms, since the solution is only C0? Paul On Fri, 21 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote: > Hi Nek's, > > Given a random initial vector u and the Jacobian matrix of the linearized > Navier-Stokes, I would like to calculate the matrix-vector product J * u. In > Nek, that would consists in calculating only the right hand side of the > equations, that is: > > RHS = ( -U.grad(u) - u.grad(U) - grad(p) + 1/Re Lap(u) ; div(u) ) > > I have a few questions on how to compute those different terms in userchk > (Nek is used in post-process mode): > > - I would tend to use convop() for the convection terms, however I'm not > sure whereas it would output the strong or weak form (and I need the strong > one obviously)? > - Same question regarding grad(p) and div(u) if I use opgradt() and > opdiv() to calculate them? 
From nek5000-users at lists.mcs.anl.gov Fri Oct 21 05:37:16 2011
From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov)
Date: Fri, 21 Oct 2011 05:37:16 -0500 (CDT)
Subject: [Nek5000-users] Linearized Nek: Compute only the right-hand side
In-Reply-To: 
References: 
Message-ID: 

Jean-Christophe,

Have you already looked at the perturbation code in nek? It already does these things.

Also, it's not clear that you want everything in strong form, particularly the 2nd-order spatial terms, since the solution is only C0?

Paul

On Fri, 21 Oct 2011, nek5000-users at lists.mcs.anl.gov wrote:

> Hi Nek's,
>
> Given a random initial vector u and the Jacobian matrix of the linearized
> Navier-Stokes equations, I would like to calculate the matrix-vector
> product J * u. In Nek, that would consist of calculating only the
> right-hand side of the equations, that is:
>
> RHS = ( -U.grad(u) - u.grad(U) - grad(p) + 1/Re Lap(u) ; div(u) )
>
> I have a few questions on how to compute those different terms in userchk
> (Nek is used in post-processing mode):
>
> - I would tend to use convop() for the convection terms, however I'm not
> sure whether it would output the strong or the weak form (and I need the
> strong one, obviously)?
> - Same question regarding grad(p) and div(u) if I use opgradt() and
> opdiv() to calculate them?
>
> Last but not least, for the Laplacian I've read a previous mail where it
> is said that it can be computed using a sequence of wgradm1() and
> vec_dssum(), so I guess I know how to compute the strong form of that one.
>
> Sincerely,
> --
> Jean-Christophe
>

From nek5000-users at lists.mcs.anl.gov Fri Oct 21 05:53:01 2011
From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov)
Date: Fri, 21 Oct 2011 12:53:01 +0200
Subject: [Nek5000-users] Linearized Nek: Compute only the right-hand side
In-Reply-To: 
References: 
Message-ID: 

Yep, I have been looking at it and have already used it quite extensively. Up to now, for stability analysis, I have been using snapshots taken from the linearized DNS and then used them to form a Krylov subspace for an Arnoldi algorithm.

However, what I would like to do now is to use Nek as a black box for matrix-vector products only. In the perturbation mode, even though the Jacobian matrix is not explicitly formed, the equations still read

du/dt = J * u

What I am interested in is, given u(0), to output only:

u(1) = J * u(0)
u(2) = J * u(1)
...
u(N) = J * u(N-1)

Not sure, though, whether that's clear enough?

On 21 October 2011 12:37, nek5000-users at lists.mcs.anl.gov wrote:

> Jean-Christophe,
>
> Have you already looked at the perturbation code in nek?
> It already does these things.
>
> Also, it's not clear that you want everything in strong
> form, particularly the 2nd-order spatial terms, since
> the solution is only C0?
>
> Paul
>

--
Jean-Christophe
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From nek5000-users at lists.mcs.anl.gov Tue Oct 25 06:06:02 2011
From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov)
Date: Tue, 25 Oct 2011 13:06:02 +0200
Subject: [Nek5000-users] Pipe flow and pressure normalization
Message-ID: <20111025130602.20333gaa6xy6qnmi@www.mech.kth.se>

Dear Neks,

I am running a turbulent pipe flow simulation with periodic boundary conditions. At this point, the pressure normalization is set such that the spatial average of p is zero at each time step. This is totally fine. However, I would like to change the normalization such that the spatial average of p on the wall is zero at each time step, i.e. in order to get the average pressure <p> = 0 on the wall.

Could I get some guidance on where/how to change the normalization in the code? The way the pressure is normalized affects the p_rms values obtained from the run, as well as other statistical quantities such as the skewness and flatness of p.

Best regards
George

----------------------------------------------------------------
This message was sent using IMP, the Internet Messaging Program.
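Since this is the same request Azad made earlier in the thread, one way the wall-based convention could be imposed from the .usr file, without touching the core normalization, is to compute the area-weighted average of p over the 'W' (wall) faces and subtract it. The sketch below is illustrative only: the helper name zero_wall_mean_pr is made up, it assumes a routine such as mappr(pm1,pr,pa,pb) is available to map the pressure onto the velocity mesh (substitute your own mesh-2-to-mesh-1 interpolation if your source tree lacks it), and the face loop follows the usual facind/area idiom.

      subroutine zero_wall_mean_pr
c     Rough sketch: subtract the area-weighted wall average of p so
c     that integ{p}dA over the wall is zero.  Illustrative only;
c     mappr() is assumed to exist in your source tree.
      include 'SIZE'
      include 'TOTAL'
      real pm1(lx1,ly1,lz1,lelv)          ! pressure mapped to mesh 1
      real pa (lx1,ly1,lz1,lelv)          ! work arrays for mappr
      real pb (lx1,ly1,lz1,lelv)
      real sa,sp,pwall
      integer e,f,i,ix,iy,iz,kx1,kx2,ky1,ky2,kz1,kz2,n2

      call mappr(pm1,pr,pa,pb)            ! pr (mesh 2) -> pm1 (mesh 1)

      sa = 0.
      sp = 0.
      do e=1,nelv
      do f=1,2*ndim
         if (cbc(f,e,1).eq.'W  ') then    ! wall faces of the velocity field
            call facind(kx1,kx2,ky1,ky2,kz1,kz2,nx1,ny1,nz1,f)
            i = 0
            do iz=kz1,kz2
            do iy=ky1,ky2
            do ix=kx1,kx2
               i  = i + 1
               sa = sa + area(i,1,f,e)
               sp = sp + area(i,1,f,e)*pm1(ix,iy,iz,e)
            enddo
            enddo
            enddo
         endif
      enddo
      enddo
      sa = glsum(sa,1)                    ! global sums across processors
      sp = glsum(sp,1)
      pwall = sp/sa

      n2 = nx2*ny2*nz2*nelv
      call cadd(pr,-pwall,n2)             ! wall average of p is now zero
      return
      end

Since only a constant is subtracted, this does not change the dynamics; whether to shift pr at every step or only when accumulating the p statistics (rms, skewness, flatness) is a separate choice.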

From nek5000-users at lists.mcs.anl.gov Sat Oct 29 11:50:06 2011
From: nek5000-users at lists.mcs.anl.gov (nek5000-users at lists.mcs.anl.gov)
Date: Sat, 29 Oct 2011 19:50:06 +0300
Subject: [Nek5000-users] Post-processing dump files to generate Lambda-2 criterion
Message-ID: <5beece2f191f94c1ee1848ddca246fe1@mail.ntua.gr>

Dear Neks,

I have used the solver to run 3D simulations of the moving cylinder problem. I have cyl3d0.f000* and enscyl3d0.f000* dump files that I need to post-process in order to generate the Lambda-2 vortex criterion.

Please advise how to load my dump files and run the solver in post-processing mode to apply the criterion (I see that there is a subroutine lambda2(l2) in postpro.f). I have already read the example at http://nek5000.mcs.anl.gov/index.php/Data_processing_example but it was not clear to me.

Regards
Sofia
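A common pattern for this kind of task is a short post-processing run whose userchk reads one snapshot, evaluates lambda2() from postpro.f, writes the result and exits. The sketch below is only illustrative: the file name is a placeholder for one of the cyl3d0.f000* files, and it assumes load_fld() exists under that name in your source tree (otherwise point the restart line of the .rea file at the snapshot instead and drop that call).

      subroutine userchk
c     Rough sketch of a post-processing userchk for the lambda_2 field.
c     Assumptions: load_fld() and lambda2() are available; the file
c     name below is a placeholder.
      include 'SIZE'
      include 'TOTAL'
      real l2(lx1,ly1,lz1,lelv)

      if (istep.eq.0) then
         call load_fld('cyl3d0.f00001')     ! read one snapshot into vx,vy,vz,pr,t
         call lambda2(l2)                   ! lambda_2 of the current velocity field
         call outpost(vx,vy,vz,pr,l2,'lam') ! write it out in the temperature slot
         call exitt                         ! stop after writing the file
      endif

      return
      end

The output file then carries lambda_2 in the temperature slot; repeating the load_fld/lambda2/outpost calls in a loop over file names processes the whole series.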