From fischer at mcs.anl.gov Wed Jul 8 05:46:26 2009
From: fischer at mcs.anl.gov (Paul Fischer)
Date: Wed, 8 Jul 2009 05:46:26 -0500 (CDT)
Subject: [Nek5000-users] Important nek5000 optimizations
Message-ID: 

There are two optimizations of which you should all be aware:

(1) I've encountered some SIZEu files that have the following definition of the dealias array sizes:

      parameter (lxd=1+3*lx1/2,lyd=lxd,lzd=lxd*(ldim-2)+(3-ldim))

With this definition we get:

      lx1 =  6  -->  lxd = 10
      lx1 =  8  -->  lxd = 13
      lx1 = 10  -->  lxd = 16
      lx1 = 12  -->  lxd = 19

Note that for lx1=8 and 12 we have lxd odd. On BG/P (and some, but not all, other architectures past and future), having odd dimensions for any of the SEM array sizes can be very detrimental to performance, because the array accesses will not be quad-aligned (or double-aligned for single-precision cases). Effective use of the BG/P (and BG/L and BG/Q) double-hummer requires quad-aligned data. If lx1, lx2, and lxd are all even numbers, then data alignment in Nek5000 is fairly well assured. Please check your SIZEu files if you are using the BG platforms; a sketch of one even-by-construction alternative follows this message.

(2) I have uploaded a new genmap code to the repo for generation of element-to-processor mappings. It appears to greatly improve the partitions, particularly for large element/processor counts. The old code would occasionally produce isolated subsets of elements that would in turn lead to high communication overhead.

If you are using the XXt (default) coarse-grid solver, these new maps should also improve XXt performance. If you are using the AMG-based coarse-grid solver, you must regenerate all amg*.dat files, because the new genmap produces a different vertex ordering than the old one.

In my largest case to date (3 million elements, 65000 processors), I'm seeing a factor-of-two reduction in CPU time. (I formerly had some very poor partition sets in this particular case...)
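For concreteness, here is a minimal SIZEu-style sketch of such an even-by-construction definition. It is an illustration, not from the message above: it rounds 3*lx1/2 up to the nearest even integer, so the 3/2 dealiasing rule (lxd >= 3*lx1/2) is still satisfied while lxd stays even for any lx1.

c     Sketch (assumes the standard SIZEu conventions): keep lxd even
c     so that the SEM arrays stay quad-aligned on the BG platforms.
c     Under Fortran integer division, 2*((3*lx1+3)/4) is the smallest
c     even integer >= 1.5*lx1.
      parameter (lxd = 2*((3*lx1+3)/4))
      parameter (lyd = lxd)
      parameter (lzd = lxd*(ldim-2)+(3-ldim))
c     lx1 =  6  -->  lxd = 10   (old formula: 10)
c     lx1 =  8  -->  lxd = 12   (old formula: 13)
c     lx1 = 10  -->  lxd = 16   (old formula: 16)
c     lx1 = 12  -->  lxd = 18   (old formula: 19)

Since lx2 in SIZEu is typically lx1 or lx1-2, lx2 is then even whenever lx1 is, so all three sizes remain even.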
From speppa at mail.ntua.gr Tue Jul 21 06:04:48 2009
From: speppa at mail.ntua.gr (Sofia)
Date: Tue, 21 Jul 2009 14:04:48 +0300
Subject: [Nek5000-users] Error OpenMPI
Message-ID: <003701ca09f3$0ffad480$44656693@naval.ntua.gr>

Dear users,

When I try to run the solver 'nek5000' using OpenMPI (1.2.7-1) on a Debian Linux installation, I get a "Segmentation fault" fatal error. The relevant file (logfile.txt) with the error message is attached.

Has anybody faced similar difficulties?

SP

From kerkemeier at lav.mavt.ethz.ch Tue Jul 21 07:07:30 2009
From: kerkemeier at lav.mavt.ethz.ch (Stefan Kerkemeier)
Date: Tue, 21 Jul 2009 14:07:30 +0200
Subject: [Nek5000-users] Error OpenMPI
In-Reply-To: <003701ca09f3$0ffad480$44656693@naval.ntua.gr>
References: <003701ca09f3$0ffad480$44656693@naval.ntua.gr>
Message-ID: <4A65AF82.4010409@lav.mavt.ethz.ch>

Hi,

as far as I remember, I fixed a couple of OpenMPI issues a few weeks ago. Can you try it again using the latest repo version?

-Stefan

Sofia wrote:
> Dear users,
>
> When I try to run the solver 'nek5000' using OpenMPI (1.2.7-1) on a
> Debian Linux installation, I get a "Segmentation fault" fatal error.
> The relevant file (logfile.txt) with the error message is attached.
>
> Has anybody faced similar difficulties?
>
> SP
> ------------------------------------------------------------------------
>
> _______________________________________________
> Nek5000-users mailing list
> Nek5000-users at lists.mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users