HP ProLiant DL580 G3 server Lotus NotesBench audit report
Abstract

The HP ProLiant DL580 G3 server with Microsoft® Windows® Server 2003, Enterprise Edition delivers high performance for users of Lotus Domino Web Access. With up to four Dual-Core Intel® Xeon™ 3.0 GHz/667 MHz processors available with 2x2MB L2 cache, and numerous high availability features, including front-accessible Hot Plug RAID Memory, an integrated Ultra320 Smart Array 6i controller, a duplex drive cage, and hot-plug redundant components, the ProLiant DL580 G3 server is an excellent platform for users looking to expand the performance of their Domino installations. This audit report details the configuration and testing of R6iNotes (Domino Web Access) for Domino 7.0 running on Microsoft Windows Server 2003, Enterprise Edition.

Results

The R6iNotes workload was run to simulate 18,500 R6iNotes users against four Domino 7.0 servers. The test ran eight hours, during which the system under test achieved 15,953 NotesMark (transactions per minute, or tpm) with an average response time of 0.434 seconds. Based on these results, the price/performance ratio is $4.15/User (or $4.81/NotesMark).
Table of Contents

Abstract
Results
Table of Contents
Section 1 – executive summary
    Test sponsor
    Test methodologies
    Conclusions
Section 2 – benchmark objectives
Section 3 – test methodologies
    Test configuration
    Test procedure
Section 4 – R6iNotes data
    Lotus NotesMark value for benchmarked configuration
    Response Time value for benchmarked configuration
    R6iNotes users value for benchmarked configuration
    CPU performance as recorded by Windows Performance Monitor
    Disk performance as recorded by Windows Performance Monitor
Section 5 – analysis
    Price/performance ratios
    NotesNum output
Section 6 – conclusions
Section 7 – statement by auditor
Appendix A – overall test setup and software versions
    Test setup
Appendix B – system configuration
    SUT (system under test)
    Clients
    HP ProLiant BL10e server
    Network equipment
    Software versions
Appendix C – operating system parameters values
Appendix D – Lotus Notes parameters
    R6iNotes workload test
    System under test Notes.ini
    Parent Notes.ini
    The sample client Notes.ini
Appendix E – network configuration files
    Network configuration
Appendix F – guidelines for information usage
Appendix G – system pricing
Appendix H – vendor statement
    For more information
Appendix I – see attached file: NotesBench test results
Section 1 – executive summary

Test sponsor

This Lotus NotesBench testing was sponsored by Hewlett-Packard Company (HP). The NotesBench benchmark was developed and engineered by Lotus Development Corporation. Testing took place at the HP Microsoft Solutions – Lotus Engineering laboratories in Nashua, New Hampshire, in January 2006 and was audited by KMDS Technical Associates, Inc., in January 2006.

Test methodologies

An HP ProLiant DL580 G3 server was configured to run a single test, NotesBench R6iNotes, on the Microsoft Windows Server 2003, Enterprise Edition operating system. The NotesBench R6iNotes workload is designed to replicate an average group of users performing everyday web-based mail tasks. The test measured the performance of 18,500 users participating in Domino Web Access (DWA) related activities while simultaneously connected to four Domino 7.0 servers on a single HP ProLiant DL580 G3 server. The test was set up using two groups of hardware: the driving systems and the system under test (SUT). The SUT was configured with four Dual-Core Intel Xeon 3.0 GHz/667 MHz processors with Hyper-Threading enabled, 8 gigabytes of RAM, and two 36.4GB SCSI hard disks. The ProLiant DL580 G3 delivers high performance with up to four Dual-Core Intel Xeon 3.0 GHz/667 MHz processors available with 2x2MB L2 cache, up to 64GB of PC2-3200R 400MHz DDR2 memory, and an embedded NC7782 Dual Port Gigabit Server Adapter as standard. Two HP Smart Array 6404/256MB controllers were installed, with a total of sixty-four 36.4GB SCSI hard disks attached.

Conclusions

The R6iNotes workload was run to simulate 18,500 R6iNotes users against four Domino 7.0 servers. The test ran eight hours, during which the system under test achieved 15,953 NotesMark (transactions per minute, or tpm) with an average response time of 0.434 seconds. Based on these results, the price/performance ratio is $4.15/User (or $4.81/NotesMark).
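The two price/performance ratios follow from one total system price (detailed in Appendix G, which is not reproduced in this excerpt) divided by the user count and by the NotesMark score. As an illustrative sketch, the total price below is back-computed from the published $/User ratio rather than quoted from the report:

```python
# Back-compute the implied system price from the published ratios.
# Assumption: both ratios share a single total price; the actual pricing
# is in Appendix G and is not reproduced here.
users = 18_500          # simulated R6iNotes users
notesmark = 15_953      # NotesMark score (transactions per minute)
price_per_user = 4.15   # $/User, as published

implied_price = price_per_user * users          # about $76,775
price_per_notesmark = implied_price / notesmark
print(f"implied total price: ${implied_price:,.0f}")
print(f"price/NotesMark:     ${price_per_notesmark:.2f}")  # matches $4.81
```

The two published ratios are mutually consistent: dividing the same implied price by the NotesMark score reproduces the $4.81/NotesMark figure.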
Section 2 – benchmark objectives
A question that most HP customers ask about Notes and Domino is: what is the maximum number of active Domino user sessions for our environment? The answer is determined by three criteria:

• The application profile (what type of workload)
• The hardware configuration (for example, the number of system CPUs or the amount of memory)
• The operating system (OS)

The information provided in this report should help customers in their planning processes. With the publication of this and future full disclosure reports, HP and IBM/Lotus are working together to provide information that will help Notes and Domino users in their activities associated with performance tuning, enterprise network design, and many other areas. This report includes one workload:

• R6iNotes on Domino 7.0
All tests were performed on the Microsoft Windows Server 2003, Enterprise Edition platform; similar performance information for other hardware configurations is forthcoming.
Section 3 – test methodologies
Test configuration

The HP Performance Test Lab for Lotus Domino uses HP ProLiant BL10e systems configured with the Lotus Notes 7 client and NotesBench. Each ProLiant BL10e server has an 800 MHz processor, a 40GB hard disk, and 1024MB of RAM. For the testing of R6iNotes, nineteen ProLiant BL10e systems were used to create load. Additionally, an HP ProLiant DL360 system was deployed to act as the Parent system. All systems were connected over an isolated TCP/IP LAN using 100BaseT or 1000BaseT media. All drivers were loaded with Microsoft Windows XP with Service Pack 2; the system under test was loaded with Microsoft Windows Server 2003, Enterprise Edition with Service Pack 1.

Two changes were made to the HP ProLiant DL580 G3 server before the installation of the operating system: in the ProLiant setup, both the hardware prefetcher and Adjacent Sector Prefetch, which are on by default, were switched off for HP testing. While the hardware prefetcher and Adjacent Sector Prefetch do decrease the miss rate for the L2 cache, they consume substantial front-side bus bandwidth. Under heavy load, the bus reaches its capacity, causing large latencies in memory operations and reducing total system capacity. Disabling the hardware prefetcher and Adjacent Sector Prefetch increases the L2 cache miss rate, but keeps the throughput of the front-side bus at a more manageable rate, resulting in better scalability.

All Microsoft Windows Server 2003, Enterprise Edition parameters were constant for the remaining benchmark sequences. Any Domino Server specific parameter changes are bold in Appendix D's NOTES.INI files.
Test procedure

For this workload, HP engineers first performed several trial runs to determine the best test duration and to confirm steady state. Both test duration and steady state were determined using real-time monitoring utilities from Microsoft. During the trial runs, HP engineers observed the Windows Performance Monitor to determine the maximum number of users supported by the configuration. For this audit report, the decision was made to test only a single workload.

Storage was provided using two HP Smart Array 6404 disk controllers connected to eight HP StorageWorks Modular Smart Array 30 (MSA30) storage enclosures, or four storage enclosures per Smart Array controller. Each storage enclosure was configured with eight 36.4GB Pluggable Ultra320 SCSI hard drives. Both Smart Array controllers were configured identically, providing four (4) logical disk units to the operating system. Each logical unit was comprised of sixteen 36.4GB disks spanning two storage enclosures in a RAID 0+1 configuration. The Windows pagefile was configured at 12GB and left on the logical C: drive, or OS partition.

Four Domino partitions were then created and registered in the same Domino domain. Each of the partitions was then assigned to listen on a unique TCP/IP address on one of the two ports of the embedded NC7782 Dual Port Gigabit Server Adapter. Each of the four
partitions had an identical Domino directory that contained more than 40,000 registered users, and each Domino server was configured to have six mail.box files. Each Domino server had its own copy of the Domino directory, and mail was routed between all four partitions for delivery.

Testing began with the clients being added incrementally to the Domino environment using a Childstagger value of 15 minutes for the first 15 drivers, 30 minutes for drivers 16 and 17, and 70 minutes for drivers 18 and 19, as specified in the parent Notes.ini. The client systems used a Threadstagger of 0.3 seconds for the first 17 drivers and 0.4 seconds for the remaining two drivers, as specified in each client's Notes.ini. Drivers 1 through 18 created 1,000 R6iNotes users each, while driver 19 created 500 R6iNotes users.

During the ramp-up of this test, Driver 1 stopped responding 5 hours and 15 minutes in. Because the remaining 18 drivers had not yet completed ramping up, and the test only measures results from the point when all drivers have completed ramping to the shutdown of the test, Driver 1 was restarted and allowed to re-ramp its 1,000 users. The test ran for eight hours after all client threads were connected to the system under test; over 90% of the mail generated during the run was delivered during the run. Steady state was determined to be achieved during each test run by monitoring server output as well as mail routing, calendar, and scheduling activities.

The test data collected includes the following files for each partition:

• The SUT Notes.INI file and Log.nsf file
• All clients' Notes.INI files
• The NotesNum utility results file
• Stat tables for each partition after run completion
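Two figures in the test procedure lend themselves to a quick arithmetic check: the usable capacity of each RAID 0+1 logical unit, and the approximate length of the staggered ramp-up. The sketch below is illustrative only; in particular, the Childstagger/Threadstagger semantics assumed here (each driver starting Childstagger minutes after the previous one, then spawning its users Threadstagger seconds apart) are one plausible reading, not taken from the report:

```python
# --- Storage: 4 logical units, each 16 x 36.4GB disks in RAID 0+1 ---
# Assumption: decimal "marketing" gigabytes; mirroring halves raw capacity.
disk_gb = 36.4
disks_per_lun = 16
luns = 4
assert disks_per_lun * luns == 64               # sixty-four drives in all
usable_per_lun = disks_per_lun * disk_gb / 2    # 291.2 GB per logical unit
print(f"usable per LUN: {usable_per_lun:.1f} GB")

# --- Ramp-up: stagger schedule from the test procedure ---
# Childstagger: 15 min for the first 15 drivers, 30 min for drivers 16-17,
# 70 min for drivers 18-19. Threadstagger: 0.3 s/user for drivers 1-17,
# 0.4 s/user for drivers 18-19.
gaps_min = [15] * 15 + [30, 30] + [70, 70]
users = [1000] * 18 + [500]
thread_s = [0.3] * 17 + [0.4, 0.4]

start, finish = 0.0, []
for gap, n, ts in zip(gaps_min, users, thread_s):
    start += gap                        # driver start offset (minutes)
    finish.append(start + n * ts / 60)  # plus time to spawn its users
print(f"approx. ramp-up: {max(finish):.0f} minutes")  # roughly 7 hours
```

Under this reading the full ramp-up takes on the order of seven hours, which is broadly consistent with Driver 1 failing 5 hours and 15 minutes into the ramp-up while other drivers were still ramping.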
Section 4 – R6iNotes data

Lotus NotesMark value for benchmarked configuration
Figure 1. NotesMark for R6iNotes Workload (bar chart). HP ProLiant DL580 G3, 4 x 3.0 GHz Dual-Core: 15,953 NotesMark (transactions per minute, or tpm).
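For scale, the NotesMark rate can be converted into a total transaction count over the eight-hour measured run. This is a straightforward arithmetic sketch, assuming the steady-state rate held for the whole run:

```python
# NotesMark is reported in transactions per minute (tpm). Assuming a
# constant rate across the eight-hour measured run:
notesmark_tpm = 15_953
run_hours = 8

total_transactions = notesmark_tpm * 60 * run_hours
per_user = total_transactions / 18_500      # 18,500 simulated users
print(f"total transactions: {total_transactions:,}")   # 7,657,440
print(f"per simulated user: {per_user:.0f}")           # about 414
```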
Response Time value for benchmarked configuration
Figure 2. Response Time for R6iNotes Workload (bar chart). HP ProLiant DL580 G3, 4 x 3.0 GHz Dual-Core: 0.434 seconds average response time.
R6iNotes users value for benchmarked configuration
Figure 3. R6iNotes Users (bar chart). HP ProLiant DL580 G3, 4 x 3.0 GHz Dual-Core: 18,500 R6iNotes users.
CPU performance as recorded by Windows Performance Monitor
Figure 4. CPU performance as recorded by Windows Performance Monitor.
The failure of Driver 1 is obvious in the CPU load during the ramp-up, but once the driver was restarted and completed ramping, the system achieved steady state. The HP ProLiant DL580 G3 server averaged a CPU utilization of 91% during the steady state of the test, with a maximum recorded value of 99.993% and a minimum recorded value of 82.194%.
Disk performance as recorded by Windows Performance Monitor
Figure 5. Disk performance as recorded by Windows Performance Monitor.
In this graph, the failure of Driver 1 is indicated by the green line, which denotes the transfers per second of the logical disk supporting the Domino partition that Driver 1 was creating load for. Note that after the restart, the green line matches the pink and blue lines, which denote the other two partitions supporting 5,000 users each. The yellow line denotes the final partition, which supports only 3,500 users; this explains its lower transfers-per-second rate.