Time to Bench the Benchmarks?

Arthur Cole

So who has the fastest server in town? That age-old question is on the table once again following a series of claims and counterclaims over benchmark tests last week. But perhaps the real question is whether the benchmarks themselves have any real meaning anymore in the virtual, multicore age.


The subject of benchmarks came up following an announcement from IBM claiming that its System p 550 Express machine outperformed HP's PA-RISC systems in recent TPC-C tests. In a bid to lure the nearly 170,000 PA-RISC users away from HP's migration path to the Integrity line, IBM reported that an eight-core p550 with 4.2 GHz Power6 processors, running a single instance of DB2 Enterprise 9.5 on AIX 5.3 with DS3400 Express storage, handled 629,159 transactions per minute (tpmC): 16 percent more than a 64-core HP 9000 Superdome and 1.6 times more than an Itanium 2-based Integrity rx6600. IBM also noted that the p550 uses only 9 percent of the energy of the Superdome and takes up only 2 percent of the space.


Numbers being what they are, it wasn't long before the flaws in the comparison came to light. As this article in The Register points out, the HP Superdome results in question date from 2003, with processors running at 875 MHz, a comparison the publication likens to "skinning corpses." And even the comparison to the Integrity system overlooked cost: the IBM result comes in at $2.49 per transaction, compared to $1.81 for HP.


Meanwhile, NEC is claiming a new record for the TPC-E benchmark, which models the customer-account transaction workload of, say, a brokerage firm. The company's Express5800/1320Xf server, paired with the S2500 storage system, achieved 1,126.49 transactions per second (tpsE), a 70 percent improvement over the old record. The system used 32 dual-core Itanium 9150s running SQL Server 2008 on Windows Server 2008. That performance comes at a price of $2,771.79/tpsE.
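The dueling numbers above hinge on TPC's price/performance metric: the total cost of the priced configuration divided by throughput. A minimal sketch of that arithmetic, using the figures quoted in this post (note that the total system costs below are back-calculated from the published ratios; they are not figures reported in the article):

```python
def price_performance(total_system_cost_usd: float, throughput: float) -> float:
    """TPC price/performance: priced-configuration cost divided by
    throughput (tpmC for TPC-C, tpsE for TPC-E)."""
    return total_system_cost_usd / throughput

def implied_system_cost(price_perf_usd: float, throughput: float) -> float:
    """Back out a total priced-configuration cost from a published ratio."""
    return price_perf_usd * throughput

# Figures quoted in this post (implied costs are illustrative):
p550_cost = implied_system_cost(2.49, 629_159)      # IBM p550, TPC-C: ~$1.57M
nec_cost = implied_system_cost(2_771.79, 1_126.49)  # NEC 1320Xf, TPC-E: ~$3.12M
```

On raw throughput IBM's result looks dominant, but HP's $1.81-per-transaction counter is answering the price/performance question instead; which metric matters depends entirely on the buyer.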


Gaming benchmark results is a time-honored tradition in IT circles, and calls for reform arise from time to time. But as Laurianne McLaughlin points out on CIO.com, a growing chorus of voices argues that the very idea of industry benchmarks has become obsolete. With virtualization and multicore technology producing an explosion of designs and architectures, it has become nearly impossible to compare one server with another. Many firms have begun establishing their own benchmarks to gauge the criteria they deem most important, whether that is transaction and application performance, power and density requirements, or any of a host of other measurements.


Love 'em or hate 'em, the existing benchmarks aren't likely to go away soon. But as the enterprise grows more diverse in its functions and underlying technology, they may prove less and less reliable as predictors of real-world system performance.



Comments
Mar 25, 2008 7:15 PM Mr Mo Jo Risin says:
This is why Ideas International's RPE-OLTP and RPE2 performance metrics exist. IBM openly publishes the relative performance of its systems as rPerf. HP has PQRT and Sun has MValues, but both refuse to give them to customers.
Mar 28, 2008 10:10 AM ken zink says:
Industry-recognized benchmarks (e.g., TPC and SPEC) are important because they provide a basis for comparison of competing systems. Vendor-specific benchmarks - like IBM's venerable LSPR for mainframes - can be useful for comparing the performance of systems of similar architectures from the same vendor, but they do not provide customers with a basis for competitive evaluation. (I don't think any of us really want to return to the '70s practice of a customer carrying their own application to all of the competing vendors' benchmark labs to evaluate its performance.) Arthur speaks of virtualization as blurring the value of benchmarks. I disagree. In fact, SPEC's imminent (I hope) new virtualization benchmark will shed even more light on the comparative performance of competing virtualization offerings - both software and hardware platforms. As computer systems become more complex in their design as well as their productive use - as in virtualized environments - customers need more and better bases for competitive evaluation, not fewer.
Apr 25, 2008 7:35 PM Starrdaark says:
How many gamers are running a virtualized system?
May 28, 2010 12:29 PM stephen bowhill says: in response to Mr Mo Jo Risin
Thanks for the mention of RPE2.
Steve (Ideas International)


