What this article needs...
...is definitely more acronyms.
There's a new big data benchmark in town: TPC-DS. The Transaction Processing Performance Council still doesn't know how to do its own abbreviation after 24 years of existence, but it does know a thing or two about getting IT hardware and software vendors together and hammering out benchmark tests and pricing metrics to help …
TPC.org takes a much more direct approach to benchmark testing.
Different workloads will always exist (OLTP, DSS, BW, and now the new thing, Big Data, etc.).
The problem I have found with TPC, both in viewing the results and in building platforms for testing, is that it is a little too consensus-driven. While there are specific TPC rules, there are also vendor-imposed rules and guidelines.
TPC needs a Star Chamber made up of clients that determines what, how, and when something is published.
A subgroup of hardware vendors would ensure HW and OE parameters are equal. (Vendors very often seem to test futures and possible features that are not yet available in order to obtain the largest possible benchmark score or the lowest cost per transaction. This leads to platforms that are not realistic.)
I have yet to have a customer ask me for a configuration that looks like anything on any of the TPC-[C, E, or H] Top Ten (performance, price/performance) lists.
Database/application vendors would be silent voting members who help provide guidance - all results can be published without their opinion.
All results can be challenged by contacting the aforementioned Star Chamber.
In its current form:
TPC-E and TPC-C are dead. Oracle does not scale well on the TPC-E benchmark, so you won't see one from them - only benchmarks of EBS applications.
MS SQL Server does not scale well on TPC-C, so you won't see one from them.
DB2 - only IBM gear (or a reseller's) is allowed to benchmark DB2.
Sybase - I can't think of the last time a current benchmark was done with Sybase.
Get the DB/App vendors out of the way - use standardized tests so that benchmarks are relevant to the buyers.