* Posts by melindamcdade

1 post • joined 28 Sep 2011

HPC 2.0: The Monster Mash-up


Big Data has a variety of meanings. It can be a large data warehouse that you run data mining and analytics on, an extremely large corporate database for online transaction processing, or large unstructured or file-system data managed through database metadata. Regardless, everyone's data repositories are growing exponentially to capture knowledge (lessons learned, etc.) and to accommodate ever-emerging data acquisition technologies. The uses of data are also expanding rapidly, with key business decisions increasingly based on multidisciplinary data.

In my experience, if you want optimal application performance, you need to make sure the data operands are either in a core's register file or in on-chip cache. HPC to me means optimizing data movement with multi-level buffering, from disk to SSD to memory to L3 cache to L2 cache to L1 cache, so the processor cores do not stall waiting for data. Big Data means more effective data management, in-memory grids, large fast memory, and optimizing data access, movement, integration, processing, and real-time delivery of results.
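The point about keeping cores fed from cache can be sketched with a classic example (my own illustration, not from the original post): summing a matrix in row-major versus column-major order. In a compiled language the row-major loop walks memory sequentially, so every cache line fetched through L3/L2/L1 is fully consumed before eviction, while the column-major loop strides across rows and wastes most of each line, stalling the core. Pure Python hides the hardware effect, so treat this strictly as a sketch of the two access patterns:

```python
N = 512
# Row-major matrix: element (i, j) lives at linear position i*N + j.
matrix = [[i * N + j for j in range(N)] for i in range(N)]

def row_major_sum(m):
    # Sequential access: consecutive elements of each row are adjacent
    # in memory, which is friendly to caches and hardware prefetchers.
    total = 0
    for row in m:
        for value in row:
            total += value
    return total

def col_major_sum(m):
    # Strided access: each inner step jumps a whole row (N elements)
    # ahead in memory, touching a different cache line almost every time.
    total = 0
    for j in range(N):
        for i in range(N):
            total += m[i][j]
    return total

# Both traversals compute the same result; only the memory access
# pattern (and hence, in C/C++/Fortran, the cache behavior) differs.
assert row_major_sum(matrix) == col_major_sum(matrix)
```

The same idea generalizes to the buffering chain described above: at every level (disk to SSD, SSD to memory, memory to cache), arranging work so that data already staged at the faster level is fully used before it is evicted is what keeps the pipeline from stalling.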

My 2 cents
