Cambridge’s HPC-as-a-service for boffins, big and small

Cambridge University has been involved in high-performance computing (HPC) for 18 years, starting with a “traditional” supercomputer, before a major restructuring eight years later led to what would now be considered HPC-as-a-service. We read much about how IT needs to become a profit centre, not a cost centre. Well, as part …

Anonymous Coward

Data Centre

That shiny new data centre wasn't built exclusively for the HPC crowd. Lots of other University people have, or will have, kit in the building.

Anonymous Coward

Re: Data Centre

Indeed, the new West Cambridge Data Centre (WCDC) has four halls, of which the HPC lot are only using one. Furthermore, there are lots of groups ploughing their own furrow, with many server rooms dotted around various departments hosting some pretty large clusters.

Holmes

Funny how little has changed

First photo: Respected boffin and guy-who-does-the-real-work pose in front of impressive hardware

Second photo: Suits pose in front of impressive hardware (guys*-who-do-the-real-work noticeably absent)

* intended to include female "guys" as well


Re: Funny how little has changed

<snip>

"* intended to include female "guys" as well"

Funny chaps, women [1].

[1] Quote from an eighties TV programme.


Well, with ATLAS, f'rinstance, producing raw data at 1 PB/s (after zero-suppression), you can forget about storing anywhere near all the data from major scientific experiments for a long time into the future, even if storage reaches "commoditised" prices (I thought it already had, but never mind).

So, by extension, we'll be "throwing away" data pre-determined as "uninteresting" for a long while yet. Probably forever, as we can pretty much guarantee that the experiments will keep producing data faster than affordable (or even feasible) storage can grow.
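To put very rough numbers on it (back-of-the-envelope only; the raw rate is the figure quoted above, and ~1e7 live seconds per year is a common rule of thumb, not an official figure):

RAW_RATE_PB_PER_S = 1.0        # the 1 PB/s post-zero-suppression rate quoted above
LIVE_SECONDS_PER_YEAR = 1e7    # rough rule of thumb for beam time per year

raw_pb_per_year = RAW_RATE_PB_PER_S * LIVE_SECONDS_PER_YEAR
print(f"raw data per year: {raw_pb_per_year:.1e} PB")   # ~1e7 PB, i.e. ~10 ZB

# even a brutal million-to-one trigger still leaves a serious archive to pay for
KEEP_FRACTION = 1e-6
print(f"after 10^6:1 filtering: {raw_pb_per_year * KEEP_FRACTION:.0f} PB/year")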

Silver badge

" "throwing away" data pre-determined as "uninteresting" for a long while yet. Probably forever"

Probably! It'd be nice to know how much of this huge data stream has already been processed (FPGAs, heuristics, etc.) at the experiment. Anyone know?

Silver badge

>"commoditised" prices

As a grad student in the 90s, when the price of an external 1GB SCSI drive for a Sun dropped to 1,000 quid, we got one for every machine. No more booking scratch disk space in advance, reading data from tape, processing it and writing it back to tape; we would have all the disk space we needed.

Silver badge

>" "throwing away" data pre-determined as "uninteresting" for a long while yet. Probably forever"

The detectors have layers of processing simply because it's impossible to get the raw data out of the experiment fast enough (there isn't enough space for the cables).

So each layer tries to throw away as much as possible and only passes interesting events up the chain. When they were first designing ATLAS, I know they were planning on sampling only a fraction of even the final data stream because they couldn't keep up.
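In code terms the cascade looks something like this (a toy sketch only; the event fields and cut values are invented for illustration, not real trigger menus):

import random

def make_event():
    # invented event record: one "energy" and one "multiplicity" per event
    return {"energy_gev": random.expovariate(1 / 50),   # mean ~50 GeV
            "n_tracks": random.randint(0, 200)}

def level1(ev):
    # fast, hardware-style cut on a single coarse quantity
    return ev["energy_gev"] > 100

def level2(ev):
    # slower cut that can afford to look at more of the event
    return ev["n_tracks"] > 20

def high_level(ev):
    # stand-in for the full software reconstruction pass
    return ev["energy_gev"] / max(ev["n_tracks"], 1) > 4

events = (make_event() for _ in range(1_000_000))
kept = [ev for ev in events if level1(ev) and level2(ev) and high_level(ev)]
print(f"kept {len(kept):,} of 1,000,000 events")

Each stage only runs on what the previous one let through, which is the whole point: the cheap cuts shield the expensive ones from the full rate.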

The big problem of archiving all the results is the associated knowledge/context.

You can keep all the configuration/calibration info linked to the data and design the data formats to be future-proof - but it's harder to capture the institutional knowledge of "those results were taken when we were having problems with the flux capacitor, so anything above 1.21GW is probably a bit dodgy; Fred knows about it, but he left".
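If you wanted to at least make a dent in that, you'd weld the context to the data itself, something like this (field names invented for illustration):

from dataclasses import dataclass, field

@dataclass
class RunRecord:
    run_id: int
    geometry_tag: str       # which detector description was in use
    calibration_tag: str    # which calibration constants were applied
    caveats: list[str] = field(default_factory=list)  # free-text institutional knowledge

run = RunRecord(
    run_id=42,
    geometry_tag="geom-v12",
    calibration_tag="calib-2015-03",
    caveats=["flux capacitor playing up; treat anything above 1.21GW as dodgy"],
)
print(run)

Even then, the caveats field only helps if someone remembers to write it before Fred leaves.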

In HEP it rapidly becomes cheaper/easier to re-run the experiment with the newer kit rather than try and wade through old stuff.

In astronomy it's different: we can't rerun the universe, so we religiously save every scrap of imagery.

Alert

Meanwhile in Life Sciences

Running another analysis requires that the samples be kept in a viable state, and storing stuff like that makes digital storage look like the easy option.

