The April 15th team application deadline for the SC11 Student Cluster Competition (SCC) is fast approaching. The SCC pits eight university teams from around the globe against one another in Seattle (site of Supercomputing 2011, or SC11) to compete for clustering glory. Sponsors supply the equipment and advice, but it’s the …
It's pretty simple...
... most researchers use supers rather than design supers.
Yeah it would be fun to go mess around with some hardware, but it's not going to help with whatever core research the person is doing.
Yeah, you'd think, but...
...the reality is different. Like you, I figured the teams would be all computer science types. But I was wrong: the teams included mathematicians, bio-scientists, physics majors and students pursuing other science and research careers.
"The computer science programs at their universities gained mindshare from show attendees,"
mindshare? wtf is "mindshare"?
It's the kind of language you find yourself using when you've seen way, way too many product pitches and have been thinking way too much about marketing strategy and theory. Be glad I didn't get talking about synergy and 'leveraging audience-appropriate brand assets' or some crap like that...
I turned a blind eye...
to the toe-curling stereotyping, but...
"Maybe it’s just me, but I don’t know if any university can credibly claim a research heritage until they’ve competed at a Student Cluster Competition. "
requires a bit of a challenge
What, pray, do you think Faraday, Newton, Maxwell and thousands of other possibly slightly less noteworthy techies did with their lives' work if not build a REAL research heritage?
I'll give you a clue - they did not piss off to Seattle in November to take part in the 'technical leggo world series' (that other great world event that no one but a merkin gives a fuck about).
please someone open a window!
I have to disagree...
I'm not familiar with the random names you're citing above. Are they researchers or scientists? I'd love to friend them on Facebook if they're on it, or have them on my LinkedIn list.
Technical Leggo World Series? Hmm....interesting, I'm going to look into it and learn more...could be another event that The Reg should be covering. Thanks for the tip!
Might pop along
Maybe because universities have things called mailing lists, and, considering the amount of email people at these places already get about careers/conferences etc., people don't really feel the need to look elsewhere for even more.
So have you/they contacted Edinburgh? AFAIK here at the ICC in Durham we have had no publicity about the event -- this is the first time I've seen it. If we had been told, then I'm sure some of us would come! I know a bunch of us are heading down to Edinburgh for the DEISA/EPRACE Symposium at the end of this month.
However, a good point is made by @AC: barely any of us know hardware in depth: there's no time. It takes long enough to build a simulation, and then there are GBs/TBs of data to analyse when you're done. This does seem to be aimed more at the computer science people who build/design supercomputers than at the scientists/engineers/mathematicians who use them most of the time.
We have sysadmins and IT staff to install the new supercomputers, tune them, deal with email and network issues etc., so researchers only really play a part in the procurement of new supercomputers - i.e. is X good enough for our simulation Y?
And this is generally cost-limited: the performance gains by tweaking our supercomputer to the maximum config (over what has already been done) would not be gigantic: for several months spent improving the system we might cut a 9 month job down to 8 or maybe 7.
Also, the issue with these jobs is not that they take 8-9 months to run in the first place. It is that a simulation 10x the size takes 8-9 years or more to run, and for it to be a feasible use of departmental computing resources it generally has to take less than a year or so. No one cares about running a simulation with 1.1x the resolution of somebody else's: it has to be around an order of magnitude if you want to shout about it.
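The runtime figures in the two paragraphs above work out as quick back-of-envelope arithmetic (a minimal sketch; the linear runtime-vs-size model is my assumption for illustration, only the 9-month, 7-month and 10x figures come from the comments):

```python
# Back-of-envelope check of the runtimes quoted above.
# Assumption (mine, not the commenter's): wall time scales roughly
# linearly with problem size.

def scaled_runtime_months(base_months, size_factor):
    """Estimated runtime in months when the problem grows by size_factor."""
    return base_months * size_factor

base = 9                                # the 9-month job mentioned above
big = scaled_runtime_months(base, 10)   # a simulation 10x the size
print(big / 12)                         # -> 7.5 years, in line with "8-9 years+"

# Tuning, by contrast: cutting 9 months to 7 is only ~22% faster --
# nowhere near the order of magnitude needed to "shout about it".
print(round(1 - 7 / 9, 2))             # -> 0.22
```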
I think we could see larger improvements by making better use of the supercomputer and computer resources we already have - i.e. better batch queue systems, being able to use a bunch of idle workstations for smaller computations, better connections to other University systems so if we need 100 extra cores just for a week/month we can get them easily -- sadly UK Universities have not set up an on-demand grid to be shared between everyone. Still I guess we'll all be moving to platforms like Amazon's EC2 in the future which should improve this aspect.
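The "100 extra cores for a week" point can be sketched with an Amdahl's-law estimate (a hedged illustration only; the 95% parallel fraction and the core counts are invented for the example, not taken from any real job):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup over serial when the parallel part runs on `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical job: 95% of the work parallelises, currently on 100 cores.
on_100 = amdahl_speedup(0.95, 100)
on_200 = amdahl_speedup(0.95, 200)   # borrowing 100 extra cores for a while

# Ratio > 1 means the borrowed cores shorten the run; the serial
# fraction caps how much they can ever help.
print(on_200 / on_100)
```

Under these made-up numbers the extra 100 cores buy under 10% more speed on this one job, which is one reason easy burst capacity tends to matter most for running many medium-sized jobs rather than accelerating a single giant one.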
Anyway I'll ask the folks around the dept if anyone's interested.
Better things to spend money on...
Whilst this sounds like a nice idea, it's simply a lot cheaper for US institutions to take part in this. Finding funding for six students to travel to, and stay in, the US for a week is not easy. Given the current financial squeeze on UK academia, there are lots of better things to spend money on.