Colliders, containers, dark matter: The CERN atom smasher's careful cloud revolution

CERN made headlines with the discovery by physicists in 2012 of the Higgs boson, paving the way to a breakthrough in our understanding of how fundamental particles interact. Central to this was the Large Hadron Collider – a 26km ring of interconnected magnets chilled to -271.3C straddling the Franco-Swiss border. The LHC is …

  1. beardman
    WTF?

    in the control center photo

    do they really run WinXP or is that just the wallpaper..?

    1. hatti

      Re: in the control center photo

      It's just used to remind them what grass and sky is

    2. dukwon

      Re: in the control center photo

      They definitely used to. I don't think they do any more: it's probably an old photo.

    3. Mark 85 Silver badge
      Devil

      Re: in the control center photo

      This makes me wonder if they're running or will be running Win10... and how big of a pipe from CERN to MS is needed so they can phone home with all the data.

      1. dukwon

        Re: in the control center photo

        In the LHCb control room we used to have some Win7 machines mixed in with Scientific Linux machines, but now they all run CERN's own flavour of CentOS. At the CERN Control Centre (pictured) apparently the accelerator operator machines all run Linux now (presumably the same CentOS). There might be some Windows machines left for other purposes.

      2. TheVogon Silver badge

        Re: in the control center photo

        "and how big of a pipe from CERN to MS is needed so they can phone home with all the data."

        Presumably they would be using the corporate versions - which don't have most of the telemetry...

    4. as2003

      Re: in the control center photo

      I've found that photo being used in articles from as early as 2010, with the photo possibly taken on 2009-10-23. Which incidentally is 6 months after mainstream support for XP ended, but about 5 years before extended support ended.

  2. Anonymous Coward
    Anonymous Coward

    Don't forget distributed computing

    I wonder if they also manage the volunteer-run distributed computing system with OpenStack.

It does use VirtualBox and connects to their CernVM File System.

    https://lhcathome.cern.ch/lhcathome/server_status.php

    Check it out if you'd like to donate some spare CPU cycles to LHC simulations.

  3. Anonymous Coward
    Anonymous Coward

    Code optimisation looks to be key here

"The way the physicists work is they build different analysis algorithms within a framework but one of the challenges is if the code's been written by a large number of people so that it optimises a large program – that requires breaking that down and finding the core algorithm and optimising them."

You can't just keep throwing more costly CPUs and IOPS at legacy code to make it faster – optimising the code to exploit the available compute and I/O will have just as big a return on investment as buying new hardware and 'containerising' legacy code to keep it running in a modern environment.

Coders typically assume that more hardware is the best fix for badly optimised code – it isn't. Being hardware-constrained actually forces code to be efficient with the available resources.
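A toy sketch of the point (nothing to do with CERN's actual analysis frameworks – the function names and data are made up): throwing faster hardware at a quadratic algorithm only buys a constant factor, while finding and fixing the core algorithm changes the complexity class entirely.

```python
def has_duplicates_naive(values):
    """O(n^2): compares every pair - typical of un-optimised legacy code."""
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] == values[j]:
                return True
    return False


def has_duplicates_optimised(values):
    """O(n): a single pass with a set - the 'core algorithm' done properly."""
    seen = set()
    for v in values:
        if v in seen:
            return True
        seen.add(v)
    return False


sample = [3, 1, 4, 1, 5]  # contains a repeated 1
print(has_duplicates_naive(sample))      # True
print(has_duplicates_optimised(sample))  # True
```

On a list of a million entries the naive version does ~5 × 10^11 comparisons; the optimised one does a million set lookups. No amount of extra CPUs closes that gap as cheaply as the rewrite does.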

    1. richardcox13

      Re: Code optimisation looks to be key here

      > Coders typically assume that more hardware is the best fix for badly optimised code

We generally don't. But often – especially given other demands for resources – it is the most cost-effective option.

      1. Ogi

        Re: Code optimisation looks to be key here

> We generally don't. But often – especially given other demands for resources – it is the most cost-effective option.

        Indeed, back in the early days of computers, computing power was more expensive than programmer time, so it made sense to get programmers to spend a lot of time to optimise their code to the limit to get the most power out of the machine. Hence you saw amazing stuff done with what we today consider an impossibly small amount of RAM and CPU power.

However, that has now been inverted. Computing power is a lot cheaper than programming time, so sometimes "just throw more hardware at the problem" is the right answer. In fact it seems to be the more cost-effective choice pretty much everywhere (except the embedded and aerospace industries, and to a lesser extent HFT finance).

  4. Anonymous Coward
    Anonymous Coward

    So where

    Do they hide the roomful of 8086 chips still in their original plastic packages in 20+ year old antistatic bags, purchased from NASA at a very good NOS price?

Even CERN has to use old tech, due to truly mind-boggling (lethal dose in seconds) radiation at the four major detectors, i.e. ALICE, LHCb, CMS, etc.

Some of those chips have to be decades old but are still working; incidentally, they are also used in physics packages, albeit the ceramic-packaged variant.

  5. Yesnomaybe

    Goodness...

    Puts my crappy little server-room into perspective, doesn't it just...

  6. Louis Schreurs BEng

    not part of the solution

    spotted unreadable/wrong/silly speel erors, or something like that, and the button for tips and corrections does nothing

    "However, there's also and understanding of the cost."

    the Helix Nebula imitative
