Paying to keep it running.
"Problem would be taking it apart, moving it to a new home, putting it back together and then paying to keep it running."
This --^
With the main problem being paying to keep it running: 2.35 megawatts at 10 cents per kWh comes out to $2,058,600 a year for power.
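
Quick Python to check that math (the 2.35 MW draw and 10 cent rate are from the post; 24/7 operation is my assumption):

    POWER_KW = 2350          # Roadrunner draw, ~2.35 MW
    RATE_USD_PER_KWH = 0.10  # electricity rate from the post
    HOURS_PER_YEAR = 24 * 365  # assumes it runs around the clock

    annual_cost = POWER_KW * HOURS_PER_YEAR * RATE_USD_PER_KWH
    print(f"${annual_cost:,.0f} per year")  # -> $2,058,600 per year
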
Moore's law suggests a similar-capability super would now use about 1/8th the power: Roadrunner went online in 2008, and at a doubling roughly every 18 months, that's about three doublings by 2012.
I'll look at specs for IBM Sequoia now -- this eliminates some variables: both Sequoia and Roadrunner are from IBM (so it's not comparing a commercial system to a home-built one...) and both went to the DOE ($$$ markups anybody? $$$). Sequoia does about 16 petaflops in 96 racks... which suggests a (2012-era) petaflop costs about $12 million (versus $100 million for Roadrunner). Power use would be about 450 kilowatts per petaflop (so power is only down to about 1/5th, not the predicted 1/8th).
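
The per-petaflop comparison in Python (450 kW/petaflop is my estimate from the Sequoia specs above, not an official figure):

    ROADRUNNER_KW_PER_PFLOP = 2350   # ~1 petaflop at 2.35 MW
    SEQUOIA_KW_PER_PFLOP = 450       # estimated from Sequoia specs

    ratio = ROADRUNNER_KW_PER_PFLOP / SEQUOIA_KW_PER_PFLOP
    print(f"Sequoia needs ~1/{ratio:.0f} the power per petaflop")
    # -> Sequoia needs ~1/5 the power per petaflop
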
So, even if someone got the system shipped and installed for free, it would be "cheaper" (if you can call $2 million a year in power cheap) only in the short term. In 2 years, a 1-petaflop super (following Moore's law) should cost ~$6 million and draw about 1/10th the power of the original system (roughly $200,000 a year in power bills). In about 5 years, you'd pay more in power in one year than it would cost to buy a replacement system and pay for its power on top.
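
A rough projection of that crossover point, assuming (my reading of the Moore's-law premise) that both price and power per petaflop halve every two years starting from the 2012 figures:

    OLD_POWER_BILL = 2_058_600                    # $/yr for the 2.35 MW original
    NEW_COST_2012 = 12e6                          # $/petaflop, 2012-era estimate
    NEW_POWER_BILL_2012 = 450 * 24 * 365 * 0.10   # ~$394k/yr at 450 kW

    years = 0.0
    while True:
        years += 0.5
        scale = 2 ** (years / 2)                  # halving every 2 years
        buy_and_run = (NEW_COST_2012 + NEW_POWER_BILL_2012) / scale
        if OLD_POWER_BILL > buy_and_run:
            print(f"Crossover around year {years}: replacement costs "
                  f"${buy_and_run:,.0f} to buy and run")
            break

This prints a crossover around year 5.5, which lines up with the "about 5 years" estimate above.
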
I think CPU/GPU hybrids may in fact drive that price and power down several years ahead of schedule, too.