"Then why the push for ARM on servers? Servers are probably one of the areas of computing that happens to be MORE demanding than gamers"
The workloads we put onto servers are often quite different to those of gamers.
To use a web or application server as an example: the majority of the work is done with simple CPU instructions, the workload is mostly repetitive and, more often than not, it isn't architecture-specific. For that kind of work, ARM chips are fine - you can take comparatively inexpensive ARM hardware and ramp up the density hugely without consuming much more electricity. That's exactly what HP did with its Moonshot systems.
A lot of computer games aren't general-purpose compute applications. They are far more sensitive to architecture-specific optimisations and to extended CPU instruction sets (SSE, AVX and the like), not to mention memory bandwidth, bus speeds, etc.
Maybe Sony or Microsoft will start building consoles with ARM chips, but that doesn't bring us any closer to a "one-size-fits-all" ARM machine. They'd have to make big changes and compromises to squeeze out the kind of performance they want or need. We'd just end up with high-powered, power-sucking ARM vs low-powered, battery-sipping ARM.
Sounds familiar - ah yes, Xeon vs Atom.