It would be good to attack this from the software side too. Many analysis tasks have become too complex to implement in hand-crafted assembly language, or hand-crafted in anything. What happens instead is that many large and complicated frameworks are tied together with a relatively small amount of custom code. Each framework has a formal representation of its data inputs and outputs, each padded with protection against accidental misuse that would cause obvious data corruption. All of this formality and safety can add up to enormous processing overhead. "Enterprise Edition" software is the classic example of nearly infinite inefficiency, but seemingly low-level tasks suffer too. What would be useful is a radical new generation of JIT compiler that can make extreme optimizations across an entire system, analyzing enormous codebases and emitting the minimal hardware instructions that produce the correct result. Given that an entire data center is available to perform the analysis, it could be feasible.
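To make the overhead concrete, here is a minimal sketch (all names hypothetical) of the pattern described above: two framework layers that each defensively re-validate their inputs at the boundary, versus the specialized code a hypothetical whole-system optimizer could emit once it proves the inputs are always well-formed.

```python
def framework_parse(record: dict) -> int:
    # Framework layer 1: defensively re-validates its input on every call,
    # guarding against the "accidental misuse" described above.
    if not isinstance(record, dict):
        raise TypeError("record must be a dict")
    if "value" not in record:
        raise KeyError("missing 'value'")
    v = record["value"]
    if not isinstance(v, int):
        raise TypeError("'value' must be an int")
    return v

def framework_scale(v: int, factor: int) -> int:
    # Framework layer 2: repeats its own boundary checks.
    if not isinstance(v, int) or not isinstance(factor, int):
        raise TypeError("ints required")
    return v * factor

def pipeline(record: dict) -> int:
    # The generic path: every check in every layer runs on every call.
    return framework_scale(framework_parse(record), 3)

def pipeline_specialized(value: int) -> int:
    # What a whole-system optimizer could emit after proving, from the
    # entire codebase, that the record is always a well-formed dict with
    # an int "value": the checks fold away, leaving minimal instructions.
    return value * 3

assert pipeline({"value": 14}) == pipeline_specialized(14) == 42
```

The point of the sketch is that neither version is wrong: the checks exist for good reasons at each framework boundary, but only a compiler that can see across the whole system can prove they are redundant and delete them.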