Re: QLC? It's not the one for me
Digital is a huge waste of power and bandwidth. It takes roughly seven parallel digital circuits to match the precision of one analog circuit. When it comes to mathematics, digital needs massive gate arrays and microcode to perform the same task as a handful of analog components. Propagation delay hits big digital circuits pretty hard, and the workarounds further increase complexity. Analog computers are still alive and well anywhere speed and efficiency are more important than precision.
I suspect the AI singularity will happen when analog and digital processors are efficiently merged together. Last time I read about it, flash cells were going to serve as the parameter buffers between the two: store a neural-net weight as a cell's analog conductance, and the multiply-accumulate happens in the array itself instead of in a sea of logic gates.
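To make the tradeoff concrete, here's a rough Python sketch of that idea under my own assumptions: each "flash cell" holds a weight with a small amount of Gaussian read noise (the `noise_sigma` value is made up for illustration), and the dot product comes from summing currents on a shared line. The analog path gets the answer in one step but with bounded precision; the digital path is bit-exact but costs many gates and cycles.

```python
import random

def analog_mac(inputs, weights, noise_sigma=0.02):
    """Model an analog multiply-accumulate: each weight sits in a
    flash-cell conductance with Gaussian cell-to-cell noise, and the
    dot product is just current summation on a shared bit line.
    (Illustrative sketch only; noise_sigma is a hypothetical figure.)"""
    total = 0.0
    for x, w in zip(inputs, weights):
        noisy_w = w * (1.0 + random.gauss(0.0, noise_sigma))  # programming/read variation
        total += x * noisy_w  # summing currents is "free" in the analog domain
    return total

def digital_mac(inputs, weights):
    """Exact digital reference: many gates and clock cycles, but bit-perfect."""
    return sum(x * w for x, w in zip(inputs, weights))

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(256)]
ws = [random.uniform(-1, 1) for _ in range(256)]

exact = digital_mac(xs, ws)
approx = analog_mac(xs, ws)
print(f"digital: {exact:.4f}  analog: {approx:.4f}  error: {abs(exact - approx):.4f}")
```

Run it a few times with different seeds and the analog result wanders around the exact one by a fraction of a percent, which is exactly the deal being offered: give up a few bits of precision, get the whole dot product in one physical operation.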