Just for comparison (on the computing front), JET used an IBM 3090 for data processing. It needed to get all the processing done in about half an hour, since that was how frequently plasmas were run. Some preprocessing was often done on Solaris boxes nearer to the diagnostics, but those machines didn't have access to the other data (magnetic field strengths, etc.) needed for full processing.
My memory says that in the 90s it was producing about 30 GB of data per pulse, but don't rely on that figure...
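Taking those two (admittedly hazy) numbers at face value, you can back out the sustained throughput the processing chain would have needed. A quick sketch, using the post's own 30 GB / 30 min figures as assumptions:

```python
# Rough throughput arithmetic (illustrative only; the 30 GB and 30 min
# figures are the post's own hedged recollection, not confirmed specs).
DATA_PER_PULSE_GB = 30   # assumed data volume per pulse
CYCLE_MINUTES = 30       # assumed processing window between pulses

# Sustained rate needed to finish each pulse's data before the next one.
sustained_mb_per_s = DATA_PER_PULSE_GB * 1024 / (CYCLE_MINUTES * 60)
print(f"Sustained processing rate needed: {sustained_mb_per_s:.1f} MB/s")
```

Around 17 MB/s sustained, which gives a feel for why a dedicated mainframe was doing the job in that era.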
As for actually supplying energy: JET came close to scientific "breakeven", the point at which the fusion reaction generates as much power as is delivered to heat the plasma (its 1997 record was Q ≈ 0.67, i.e. about 16 MW of fusion power from roughly 24 MW of heating). That is before any conversion efficiencies. "Engineering breakeven" goes further, taking into account the electrical generation efficiency and the efficiency of the equipment supplying the heating power (microwaves or neutral beams). (The "Lawson criterion" is a related but distinct idea: the minimum density × temperature × confinement time a plasma needs for a self-sustaining reaction.) Finally there is ignition, where you kick the reaction off and the plasma generates enough power internally to sustain itself. At that point anything you collect is usable, and once you have recouped the energy it took to create the plasma you are actually generating something.
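The distinction between the scientific and engineering gain factors can be made concrete with a little arithmetic. A minimal sketch, where the efficiency values are illustrative assumptions (not JET data), and the 16 MW / 24 MW numbers are JET's widely reported 1997 record shot:

```python
# Sketch of the gain factors described above. Efficiencies are assumed
# round numbers for illustration, not measured values for any machine.

def q_scientific(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Fusion power out divided by heating power delivered to the plasma.
    Scientific breakeven is Q = 1; ignition corresponds to Q -> infinity."""
    return p_fusion_mw / p_heating_mw

def q_engineering(p_fusion_mw: float, p_heating_mw: float,
                  eta_electric: float = 0.4, eta_heating: float = 0.4) -> float:
    """Net electrical gain: folds in thermal-to-electric conversion and
    the wall-plug efficiency of the heating systems (both assumed here)."""
    p_electric_out = p_fusion_mw * eta_electric    # what the turbines yield
    p_electric_in = p_heating_mw / eta_heating     # what the grid supplied
    return p_electric_out / p_electric_in

# JET's 1997 record shot: ~16 MW of fusion from ~24 MW of plasma heating.
print(q_scientific(16, 24))    # ~0.67: close to, but below, breakeven
print(q_engineering(16, 24))   # ~0.11: far below 1 once efficiencies bite
```

The gap between the two numbers is the point of the paragraph above: even a plasma at scientific breakeven would still be a large net consumer of electricity.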
ITER's original late-1990s design did aim for ignition, but I think the redesigned (descoped) machine dropped that goal and now targets a gain of Q ≥ 10 instead.